Digital Foundry: The Last of Us Part 2 PC Review - We're Disappointed - Analysis + Optimised Settings

Det

Member
I'm getting really tired of Alex's constant complaining. This is by all accounts a competent port, yet he relentlessly bashes it and calls it disappointing. What?

ok3t9fK.png


The only way to get him to praise a Sony console port on PC is if Sony stops being a PlayStation game developer and becomes a PC developer, writing shitty code for the PS5 and well-made code for the PC.

Only if Sony's priority becomes the PC will this idiot stop complaining.

The idiot even complained that Nixxes was making the Horizon Zero Dawn Remaster for PS5 and PC instead of just making ports of PS5 games for PC, as if Sony's priority should be PC ports and not a remaster for PS5 and PC.
 
Last edited:

Gaiff

SBI’s Resident Gaslighter
You're being too aggressive. It's fine.
Am I? You're the one who started with the disrespectful callout and then went on with "lol, you don't even know the difference between GNM and GNMX" when I clearly explained why I named both.

You won't catch me with my pants down saying "PlayStation is just a PC" or "A similarly specced PC should perform the same", because it's wrong.
 

Gaiff

SBI’s Resident Gaslighter
Something in the interview made me curious:

Did you switch to mutex?

Wessel de Groot:
No, because that also has to do with the way that this job system works: it uses fibres. I'd say it's one of the better job systems I've worked with in my career. So it's a nice, well-optimised engine in that regard, which can't really use a mutex due to the way fibres work. So we had to come up with a sort of a different construct for that. I think we managed quite well there.

Travis McIntosh: Part of it is our fault - on PS5, no one cares what the CPU utilisation is. The job system was originally constructed to just always use everything, every second, and so moving that to PC, Nixxes was super helpful in helping to optimise utilisation as people on PC do care about it. It was challenging to reduce that as we never had to worry about it on console.

Not sure what he means by "the job system was originally constructed to just always use everything, every second". Also, I don't think people care all that much about utilization on PC either; I think they just care about good performance. You can read the rest of the interview for fuller context.
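
For anyone wondering what "can't really use a mutex due to the way fibres work" means: a std::mutex has to be unlocked by the same OS thread that locked it, but a fiber that yields inside a critical section can be resumed on a different worker thread, so the unlock would be undefined behaviour. A common fallback in fiber job systems is an atomic spin lock, which is also the "spin locking" the interview says is cheap on console but problematic on Windows, presumably why Nixxes built yet another construct. Rough sketch only, not Naughty Dog's or Nixxes' actual code:

#include <atomic>
#include <thread>

// Minimal fiber-friendly lock: ownership is not tied to an OS thread, so a
// fiber can acquire it on worker A, yield, resume on worker B, and release it
// there. A std::mutex would not allow that.
class SpinLock {
    std::atomic_flag flag = ATOMIC_FLAG_INIT;
public:
    void lock() {
        while (flag.test_and_set(std::memory_order_acquire)) {
            // Busy-wait. Cheap on a console where you control scheduling; on
            // Windows the holder can be pre-empted mid-section, leaving every
            // waiter spinning on a core.
            std::this_thread::yield();
        }
    }
    void unlock() { flag.clear(std::memory_order_release); }
};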
 
Last edited:
Tflops ≠ 'Raw performance'
Obviously, but usually this is (was, especially back in 2016) in Nvidia's favor.

Let's look at release years:
PS4 Pro: 2016
RTX 3060: 2021

When the OG PS4 was released, you would get roughly its level of performance with a mid-range CPU + a 750 Ti.

The 1050 Ti would have beaten it without even breaking a sweat... let's say for a minute that it is in the ballpark of the PS4 Pro.

The 3060 is 2 to 3x faster than the 1050 Ti! Can you imagine how PC gamers got screwed over the years?
 

Aaron07088

Neo Member
Thanks, then it should be fine enough for me.

I've completed this game twice on PS4 and once on PS5 (thanks to my friend who lent me his PS4 and PS5 for a while). I just couldn't get used to the vsync lag, so I had to play with aim assist on PS4. I usually don't use aim assist in games on PC. I still wanted the challenge, so I completed it on Hard and Survivor difficulties.

I was hyped for the PS5 remaster, thinking 60fps would give me enough responsiveness to disable aim assist, but it turns out I couldn't get used to the vsync lag at 60fps either. The PS5 doesn't support FreeSync Premium, so all I had were the vsynced 30, 40, and 60fps modes.

On PC I'm usually able to play with aim assist disabled at 40-50fps, so I'm looking forward to playing TLOU Part 2 on PC again with aim assist disabled on Grounded difficulty.
I tried a bit today with DLSS 4: image clarity is perfect at 1440p, but there's a lot of ghosting. DLSS 3.7 Preset E suffers from a lot of shimmering. TLOU Part 1 with DLSS wasn't perfect, and Part 2 isn't perfect either.
 

AFBT88

Member


Seems like disabling Resizable BAR fixes the low GPU usage, guys. I haven't been able to test it myself, but that's the only issue I had with the game: in Seattle and other open areas my GPU usage would drop to 50-60% for tens of seconds.
 

Lysandros

Member
Obviously, but usually this is (was, especially back in 2016) in Nvidia's favor.

Let's look at release years:
PS4 Pro: 2016
RTX 3060: 2021

When the OG PS4 was released, you would get roughly its level of performance with a mid-range CPU + a 750 Ti.

The 1050 Ti would have beaten it without even breaking a sweat... let's say for a minute that it is in the ballpark of the PS4 Pro.

The 3060 is 2 to 3x faster than the 1050 Ti! Can you imagine how PC gamers got screwed over the years?
The statement remains true regardless of AMD vs Nvidia comparisons, which only emphasize how unreliable it is to use this particular GPU metric to deduce overall performance. The most relevant case is the PS5/XSX one: the same base hardware, but enough differences in architecture/frequency to render the much-vaunted metric misleading.

As to PS4 vs 750 Ti, this was mostly true in the initial years, and only with substantially more powerful PC CPUs, which skewed the results to some extent. Still, in later years the 750 Ti/i3 combo began to lag behind pretty badly and wasn't enough to match PS4 performance anymore. I also wonder how it would have fared running Sony first-party titles had those been available on PC at the time; not very brilliantly, I would imagine.
 
Last edited:

Gaiff

SBI’s Resident Gaslighter
Obviously, but usually this is (was, especially back in 2016) in Nvidia's favor.

Let's look at release years:
PS4 Pro: 2016
RTX 3060: 2021

When the OG PS4 was released, you would get roughly its level of performance with a mid-range CPU + a 750 Ti.

The 1050 Ti would have beaten it without even breaking a sweat... let's say for a minute that it is in the ballpark of the PS4 Pro.

The 3060 is 2 to 3x faster than the 1050 Ti! Can you imagine how PC gamers got screwed over the years?
This was in multiplats, and those aren't made the same way PS exclusives are made. That's what the discussion is about: just how much does exclusivity benefit a single platform when it comes to performance and optimization?

Presumably, this is how some multiplats are made:

OGvWGqX.png


And this is how exclusives are made.

6MXJwf5.png


At the very least one fewer abstraction layer, and that's with a low-level API such as DX12. This isn't even getting into the libraries, console-specific shaders, particle systems designed specifically around the PS4, etc.

Now, nobody with even passing knowledge of game development would dispute that, given similar specs, a console will perform better than its PC counterpart when the same game is made specifically for it. The point of contention for TLOUII is that it exhibits an abnormal disparity not seen in games such as HFW or Rift Apart. Interestingly, this difference seems to shrink with more powerful hardware. The 3060, for instance, is more than 3.5x the power of the PS4 but cannot even double its performance at the same settings. The 4080, on the other hand, is 2.5x more powerful than the PS5, yet it seems to roughly double its performance.
 
Last edited:

Bojji

Member


Seems like disabling Resizable BAR fixes the low GPU usage, guys. I haven't been able to test it myself, but that's the only issue I had with the game: in Seattle and other open areas my GPU usage would drop to 50-60% for tens of seconds.


Holy shit, 2x performance?

fx5gQkh.jpeg


Can anyone confirm?

Edit:

ReBar is disabled for this game in the Nvidia profile. Weird stuff.

iMC0jsf.jpeg
 
Last edited:

yamaci17

Gold Member
The 3060 for instance is more than 3.5x the power of the PS4 but it cannot even double its performance for the same settings. The 4080, on the other hand, is 2.5x more powerful than the PS5, but it seems to roughly double its performance.
Based on benchmarks, the 4080 is 3x faster than the 3060 in this game at 4K,
and based on overall benchmarks the 4080 is also 3x faster than the 3060 at 4K.

RTX 3000 GPUs are known to scale worse at 1080p compared to 4K. The 3060/3070/3080 etc. probably have utilization issues at 1080p in this game. I've seen this many times with the 6700 XT vs the 3070: in AC Valhalla, for example, the 6600 XT was faster than my 3070 at 1080p, while at 4K the 3070 was faster than the 6700 XT instead.

What I mean is that the 3060 would probably perform better at 1440p/4K relative to the PS4/PS4 Pro.

Not saying it would improve the situation much, but I don't think it has to do with how much more powerful the 4080 is. If it did, the 4080 should have been unusually faster than the 3060 at 4K, but it actually isn't (at least based on the benchmarks I've seen).

So I guess 3060 users should maybe target 1440p/40fps.

Interestingly, this GPU can get 36+fps at 1440p ultra and 45+fps at optimized settings.



It doesn't seem so bad; at least, you would expect much worse performance based on how it performs at 1080p lol
 
Last edited:

Gaiff

SBI’s Resident Gaslighter
Based on benchmarks, the 4080 is 3x faster than the 3060 in this game at 4K,
and based on overall benchmarks the 4080 is also 3x faster than the 3060 at 4K.

RTX 3000 GPUs are known to scale worse at 1080p compared to 4K. The 3060/3070/3080 etc. probably have utilization issues at 1080p in this game. I've seen this many times with the 6700 XT vs the 3070: in AC Valhalla, for example, the 6600 XT was faster than my 3070 at 1080p, while at 4K the 3070 was faster than the 6700 XT instead.

What I mean is that the 3060 would probably perform better at 1440p/4K relative to the PS4/PS4 Pro.

Not saying it would improve the situation much, but I don't think it has to do with how much more powerful the 4080 is. If it did, the 4080 should have been unusually faster than the 3060 at 4K, but it actually isn't (at least based on the benchmarks I've seen).

So I guess 3060 users should maybe target 1440p/40fps.

Interestingly, this GPU can get 36+fps at 1440p ultra and 45+fps at optimized settings.



It doesn't seem so bad; at least, you would expect much worse performance based on how it performs at 1080p lol

In the same video, he tests 1080p optimized settings, i.e. DF's settings for the PS4. The average is 75fps and the lows are 66fps. That's significantly more than double the PS4's performance.

However, that's with a 13600K, so I think the problem was never the 3060; it was the Ryzen 3600. They're using DirectStorage with CPU decompression, which could explain why it struggles so much. The PS4 version evidently isn't using the same decompression scheme as PC/PS5, so its CPU probably isn't getting hammered the way the 3600 is, which explains why the 3060 is dragged down by a weaker CPU.
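
To picture why that would hit a 6-core chip so hard, here's a toy sketch (purely illustrative, not Nixxes' actual code) of the usual pattern: the streaming system hands compressed chunks to a pool of CPU decompression workers, and those workers compete with the game's own job system for cores. Losing two or three cores to decompression is a big slice of a Ryzen 3600's budget, while a 13600K barely notices.

#include <condition_variable>
#include <functional>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

// Toy illustration: CPU decompression workers that eat cores the game would
// otherwise use for its own jobs.
class DecompressionPool {
public:
    explicit DecompressionPool(unsigned workers) {
        for (unsigned i = 0; i < workers; ++i)
            threads_.emplace_back([this] { Run(); });
    }
    ~DecompressionPool() {
        { std::lock_guard<std::mutex> g(m_); stop_ = true; }
        cv_.notify_all();
        for (auto& t : threads_) t.join();
    }
    // Called by the streaming system whenever new world data is paged in.
    void Submit(std::function<void()> decompressJob) {
        { std::lock_guard<std::mutex> g(m_); jobs_.push(std::move(decompressJob)); }
        cv_.notify_one();
    }
private:
    void Run() {
        for (;;) {
            std::function<void()> job;
            {
                std::unique_lock<std::mutex> lk(m_);
                cv_.wait(lk, [this] { return stop_ || !jobs_.empty(); });
                if (stop_ && jobs_.empty()) return;
                job = std::move(jobs_.front());
                jobs_.pop();
            }
            job();  // CPU-side decode of a streamed chunk (whatever codec the game uses)
        }
    }
    std::vector<std::thread> threads_;
    std::queue<std::function<void()>> jobs_;
    std::mutex m_;
    std::condition_variable cv_;
    bool stop_ = false;
};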

 
Last edited:

Mr Moose

Member
In the same video, he tests 1080p optimized settings, i.e. DF's settings for the PS4. The average is 75fps and the lows are 66fps. That's significantly more than double the PS4's performance.

However, that's with a 13600K, so I think the problem was never the 3060; it was the Ryzen 3600. They're using DirectStorage with CPU decompression, which could explain why it struggles so much.
I need to upgrade my 3600, even if it's just with a 5600.
crying-jordan.gif
 

Lysandros

Member
In the same video, he tests 1080p optimized settings, i.e. DF's settings for the PS4. The average is 75fps and the lows are 66fps. That's significantly more than double the PS4's performance.

However, that's with a 13600K, so I think the problem was never the 3060; it was the Ryzen 3600. They're using DirectStorage with CPU decompression, which could explain why it struggles so much. The PS4 version evidently isn't using the same decompression scheme as PC/PS5, so its CPU probably isn't getting hammered the way the 3600 is, which explains why the 3060 is dragged down by a weaker CPU.


That's why the PS4 is a very special console: it runs on hope and goodwill alone, no algorithms or such needed like on PS5/PC. Unburdened Jaguars forever!
 

Gaiff

SBI’s Resident Gaslighter
I need to upgrade my 3600, even if it's just with a 5600.
crying-jordan.gif
And it's in the same area too. Seems I owe Mibu no ookami and Mr Moose an apology. They were likely correct in their assessment. The reason the 3600+3060 combo performs so poorly could simply be down to the game being the PS5 version, which leverages the memory subsystem, decompression, and CPU more. Not only is the 13600K much faster, it's also paired with DDR5 memory.

I'm also glad because these are the same areas.

Z1phL5i.png

KqR1XMa.png


The fact that the graphics are similar doesn't really matter. It's how the result is achieved. If you tried backporting the PS5 version to the PS4, even with lower graphics, the PS4 would likely crumble because its CPU and memory would get slaughtered. Again, this is just guessing, but based on the interview, results, and cross-references, I think it's a good start.
 

analog_future

Resident Crybaby
And it's in the same area too. Seems I owe Mibu no ookami and Mr Moose an apology. They were likely correct in their assessment. The reason the 3600+3060 combo performs so poorly could simply be down to the game being the PS5 version, which leverages the memory subsystem, decompression, and CPU more. Not only is the 13600K much faster, it's also paired with DDR5 memory.

I'm also glad because these are the same areas.

Z1phL5i.png

KqR1XMa.png


The fact that the graphics are similar doesn't really matter. It's how the result is achieved. If you tried backporting the PS5 version to the PS4, even with lower graphics, the PS4 would likely crumble because its CPU and memory would get slaughtered. Again, this is just guessing, but based on the interview, results, and cross-references, I think it's a good start.

So this further confirms that Alex did kind of a shit job in his professional analysis, no?
 

Gaiff

SBI’s Resident Gaslighter
So this further confirms that Alex did kind of a shit job in his professional analysis, no?
Absolutely, he really dropped the ball. I kind of assumed he had tried pairing the 3060 with a faster CPU just for good measure and to rule out the rest of the system, but it turns out he probably didn't. His 5090+9800X3D capping out at 85fps is either a bug or due to ReBar because one guy gets like 190fps at 4K DLSS Quality max settings, and Kryz gets over 130fps at 4K DLAA.


Note: ReBar is On in this video, so I'm not sure why Alex only gets 85fps when it should be 80% higher.

I had assumed he had gone out of his way to isolate the different parts, but he really didn't. He threw a high-end system and a budget system together, made the video, a bunch of assumptions that appear to be incorrect (that I assumed were correct as well), and called it a day.
 
Last edited:

analog_future

Resident Crybaby
Absolutely, he really dropped the ball. I kind of assumed he had tried pairing the 3060 with a faster CPU just for good measure and to rule out the rest of the system, but it turns out he probably didn't. His 5090+9800X3D capping out at 85fps is either a bug or due to ReBar because one guy gets like 190fps at 4K DLSS Quality max settings, and Kryz gets over 130fps at 4K DLAA.


Note: ReBar is On in this video, so I'm not sure why Alex only gets 85fps when it should be 80% higher.

I had assumed he had gone out of his way to isolate the different parts, but he really didn't. He threw a high-end system and a budget system together, made the video, a bunch of assumptions that appear to be incorrect (that I assumed were correct as well), and called it a day.


What a shame. Huge blow to his credibility.
 

Gaiff

SBI’s Resident Gaslighter
What a shame. Huge blow to his credibility.
Part of me thinks it might be deliberate. He's trying to muddy the waters and shift the blame from the CPU and I/O to something the devs did wrong with the GPU. He repeatedly mocked the "power of the SSD" on the PS5, and this seems to go against his narrative. If he had used a fast CPU (and an entire fast system, really) and gotten better results, he would have had to admit that the memory+CPU+I/O of a budget PC can in no way keep up with a PS5, validating what he argued against. I also don't believe for a second he wasn't aware of the changes, because he sat down with the developers and they explained the tweaks and improvements in the PS5 version to him. This included enhancements to the decompression algorithms and data streaming.

Maybe I'm being paranoid, but based on other videos, his methodology was extremely amateurish and we know for a fact he knows better. For instance, instead of blaming the entire system the 3060 was running on, he blamed the GPU alone. That's a no-no because he turned a system benchmark into a GPU benchmark and he knows this.
 
Last edited:

MikeM

Member
I agree that there's overhead, but those numbers are insane. An RTX 3060 incapable of doubling base PS4 performance when it isn't that far from a PS5 is a whole other level of overhead. It's like we're back in the late 90s.
Reminds me of how expensive PS emulation on PC is in general. Given that, not surprised that PC requirements vs PS5 are quite a bit heavier. PC doesn’t have the 1 for 1 subsystems that PS5 has.
 

AFBT88

Member
Absolutely, he really dropped the ball. I kind of assumed he had tried pairing the 3060 with a faster CPU just for good measure and to rule out the rest of the system, but it turns out he probably didn't. His 5090+9800X3D capping out at 85fps is either a bug or due to ReBar because one guy gets like 190fps at 4K DLSS Quality max settings, and Kryz gets over 130fps at 4K DLAA.


Note: ReBar is On in this video, so I'm not sure why Alex only gets 85fps when it should be 80% higher.

I had assumed he had gone out of his way to isolate the different parts, but he really didn't. He threw a high-end system and a budget system together, made the video, a bunch of assumptions that appear to be incorrect (that I assumed were correct as well), and called it a day.

Watch the video fully; he's having the same low GPU usage problem, around the 6:10 mark.
 

Gaiff

SBI’s Resident Gaslighter
Watch the video fully; he's having the same low GPU usage problem, around the 6:10 mark.
Thanks, so it's likely a bug with ReBar. I wonder if it has anything to do with PSOs compiling in the background because it doesn't happen in the forest area but in the open section, it does.
 
Last edited:

AFBT88

Member
Thanks, so it's likely a bug with ReBar. I wonder if it has anything to do with PSOs compiling in the background because it doesn't happen in the forest area but in the open section, it does.
I'm 99% sure they will add an option to precompile shaders in the menus, à la Part 1.
 
Last edited:

Mibu no ookami

Demoted Member® Pro™
The only way to get him to praise a Sony console port on PC is if Sony stops being a PlayStation game developer and becomes a PC developer, writing shitty code for the PS5 and well-made code for the PC.

Only if Sony's priority becomes the PC will this idiot stop complaining.

The idiot even complained that Nixxes was making the Horizon Zero Dawn Remaster for PS5 and PC instead of just making ports of PS5 games for PC, as if Sony's priority should be PC ports and not a remaster for PS5 and PC.

Think about it. Sony released TLOUP2 on PC for 50 dollars. Alex is mad because they didn't redo the assets and textures and add in RTGI and Path tracing, but had they done all of that and charged 70 dollars for the game, Alex would be mad that they were charging 70 dollars for a port of a game that released in 2020...

As you said, until Sony starts releasing games Day 1 on PC, developed with PC in mind from Day 1, he's always going to shit on Sony and honestly, even if they did that I think he would still shit on them.

Fundamentally, he's a PCMR fanboy who is wildly frustrated by Sony putting out consoles that are better value than PCs and taking up much of the discussion space. He'd much rather PC be the talk of the town, and he fits in at DF because they're all pretty anti-Sony; most of them have worked for Sega- or Xbox-branded gaming magazines/sites at one point and feel a certain way towards Sony as a result of perceived slights. It doesn't help that Sony doesn't really give them the type of access they'd like (though I think they give them more than they give most people). PlayStation Productions probably has a stronger tie to the influencer community than PlayStation Studios or SIE.

This is why I warned so heavily that people would target the PS5 Pro, because it was obvious. It was inherently a threat to two major groups: PCMR and Xbox. PCMR because video cards were getting more and more expensive, and Xbox because Microsoft wasn't releasing its own mid-gen refresh, with all the ramifications that might have for future hardware. Being expensive, though, it also brought the wrath of PS fanboys who were angry because they couldn't afford one and don't want to be second-class citizens.

DF's coverage of the PS5 Pro has been a sight to behold in biased "journalism." So obviously Alex is going to be furious when the PS5 Pro isn't easily cleared by PC with this port.

Gaiff I used to really respect you and then things got weird, but I'll always respect any man who admits when he was wrong or made a mistake. We're all going to be wrong about something eventually. It takes someone real to step outside themselves and say, "I got this wrong this time." So regardless of recent history, you've gained a lot of respect and kudos from me.

Let's also note that once Gaiff started saying things contrary to what he was saying before, Bojji disappeared completely from this thread.
 
DF forgot to test the real culprit.
"TLOU Part 2 Remastered seems to require over six CPU cores/threads. The game was completely unplayable, due to extreme stutters, on our simulated dual-core and quad-core systems. And although our framerates were higher than 70FPS on our hexa-core CPU, there were numerous stutters. Those stutters were significantly reduced once we moved to a CPU with eight cores."
 

Mibu no ookami

Demoted Member® Pro™
DF forgot to test the real culprit.
"TLOU Part 2 Remastered seems to require over six CPU cores/threads. The game was completely unplayable, due to extreme stutters, on our simulated dual-core and quad-core systems. And although our framerates were higher than 70FPS on our hexa-core CPU, there were numerous stutters. Those stutters were significantly reduced once we moved to a CPU with eight cores."

What do you think the odds are that DF puts out a revised analysis? Yeah, no chance.
 

mansoor1980

Member
My nephew is playing it on a Ryzen 3600 paired with an RX 6600, a mix of high and medium settings at 1080p, and it runs at a locked 60fps. Looks brilliant.
Great port
 

SKYF@ll

Member
Something in the interview made me curious:



Not sure what he means by "the job system was originally constructed to just always use everything, every second". Also, I don't think people care all that much about utilization on PC either; I think they just care about good performance. You can read the rest of the interview for fuller context.
It's based on streaming. So whatever new stuff is streamed in, then the PSO compiles start.
Actually, the system lends itself very well to DirectStorage. We're just using CPU decompression, without GPU decompression.
Certainly using async compute with the PS5, where you know exactly what the hardware is and what things pair well together, and there's less driver in the middle, we've always found it to be a lot more beneficial on consoles than it is on PC, unfortunately.
One thing is the spin locking. That is cheap on the console, but on Windows, that can be very problematic for performance.




It's an interesting article that discusses the strengths and weaknesses of the engine on consoles and PC.
PS5's strengths: single shader/material permutations, shared memory pool, faster async compute, cheaper CPU multi-threading cost (with Kraken decompression + DMA, Tempest 3D Audio)
https://www.eurogamer.net/digitalfoundry-2025-the-last-of-us-part-2-tech-interview
 
I tested the game on my PC (7800X3D + 4080S) and this is definitely not a good port. Some graphical effects are clearly glitched / missing compared to the PS4 version and the image quality is very poor due to excessive sharpening. TAA hides this sharpening filter (because of the strong TAA blur), but DLSS4 shows it clearly. I had to run the game at 2880p and downscale the image to 1440p to finally be happy with the image quality. The image quality in Part 1 was definitely better (even TAA looked better).

The game is playable on my PC, but that's not surprising given the gap in GPU power. My 4080S is OC'd to 60TF, which is around 32x the GPU power of the PS4. I get around 120fps on average at 1440p native TAA and around 140-160fps with DLSS Quality, but performance can sometimes dip to 80fps and I can see stutters when slowly panning the camera. I'm guessing the CPU is decompressing something in the background. I would need to lock the framerate to 60fps to get a smooth, stutter-free experience all of the time. Performance scaling with DLSS FG is also strange: FGx2 improved the framerate from 120fps to just 140fps, which means the base frame rate with FGx2 enabled dropped from 120fps to just 70fps.
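
Spelling out that FG arithmetic (assuming FGx2 simply doubles whatever the game actually renders):

$$\text{rendered fps} = \frac{\text{displayed fps}}{2} = \frac{140}{2} = 70, \qquad \text{so enabling FG cost } \frac{120 - 70}{120} \approx 42\% \text{ of the base frame rate.}$$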

The RTX 3060 (6x the GPU power of the PS4) used to run PS4 ports like a dream, but in TLOU2 this GPU is only good for around 50fps at 1080p with DLSS Quality (so not even native 1080p).

9h04DYQ.jpeg


I could understand why the TLOU1 remake was very demanding on PC: it was a PS5 game that took full advantage of the decompression chip built into that console. TLOU2, however, is a PS4 game. This is how PS4 ports should run on an RTX 3060 on PC: 1440p instead of 1080p, a much higher framerate, and much higher settings on top of that.

death-stranding-2560-1440.png


detroit-become-human-2560-1440.png



The PS4 could barely run The Witcher 3, with heavily downgraded graphics, yet the RTX 3060 had no problem running the game with maxed-out settings.

the-witcher-3-2560-1440.png
 
Last edited:

Gaiff

SBI’s Resident Gaslighter
I tested the game on my PC (7800X3D + 4080S) and this is definitely not a good port. Some graphics effects are clearly glitched / missing compared to the PS4 version, and the image quality is very poor because of excessive sharpening. TAA hides that sharpening filter (because of the strong TAA blur), but DLSS4 clearly reveals it. I had to run this game at 2880p and downscale the image to 1440p to finally be happy with the image quality. The image quality in Part 1 was definitely better (even TAA looked better).
This is definitely something worth discussing.
The game is playable on my PC, but that's not surprising given the gap in GPU power. My 4080S is OC'd to 60TF, which is around 32x the GPU power of the PS4. I get around 120fps on average at 1440p native TAA and around 140-160fps with DLSS Quality, but performance can sometimes dip to 80fps and I can see stutters when slowly panning the camera. I'm guessing the CPU is decompressing something in the background. Performance scaling with DLSS FG is also strange: FGx2 improved my framerate from 120fps to just 140fps, which means the base framerate with FGx2 enabled was just 70fps.
And this doesn't make sense. Your 4080S isn't 32x the GPU performance of a PS4. What did we say about using TFLOPs to determine total GPU horsepower?

Justin Timberlake What GIF


32x the GPU power of the PS4. You don’t seriously believe that. It's also incorrect within this context anyway. Chop that in half and you're much closer to reality. Your drops to 80fps could be caused by some fuckery with ReBar. Not 100% sure, but this is somewhat addressed here:


The RTX 3060 used to run PS4 ports like a dream (6x the GPU power of the PS4), but in TLOU2 the framerate is just around 50fps even at 1080p with DLSS Quality.

I could understand why the TLOU1 remake was very demanding on PC: it was a PS5 game that took full advantage of the decompression chip built into that console. TLOU2, however, is a PS4 game. This is how PS4 ports should run on an RTX 3060 on PC: 1440p instead of 1080p, a much higher framerate, and much higher settings on top of that.
Except based on the interview, that's very likely exactly what is going on. The PS4 version running on PS5 doesn't use it, but the PS5 port does. They re-engineered several parts of the game and there's a lot more data streaming going on, and on PS5 that isn't handled by the CPU. As to why they did that when it was fine on the PS4, I'm not sure. Is there more pop-in or significantly worse LOD on the PS4? It could be one reason.
The PS4 could barely run The Witcher 3, with heavily downgraded graphics, yet the RTX 3060 had no problem running the game with maxed-out settings.
The clip you posted is max settings, and in this game that's quite a bit more demanding than PS4 optimized settings. The video actually tests "optimized" settings, and they are very close to what DF recommends for the PS4 Pro.

KqR1XMa.png


~75fps in one of the heaviest areas in the game with DLAA, which costs (I think) about a ~10% performance penalty compared to the TAA used on PS4. You're probably looking at around 2.5-2.8x better performance than on a PS4. That's significantly lower than the expected result based on specs alone, but Alex was completely incorrect that the 3060 was responsible for the poor performance. It was likely the 3600 and the rest of the system dragging it down. For someone with a platform like his, not validating those findings is unacceptable. He should have tested the 3060 with a more powerful CPU to confirm whether or not the 3600 was the limiting factor. If it wasn't, then that means an entire budget build with a weak Zen 2 CPU won't be able to double your performance even with an RTX 3060. You need a relatively decent CPU to keep up with the memory/IO/CPU demands.

The Witcher 3 is a different animal; it's a PC-first game. TLOU Part I and Part II leverage the PS5 in unique ways, and I think in certain respects they expose the inefficiencies of the PC architecture, especially when it comes to decompression and data streaming. Do I think the port is as good as it can be? No, but it seems decent enough. There's only so much you can ask of a team that has to completely redo a game and port it to a totally different system in 15-16 months.

It doesn't help that Sony doesn't really give them the type of access they'd like (though I think they give them more than they give most people). PlayStation Productions probably has a stronger tie to the influencer community than PlayStation Studios or SIE.
Don't they? They interviewed Nixxes more than any other studio. They sat down with ND people for both TLOU ports. They chatted with Jetpack Interactive tasked with the GOWR port. They got behind the scenes footage of the Pro and got to talk with Polyphony and other third-party devs like Codemasters. They obtained exclusive access and personally interviewed Mark Cerny for half an hour. PlayStation has probably given them more exclusive access than Xbox the past year or so. I don't think they can get much more than this unless they want a tour of the facilities with the production lines or something.
 
Last edited:

yamaci17

Gold Member
~75fps in one of the heaviest areas in the game with DLAA, which costs (I think) about a ~10% performance penalty compared to the TAA used on PS4. You're probably looking at around 2.5-2.8x better performance than on a PS4.
There should be another ~5% performance penalty from Nvidia Reflex (since it reduces GPU-bound latency by keeping GPU utilization slightly below maximum).

And of course, thanks to that, you get a 1.5-2x more responsive experience at similar framerates compared to the vsync modes on consoles, so it's worth enabling (irrelevant, I know, but I just wanted to point it out).

So someone playing at ultra settings at 50fps will still get a more responsive experience than the locked 60fps vsync mode on PS5.

If the PS5 supported FreeSync Premium and more games supported VRR-unlocked modes, I wouldn't care much about this, though. I'm not going to get a new screen just because the PS5 requires it. AC Shadows, for example, does not have a VRR-unlocked mode. Why? No one knows.
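
To put rough numbers on that 1.5-2x claim (the frame counts here are my ballpark assumptions, not measurements): treat console vsync as roughly a three-frame input-to-photon chain and an uncapped Reflex pipeline as roughly one and a half frames. Then:

$$60\,\text{fps vsync: } \approx 3 \times \tfrac{1000}{60}\,\text{ms} \approx 50\,\text{ms}, \qquad 50\,\text{fps + Reflex: } \approx 1.5 \times \tfrac{1000}{50}\,\text{ms} = 30\,\text{ms},$$

which lands in that 1.5-2x range.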
 
Last edited:

TrebleShot

Member
Runs great on my 7800x3d / 5090 but it bloody should do.

More surprisingly, it runs great and looks great on the Ally X.
 

SKYF@ll

Member
There should be another ~5% performance penalty from Nvidia Reflex (since it reduces GPU-bound latency by keeping GPU utilization slightly below maximum).

And of course, thanks to that, you get a 1.5-2x more responsive experience at similar framerates compared to the vsync modes on consoles, so it's worth enabling (irrelevant, I know, but I just wanted to point it out).

So someone playing at ultra settings at 50fps will still get a more responsive experience than the locked 60fps vsync mode on PS5.

If the PS5 supported FreeSync Premium and more games supported VRR-unlocked modes, I wouldn't care much about this, though. I'm not going to get a new screen just because the PS5 requires it. AC Shadows, for example, does not have a VRR-unlocked mode. Why? No one knows.
If you're concerned about input lag, simply enable VRR on the PS5(Pro).
It's unfortunate that Sony won't allow freesync output on the PS5, but those who purchase a PS5-compatible HDMI 2.1 TV will reap the benefits.

PS5 Perf Mode: 1440p / 75-100fps
PS5 Pro Perf Mode: 1440p / 100-120fps
PS5 Pro Mode: PSSR 4K (1440p) / 70-80fps
 

yamaci17

Gold Member
If you're concerned about input lag, simply enable VRR on the PS5(Pro).
It's unfortunate that Sony won't allow freesync output on the PS5, but those who purchase a PS5-compatible HDMI 2.1 TV will reap the benefits.

PS5 Perf Mode: 1440p / 75-100fps
PS5 Pro Perf Mode: 1440p / 100-120fps
PS5 Pro Mode: PSSR 4K (1440p) / 70-80fps
I get those benefits with a FreeSync Premium screen in all games;
that's the problem.
I'm not saying 30/60fps vsync on consoles is unplayable anyway. I'm just saying that you get a more responsive experience, that's all. That's not to say the PS5 isn't responsive or anything; it's just a bit difficult for me to get used to, but that's on me.

The 3060 is a budget GPU. If someone can get themselves an HDMI 2.1 TV, chances are they wouldn't have a 3060 to begin with. Most people with GPUs similar to the 3060 have a FreeSync screen, and they just get a more responsive experience without having to buy a new one. I hope I'm being clear.
 
32x the GPU power of the PS4. You don’t seriously believe that.

Justin Timberlake What GIF


Why are you still using FLOPs as a measure of total GPU performance?
The FLOPS difference does not always translate into an exact difference in frame rate, but this metric does give an idea of GPU shading performance, especially if the GPU architecture is the same.

The PS4 GPU is based on the GCN 1.1 architecture and has 1.84TF (specs somewhere between a Radeon 7870 and 7850). The RTX 3060 chip is based on a totally different architecture (Ampere) and has 12.74TF (a 6.9x FLOPS difference). According to the TechPowerUp benchmark of The Witcher 3, the RTX 3060 pushed around 315 million pixels per second (1440p at 86fps), which is 5x more than the PS4 (62 million pixels per second at 1080p/30fps). Taking into account that the RTX 3060 was also running The Witcher 3 at max settings, while the PS4 ran at medium/high settings, I would say the real performance difference was much closer to the theoretical FLOPS difference of 6.9x.
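
For anyone checking, those pixel-throughput figures are just width × height × framerate:

$$1920 \times 1080 \times 30 \approx 6.2 \times 10^{7}\ \text{px/s}, \qquad 2560 \times 1440 \times 86 \approx 3.2 \times 10^{8}\ \text{px/s}, \qquad \text{ratio} \approx 5.1\text{x}.$$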

In the screenshot you posted, the RTX 3060 gets 75fps at 1080p with optimized settings. I doubt these "optimized" settings are comparable to the base PS4 (I saw missing details even at maxed-out settings compared to my PS4 version), but let's assume we are looking at PS4-like settings. The PS4 version probably averages around 35-40fps without a framecap (if the game maintains 30fps 99.9% of the time, it has to be able to run at a noticeably higher framerate than 30fps), so we are looking at a 2x difference in framerate between the PS4 GPU at 35-40fps and the RTX 3060 at 75fps. I don't think that's good scaling when we consider the 6.9x difference in shading power between the PS4 and the RTX 3060.

As for my RTX 4080S, it has around 60TF (depending on GPU clocks). That's a 32x difference in shading power compared to the PS4 (1.84TF), but Ada Lovelace is a completely different architecture. Even compared with Ampere, Ada's shading power doesn't scale linearly (Ada Lovelace shader cores are slower compared to Ampere's, but way more power efficient). That being said, I see around 21x PS4 scaling at 1080p (without DLSS), and the settings I used (high draw distance) were even higher than the PS4's, because you can't match PS4 settings exactly without INI tweaks.


witcher3-2025-04-06-12-19-52-859.jpg


witcher3-2025-04-06-12-29-24-994.jpg


witcher3-2025-04-06-12-31-28-620.jpg


So based on this comparison let's assume my RTX4080S is only 21x faster than the PS4 (instead of 32x in theory).

My RTX 4080S can run TLOU2 at 1440p at 120fps. That translates into 440 million pixels per second, so we're looking at a 7x performance difference compared to the PS4 (62 million pixels per second), meaning my RTX 4080S in this particular game performs comparably to a 12.8TF GCN 1.1 GPU :D. You're right, there's nothing wrong with this port; we should find an excuse for everything and just enjoy playing this awesome PS4 game at 60fps.

Why 60fps? My CPU (7800X3D) is sometimes decompressing data in the background and the framerate can dip to around 80fps. To avoid the stuttering, I need to lock the framerate at 60fps. Terra Ware noticed the same problem in this video on his 9800X3D CPU and RTX 4090, so he also locked the framerate to 60fps:



Sony once said that an 8TF GPU is needed to run PS4 games at native 4K, but games like TLOU2 suggest it actually takes 60TF to do that :D. I get around 60fps at native 4K in this game.

I don't know why you guys are defending this port so much, but I'm not happy with the performance. I know my PC can run PS4-era games a lot better; for example, RE3 Remake runs at 150-200fps at native 4K with maxed-out settings (including RT) on my PC.
 
Last edited:

Bojji

Member
The FLOPS difference does not always translate into an exact difference in frame rate, but this metric does give an idea of GPU shading performance, especially if the GPU architecture is the same.

The PS4 GPU is based on the GCN 1.1 architecture and has 1.84TF (specs somewhere between a Radeon 7870 and 7850). The RTX 3060 chip is based on a totally different architecture (Ampere) and has 12.74TF (a 6.9x FLOPS difference). According to the TechPowerUp benchmark of The Witcher 3, the RTX 3060 pushed around 315 million pixels per second (1440p at 86fps), which is 5x more than the PS4 (62 million pixels per second at 1080p/30fps). Taking into account that the RTX 3060 was also running The Witcher 3 at max settings, while the PS4 ran at medium/high settings, I would say the real performance difference was much closer to the theoretical FLOPS difference of 6.9x.

In the screenshot you posted, the RTX 3060 gets 75fps at 1080p with optimized settings. I doubt these "optimized" settings are comparable to the base PS4 (I saw missing details even at maxed-out settings compared to my PS4 version), but let's assume that's the case. The PS4 version probably averages around 35-40fps without a framecap (if the game maintains 30fps 99.9% of the time, it needs to run at a noticeably higher framerate than a locked 30fps), so we are looking at a 2x difference in framerate between the PS4 GPU at 35-40fps and the RTX 3060 at 75fps. I don't think that's good scaling when we consider the 6.9x difference in shading power between the PS4 and the RTX 3060.

As for my RTX 4080S, it has around 60TF (depending on GPU clocks). That's a 32x difference in shading power compared to the PS4 (1.84TF), but Ada Lovelace is a completely different architecture. Even compared with Ampere, shading performance does not scale linearly (Ada Lovelace shader cores are slower compared to Ampere's, but way more power efficient). That being said, I see around 21x PS4 scaling at 1080p (without DLSS), and the settings I used (high draw distance) were even higher than the PS4's, because you can't match PS4 settings exactly without INI tweaks.


witcher3-2025-04-06-12-19-52-859.jpg


witcher3-2025-04-06-12-29-24-994.jpg


witcher3-2025-04-06-12-31-28-620.jpg


So based on this comparison let's assume my RTX4080S is only 21x faster than the PS4 (instead of 32x in theory).

My RTX 4080S can run TLOU2 at 1440p at 120fps. That translates into 440 million pixels per second, so we're looking at a 7x difference compared to the PS4 (62 million pixels per second), meaning my RTX 4080S in this particular game performs comparably to a 12.8TF GCN 1.1 GPU :D. You're right, there's nothing wrong with this port; we should find an excuse for everything and just enjoy playing this PS4 game at 60fps.

Why 60fps? My CPU (7800X3D) is sometimes decompressing data in the background and the framerate can dip to around 80fps. To avoid the stuttering, I need to lock the framerate at 60fps. Terra Ware noticed the same problem in this video on his 9800X3D CPU and RTX 4090, so he also locked the framerate to 60fps:



Sony once said that you need an 8TF GPU to run PS4 games at native 4K, but games like TLOU2 suggest it actually takes 60TF to do that :D. I get around 60fps at native 4K in this game.

I don't know why you guys are defending this port so much, but I'm not happy with the performance. I know my PC can run PS4-era games a lot better; for example, RE3 Remake runs at 150-200fps at native 4K with maxed-out settings (including RT) on my PC.



Yep. This port is a joke.

If this game was ported from the PS4 version, then the CPU requirements should be tiny; the game doesn't use any decompression hardware on PS4 (it doesn't exist there), just the 1.6GHz Jaguar. CPU demands are the only real difference between the two; GPU requirements should be close between a PS4 port and a PS5 port. The PS5 version runs pretty much the same as the PS4 version running on PS5 (maybe faster thanks to direct access to RDNA IPC).

Without DirectStorage, the PS5 port should have the same CPU requirements as a PS4 port. The game just runs much slower than it should, and that's it.
 

Gaiff

SBI’s Resident Gaslighter
I dont know why you guys are defending this port so much, but I'm not happy with performance. I know my PC can run PS4 era games a lot better, for example RE3 Remake runs at 150-200fps at 4K native with maxed out settings (including RT) on my PC.
Where exactly do you see me defending this port? I said repeatedly that the GPU scaling is whack relative to PlayStation. What I take issue with is Alex's analysis that came to the wrong conclusions regarding where the deficiencies are.

I never said there's nothing wrong with this port...I even highlighted where some of the problems are whereas DF seems to have missed the mark. As for the TFLOPs analysis you've been going on about, it's nonsensical.

Here:

2ZIUUab.png


The 6700 is roughly equivalent to the PS5 GPU, and the 4080S is in general 2.5x faster. You seem to be getting around double the performance. That's significantly below expectations for a variety of reasons, but your point about your GPU being 60 TFLOPS is wholly irrelevant.

The best I said about this port is that it's decent and there's only so much they could have done in 15-16 months to get it up to speed on PC.
 
Last edited:

Mibu no ookami

Demoted Member® Pro™
Don't they? They interviewed Nixxes more than any other studio. They sat down with ND people for both TLOU ports. They chatted with Jetpack Interactive tasked with the GOWR port. They got behind the scenes footage of the Pro and got to talk with Polyphony and other third-party devs like Codemasters. They obtained exclusive access and personally interviewed Mark Cerny for half an hour. PlayStation has probably given them more exclusive access than Xbox the past year or so. I don't think they can get much more than this unless they want a tour of the facilities with the production lines or something.

That's actually my point. I think they get "enough" access, but it is definitely less than they get from other companies, and I also think it's why Sony went with CNET over DF for the PS5 Pro reveal. You can tell they're butthurt over it when they don't have answers to questions. When it comes to Sony, most news filters through leakers rather than official channels like DF.

Sony will give DF about as much access as it takes to maintain relations with the community, but there is definitely an arm's-length situation there. Sony's just not that type of company; they're pretty secretive and opaque.

DF is also heavily sponsored by Nvidia and has very close ties to Microsoft.
 

spons

Gold Member
Judging from the benchmarks this would run fine at 1080p on my 5600x/RX 7700 XT. That's a mid-range shitbox at this point (or any point in time really).
I don't see the problem.
 

Bojji

Member
Judging from the benchmarks this would run fine at 1080p on my 5600x/RX 7700 XT. That's a mid-range shitbox at this point (or any point in time really).
I don't see the problem.

Based on the system requirements, it should run at 1440p/60fps.

The sharpening forced by ND is horrible; the game needs a mod to remove it:

1.png


I see it in the PS4 version played on PS5 as well (but they amplified it in the PS5/PC port)...
 
Last edited:

kevboard

Member
Judging from the benchmarks this would run fine at 1080p on my 5600x/RX 7700 XT. That's a mid-range shitbox at this point (or any point in time really).
I don't see the problem.

Because this should run at 1080p/60fps on a Ryzen 2400 and a GTX 1060, given that it doesn't look any better at the recommended settings than the original PS4 version... that's the problem.
 
Last edited:
Where exactly do you see me defending this port? I said repeatedly that the GPU scaling is whack relative to PlayStation. What I take issue with is Alex's analysis that came to the wrong conclusions regarding where the deficiencies are.

I never said there's nothing wrong with this port...I even highlighted where some of the problems are whereas DF seems to have missed the mark. As for the TFLOPs analysis you've been going on about, it's nonsensical.

Here:

2ZIUUab.png


The 6700 is roughly equivalent to the PS5 GPU, and the 4080S is in general 2.5x faster. You seem to be getting around double the performance. That's significantly below expectations for a variety of reasons, but your point about your GPU being 60 TFLOPs is wholly irrelevant.

The best I said about this port is that it's decent and there's only so much they could have done in 15-16 months to get it up to speed on PC.
Maybe there's a language barrier between us, but to me "decent" means something good, and from what I've seen, this port is far from good, and even pretty fu$% far from OK.


BopjKGH.jpeg



Glitches, missing effects, excessive sharpening, poor DLSS FG scaling, very high requirements relative to the PS4 hardware, and the game needs to run with a 60fps lock to avoid stutters even on high-end CPUs. John from DSOG suggests that the game stutters even more on 6-core CPUs. But it could always be worse, right? In this case, let's just be happy that the game runs on PC at all :messenger_winking_tongue:.

I don't want to compare results on my PC to the PS5, because TLOU2 is not a game built for PS5 hardware. TLOU2 was only ported to the PS5, but that doesn't mean I should ignore the fact that the same game runs on the PS4. That fact alone tells me it should be possible to port TLOU2 to PC with much better results, if only the developers put more effort and care into the port. Sixteen months is a lot of time for a group of talented developers, and it's not like the PS4 architecture required them to rebuild the entire engine from scratch just to run the game on PC. What we have is a port of a PS5 port, and that's the reason TLOU2 on PC has PS5-like requirements.

You said a GPU power comparison based on the FLOPS metric made no sense, yet I showed examples of PS4 ports running like a dream even on the RTX 3060. The RTX 3060 has around 7x the shading power of the PS4 GPU and could easily run very demanding PS4 ports such as Detroit: Become Human. At maxed-out settings, Detroit: Become Human looked a lot better than the PS4 version, which ran at 25-30fps, yet the RTX 3060 was still pushing 5x as many pixels (1440p at 80fps). The difference in performance was close to the theoretical difference in shading performance, proving that the FLOPS metric is not as useless as you try to make it out to be.
 
Last edited: