Digital Foundry: The Last of Us Part 2 PC Review - We're Disappointed - Analysis + Optimised Settings

Clear

CliffyB's Cock Holster
That's actually my point. I think they get "enough" access, but it is definitely less than they get from other companies, and I also think it's why Sony went with CNET over DF for the PS5 Pro reveal. You can tell that they're butthurt over it when they don't have answers to questions. Most news filters through leakers rather than official channels like DF when it comes to Sony.

Sony will give DF about as much access as they want in order to promote relations with the community, but there is definitely an arms-length type of situation with them. Sony's just not that type of company; they're pretty secretive and opaque.

DF is also heavily sponsored by Nvidia and has very close ties to Microsoft.

Access isn't the problem, DF just aren't very good.

Their only expertise is as PC benchmarkers. Which is to say, they switch components in and out on otherwise like-for-like systems and make comparisons. A level of analysis wholly unsuited to judging system performance on non-PC systems, because it fails to account for differences in software stack and I/O pipeline.

They aren't coders, and so they have minimal actual technical experience outside of that which is spoon-fed to them by "friendly" partners. Partners who basically use them as marketing/

Sorry, but if people haven't realized their limitations by now, I really do not know what to say!
 

dgrdsv

Member
ReBAR is disabled for TLOUP2 by default in Nvidia's driver, so if anyone is getting a speed-up from that then they've force-enabled it previously.
I definitely don't get anything above 100 fps on my 4090 and I didn't do anything with ReBAR.
These YT "magic performance solutions" are bullshit almost every time.

Sharpening forced by ND is horrible
I almost feel like ND consider this a "special grittiness filter" more than just a sharpening one. Which explains their insistence on keeping it on all the time.
 

kevboard

Member
Access isn't the problem, DF just aren't very good.

Their only expertise is as PC benchmarkers. Which is to say, they switch components in and out on otherwise like-for-like systems and make comparisons. A level of analysis wholly unsuited to judging system performance on non-PC systems, because it fails to account for differences in software stack and I/O pipeline.

They aren't coders, and so they have minimal actual technical experience outside of that which is spoon-fed to them by "friendly" partners. Partners who basically use them as marketing/

Sorry, but if people haven't realized their limitations by now, I really do not know what to say!

you don't need any programming knowledge to understand that this is a bad port.

neither do you need programming knowledge to know that a PS4 quality game should only need around a GTX1060 level GPU to reach 60fps at PS4 quality settings.


edit: to put the GTX1060 into perspective when it comes to relative performance to a PS4,
here's a game that ran at 1080p 30fps on PS4, and 900p 30fps on Xbox One.
it is important to have the Xbox One version mentioned here as well: it running at 900p indicates that the PS4 is indeed probably GPU limited and isn't held at 30fps solely by the CPU, since the typical ~40% resolution difference was needed on Xbox One to keep the settings on par with the PS4.
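(To spell out that ~40% figure, here's a quick back-of-the-envelope using the standard pixel counts; the snippet is just the arithmetic, nothing more:)

```cpp
// Quick arithmetic behind the "typical ~40% resolution difference".
#include <cstdio>

int main() {
    const double ps4_pixels  = 1920.0 * 1080.0; // 1080p on PS4
    const double xbox_pixels = 1600.0 *  900.0; // 900p on Xbox One
    // PS4 pushes ~44% more pixels per frame at the same settings,
    // which is roughly the raw GPU gap between the two consoles.
    std::printf("PS4 renders %.0f%% more pixels than Xbox One\n",
                (ps4_pixels / xbox_pixels - 1.0) * 100.0);
}
```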

so again, here is what it looks like when you run a 1080p 30fps PS4 game at 1440p, near max settings, on a 4th gen Intel i5 and a GTX1060:


if you skip to 6:45 you can see it running above 30fps at native 4K.
this is what a good PC port looks like.
 
Last edited:

dgrdsv

Member
Judging from the benchmarks this would run fine at 1080p on my 5600x/RX 7700 XT. That's a mid-range shitbox at this point (or any point in time really).
I don't see the problem.
Well, the problem is that the port was done in a budget-minimizing way, but somehow ND/Nixxes are trying to paint it as a great port, which it isn't.
The only reason it's running better than the TLOUP1 port did is that it's a PS4 game while the latter was a PS5 remake.
Most of the issues which we've seen in ND's engine on PC, even back in the UC port, are still with us in TLOUP2, which is a bit insane if you consider how much time was (or at least could have been) spent on these ports in total.
Sony's approach of spending as little as possible on porting their games to PC has been the reason for all of their porting issues over the last few years.
If they plan to continue with that, then they have to seriously reconsider their approach to game development and pivot to multiplatform/multitarget from the start instead of relying on porting a 100% console product later.
Or cough up the dough needed to make proper ports.
Otherwise we'll just continue to see these disappointing releases which perform tens of percent worse than they should on PC h/w.
 

Gaiff

SBI’s Resident Gaslighter
Maybe there's a language barrier between us, but to me "decent" means something good, and from what I've seen, this port is far from good, and even pretty fu$% far from OK.
By "decent", I just mean okay or acceptable, a step below good, but also a step above bad.

1. Horrible
2. Bad
3. Decent
4. Good
5. Great

Access isn't the problem, DF just aren't very good.

Their only expertise is as PC benchmarkers. Which is to say, they switch components in and out on otherwise like-for-like systems and make comparisons. A level of analysis wholly unsuited to judging system performance on non-PC systems, because it fails to account for differences in software stack and I/O pipeline.

They aren't coders, and so they have minimal actual technical experience outside of that which is spoon-fed to them by "friendly" partners. Partners who basically use them as marketing/

Sorry, but if people haven't realized their limitations by now, I really do not know what to say!
Even if they were programmers, it's not like they have the profiling tools to actually know what's going on. In that sense, the best they can do is provide a surface-level analysis like the rest of us. They're not privy to proprietary software or a dev environment that allows them to truly see what's happening under the hood, so saying they aren't coders as if it were relevant is puzzling to me. A programmer wouldn't do a much better job. It's not like they could tell you exactly why a PC GPU is performing at 50% of its hardware specification compared to a PS GPU. All they'd do is educated guesswork, just like DF.

What DF failed at in this particular case is the approach and methodology.
 
Last edited:

Clear

CliffyB's Cock Holster
you don't need any programming knowledge to understand that this is a bad port.

It really helps. Because if by your own admission you don't have a clue what you're talking about, it's hard to take your opinion seriously.

Different methodologies have different stress points, with the overall pipeline optimally being geared to its host environment. However, if the environment changes, then sometimes major re-architecting is needed to accommodate restrictions imposed by the new host. The classic example is UE3's reliance on unified memory, which made direct ports from Xbox 360 to PS3 typically suffer massive performance hits. Case in point: Bayonetta, a game which wasn't doing anything extreme on 360 yet ran like utter shit on PS3, a platform capable of way more graphically intensive titles doing superficially similar workloads.

The point being: sometimes the way a thing was written in the first place makes for less performant ports than other titles specced for the exact same hardware.

This cannot be predicted based on the graphics onscreen - it's literally "judging a book by its cover"! And just as unintelligent.

neither do you need programming knowledge to know that a PS4 quality game should only need around a GTX1060 level GPU to reach 60fps at PS4 quality settings.

It's a port of the PS5 build because that's the most recent and the one with the most commonality with the PC version. So judging based on the uplift seen on PS5, and perhaps even more accurately PS5 Pro (as that uses upscaling, VRR, and the usual stuff PC folks demand), makes vastly more sense than going back to native PS4 builds.

And if you do that, you start to see that going from a 4.2tf GPU (PS4 Pro) to a 16.7tf GPU (PS5 Pro) is actually less impactful than you might expect.
 
Last edited:

kevboard

Member
It's a port of the PS5 build because that's the most recent and the one with the most commonality with the PC version. So judging based on the uplift seen on PS5, and perhaps even more accurately PS5 Pro (as that uses upscaling, VRR, and the usual stuff PC folks demand), makes vastly more sense than going back to native PS4 builds.

And if you do that, you start to see that going from a 4.2tf GPU (PS4 Pro) to a 16.7tf GPU (PS5 Pro) is actually less impactful than you might expect.

it is entirely irrelevant which version it's a port of.
if you need anything above a GTX1060 to run a game at 60fps that verifiably only reaches PS4 quality, and in parts actually falls below PS4 quality due to mismatches in shadow draw distance, then that is absolutely not a good port.

especially given that the PS5 version barely improved on the graphics of the PS4 version, AND given that the performance of the PS5 version falls in line with expectations set by the PS4 Pro version.
the PS5 version doubled performance, increased foliage draw distance, improved texture filtering, and slightly raised shadow resolution.

so the PS4 falls in line with the PS4 Pro, and the PS5 falls in line with both and with what we'd expect.
the PC version doesn't fall in line with the performance we should expect
 

simpatico

Member
The port is absolutely fine for people buying the game to play it. Runs like a champ. What reasonable system doesn’t get 60fps? I’m not seeing performance backlash outside of benchmarking conversations. Steam reviews are pretty clean
 

Hensen Juang

Neo Member
Am I? You're the one who started with the disrespectful callout


^That's my OP. Where's the disrespectful callout? I'll tell you where: it's all in your head. Unfortunately I stepped into an already heated conversation you were having with a different user, and you snapped at me because you were in the mindset of having to prove a point. I just provided additional context. I wasn't attacking you, I wasn't trying to prove you wrong: I just extended the context, nothing more, nothing less. This is about the "coding to the metal" thingy.

and then went on, "lol, you don't even know the difference between GNM and GNMX" when I clearly explained the reason why I named both

You explained that after my OP. You clearly explained the reason, after my post. Relevant detail. Was the "disrespectful callout" here? Let's see:

Putting GNM and GNMX in the same sentence, interchangeably, it's really... something. I envy the level of confidence.

Do you call ^that disrespectful? If so, that's very thin-skinned.

Now, I get the fact that you're a regular here, while I am the new kid; you're the resident gaslighter and I can clearly see why, but don't you try to spin your BS accusations on me. Wrong guy. If you need to act like you know everything and you're always right, then you and I are not going to have a good time. That said, I'd like to avoid conflict. I avoided conflict yesterday, I've been lurking the boards for a long time, and I often find myself agreeing with you. So let's try to make it work.

He is not being aggressive; it is you who is escalating with the condescending tone.

Read the above.

Now, nobody who even has passive knowledge of game development would dispute that, given a similar spec, a console will perform better than its PC counterpart if the same game is made specifically for it. The point of contention for TLOUII is that it exhibits an abnormal disparity not seen in games such as HFW or Rift Apart.

Genuine question, I mean no disrespect, I'm not trying to call you out, hands up: why do you keep comparing different games, made by different studios, running on different engines? Sure, there can be a ballpark, sort of, but still.

Part of me thinks it might be deliberate. He's trying to muddy the waters and shift the blame from the CPU and I/O to something the devs did wrong with the GPU. He repeatedly mocked the "power of the SSD" on the PS5, and this seems to go against his narrative. If he had used a fast CPU (and entire system, really) and gotten better results, he would have had to admit that the memory+CPU+I/O on a budget PC in no way can keep up with a PS5, validating what he argued against. I also don't believe for a second that he wasn't aware of the changes, because he sat down with the developers and they explained the tweaks and improvements in the PS5 version to him. This included enhancements to the decompression algorithms and data streaming.

Maybe I'm being paranoid, but based on other videos, his methodology was extremely amateurish and we know for a fact he knows better. For instance, instead of blaming the entire system the 3060 was running on, he blamed the GPU alone. That's a no-no because he turned a system benchmark into a GPU benchmark and he knows this.

You're not being paranoid, you're 100% spot on. Also this review was clearly rushed.

Alex is mad because they didn't redo the assets and textures and add in RTGI

The moron did actually complain that they didn't add RTGI. Dude is tunnel-visioned beyond repair.

It doesn't help that Sony doesn't really give them the type of access they'd like (though I think they give them more than they give most people). PlayStation Productions probably has a stronger tie to the influencer community than PlayStation Studios or SIE.
I think they get "enough" access, but it is definitely less than they get from other companies, and I also think it's why Sony went with CNET over DF for the PS5 Pro reveal. You can tell that they're butthurt over it when they don't have answers to questions. Most news filters through leakers rather than official channels like DF when it comes to Sony.

Sony will give DF about as much access as they want in order to promote relations with the community, but there is definitely an arms-length type of situation with them. Sony's just not that type of company; they're pretty secretive and opaque.

DF is also heavily sponsored by Nvidia and has very close ties to Microsoft.

100% spot on.
 
Bit of a bummer that it looks worse than Part 1 but I guess that makes sense as it was made first. Runs absolutely fine though. Ironically I recently played Uncharted 2 and the snow looks better in that game than it does here - the walking in snow animation is better too. Just about everything about Uncharted 2 seems better than TLOU2 but I guess it's early days.

Also I can't believe who is voicing Abby - the same woman who does Mary Jane and Aloy. I swear she plays the most hateable characters around - but it's even worse here because it doesn't fit the character at all.
 

AZRoboto

Neo Member
Bit of a bummer that it looks worse than Part 1 but I guess that makes sense as it was made first. Runs absolutely fine though. Ironically I recently played Uncharted 2 and the snow looks better in that game than it does here - the walking in snow animation is better too. Just about everything about Uncharted 2 seems better than TLOU2 but I guess it's early days.

Also I can't believe who is voicing Abby - the same woman who does Mary Jane and Aloy. I swear she plays the most hateable characters around - but it's even worse here because it doesn't fit the character at all.
Ashly Burch does Aloy. Laura Bailey does Abby and MJ.
 

Clear

CliffyB's Cock Holster
it is entirely irrelevant which version it's a port of.

Wrong, it's very relevant in this case.

As I mentioned in a previous post, asset layout and streaming were completely revamped between PS4 and PS5. Which is a key area, because it doesn't matter how fast your GPU can process triangles if the main throttle point is pulling in the source data, swizzling it into VRAM, and then drawing it.

This specific part of the draw pipeline is where PC seems to be getting stressed hardest, judging from recent titles like MH Wilds. Illustrating my point about what happens when host architectures have different strengths and weaknesses.
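To make the throttle-point idea concrete, here's a toy model (all numbers invented, not measurements of this port): the effective streaming rate is the minimum across the stages, no matter how fast the GPU can draw.

```cpp
// Illustrative three-stage streaming pipeline: disk read -> CPU
// decompress/swizzle -> PCIe upload into VRAM. The slowest stage caps
// the whole pipeline. A console with hardware decompression effectively
// removes the middle stage. All throughput figures are hypothetical.
#include <algorithm>
#include <cstdio>

int main() {
    const double read_mbs       = 3500.0; // NVMe sequential read
    const double decompress_mbs =  900.0; // CPU decompress + swizzle
    const double upload_mbs     = 6000.0; // PCIe copy into VRAM
    const double effective = std::min({read_mbs, decompress_mbs, upload_mbs});
    std::printf("effective streaming rate: %.0f MB/s (CPU-bound here)\n",
                effective);
}
```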
 
The port is absolutely fine for people buying the game to play it. Runs like a champ. What reasonable system doesn’t get 60fps? I’m not seeing performance backlash outside of benchmarking conversations. Steam reviews are pretty clean
It depends on your expectations. If people are happy with just 60fps and don't mind small glitches, they will be happy with the TLOU2 port on PC. I, however, would like to see performance similar to Uncharted 4's. The U4 port wasn't the best, because there were some missing graphics effects (mainly in the cutscenes) and the game was already way more demanding compared to other PS4 ports, but performance was still acceptable even without any frame generation. My 7800X3D was limiting my framerate at around 190-240fps, while in TLOU2 it starts limiting at 80fps, and if I don't cap the framerate at 60fps I see stuttering (from time to time, when the game is decompressing data in the background). DLSS FGx2 can help in CPU-limited situations, but for some strange reason DLSS FG only adds 20fps in this game, which means the game is internally rendering at only half the output framerate :p. I will try disabling ReBAR and see if that helps in CPU-limited locations.
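To make the frame-generation arithmetic explicit (a minimal sketch using the rough numbers above; the figures are illustrative, not measurements):

```cpp
// With 2x frame generation, every other displayed frame is generated,
// so the internally rendered framerate is half the output framerate.
#include <cstdio>

int main() {
    const double fps_without_fg = 80.0;  // observed with FG off
    const double fps_with_fg    = 100.0; // observed with FG x2 on ("+20fps")
    const double rendered_fps   = fps_with_fg / 2.0;
    std::printf("FG on: ~%.0f fps actually rendered (was %.0f with FG off)\n",
                rendered_fps, fps_without_fg);
    // So enabling FG cost ~30 fps of base performance in this CPU-limited spot.
}
```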

1440p max settings with just DLSS Quality.


u4-2025-04-07-01-18-50-886.jpg


u4-2025-04-07-01-15-04-831.jpg


CPU limited at 192fps

u4-2025-04-07-01-18-12-648.jpg


u4-2025-04-07-01-23-28-468.jpg


u4-2025-04-07-01-23-44-333.jpg
 
Last edited:

Stuart360

Member
The game is actually easy to run, at 60fps anyway. My 3070 is hanging around 50% usage at 1080p max settings. It runs that well in the first part of the game anyway, in the town; not sure about the more open areas later.
It's a bit easier than the original game's remaster, for me anyway.

The only thing I don't like is how they changed the shader compilation to happen in-game as you're playing. It blasts my CPU and can cause frame drops, although only for a few seconds.

I think I would prefer the compilation to happen before the game, like in the first game. Although that game did have the longest shader compilation run I have ever seen, like 30 mins for me lol. In fact it was so long that by the time it got to about 90% done, my CPU was so hot that it was thermal throttling lol.
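For context, the usual way engines hide this cost is to compile PSOs on worker threads as new assets stream in, so the render thread only ever looks up a finished pipeline (and draws with a fallback until it's ready). A minimal sketch with invented names; certainly not ND's actual code:

```cpp
// Toy async PSO cache: the streaming system requests compiles, worker
// threads do the expensive driver work, and the render thread polls
// without ever blocking on a compile.
#include <chrono>
#include <future>
#include <mutex>
#include <string>
#include <unordered_map>

struct PSO { /* compiled pipeline state would live here */ };

PSO CompilePSO(const std::string& shaderKey) {
    // Stand-in for the expensive driver-side compile causing the hitches.
    (void)shaderKey;
    return PSO{};
}

class PSOCache {
    std::mutex m;
    std::unordered_map<std::string, std::shared_future<PSO>> cache;
public:
    // Called by the streaming system when new material data arrives.
    void RequestAsync(const std::string& key) {
        std::lock_guard<std::mutex> g(m);
        if (cache.count(key)) return;
        cache.emplace(key,
                      std::async(std::launch::async, CompilePSO, key).share());
    }
    // Called by the render thread: returns nullptr instead of stalling if
    // the compile isn't done yet (draw with a fallback PSO meanwhile).
    const PSO* TryGet(const std::string& key) {
        std::lock_guard<std::mutex> g(m);
        auto it = cache.find(key);
        if (it == cache.end()) return nullptr;
        if (it->second.wait_for(std::chrono::seconds(0)) !=
            std::future_status::ready) return nullptr;
        return &it->second.get();
    }
};
```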
 
It's based on streaming. So whatever new stuff is streamed in, then the PSO compiles start.
Actually, the system lends itself very well to DirectStorage. We're just using CPU decompression, without GPU decompression.
Certainly using async compute with the PS5, where you know exactly what the hardware is and what things pair well together, and there's less driver in the middle, we've always found it to be a lot more beneficial on consoles than it is on PC, unfortunately.
One thing is the spin locking. That is cheap on the console, but on Windows, that can be very problematic for performance.
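For anyone wondering about that last point: a spin lock is cheap when the lock holder can't be preempted mid-hold (a console with dedicated cores and a thin scheduler), but on Windows a spinning thread can burn its whole time slice waiting on a holder the OS has preempted. A minimal sketch of the pattern and the usual mitigation, purely illustrative:

```cpp
// Toy spin lock. On console, busy-waiting is cheap because the holder
// releases quickly and isn't descheduled under you. On Windows the same
// loop can waste an entire quantum, so you yield back to the scheduler.
#include <atomic>
#include <thread>

struct SpinLock {
    std::atomic_flag flag = ATOMIC_FLAG_INIT;

    void lock() {
        while (flag.test_and_set(std::memory_order_acquire)) {
            // Pure spinning here is the console-style behaviour; yielding
            // is the common Windows-friendly fallback when contended.
            std::this_thread::yield();
        }
    }
    void unlock() { flag.clear(std::memory_order_release); }
};
```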




It's an interesting article that discusses the strengths and weaknesses of the engine on consoles and PC.
PS5's strengths: single shader/material permutations, shared memory pool, faster async compute, cheaper CPU multi-threading cost (with Kraken decompression + DMA, Tempest 3D Audio).
https://www.eurogamer.net/digitalfoundry-2025-the-last-of-us-part-2-tech-interview
Of course. They did an interview with the team who ported Spider-Man saying similar things. The reality is that the main bottleneck this generation is I/O, exactly as predicted by Cerny, and the PS5 punches way above its weight because it was designed around that (and the unified memory surely helps a lot vs PC). But they are so biased, and have been trying to twist their benchmarks for 4 years now (because they were laughing at Cerny, who "overengineered" the PS5 I/O), to blame anything (bad port, GPU, etc.) but the elephant in the room: the decades-old, obsolete APIs/hardware their beloved $3000 PCs get for their I/O pipeline.
 
Last edited:

Bojji

Member
Of course. They did an interview with the team who ported Spider-Man saying similar things. The reality is that the main bottleneck this generation is I/O, exactly as predicted by Cerny, and the PS5 punches way above its weight because it was designed around that (and the unified memory surely helps a lot vs PC). But they are so biased, and have been trying to twist their benchmarks for 4 years now (because they were laughing at Cerny, who "overengineered" the PS5 I/O), to blame anything (bad port, GPU, etc.) but the elephant in the room: the decades-old, obsolete APIs/hardware their beloved $3000 PCs get for their I/O pipeline.
Yet, the best looking games this gen are not bottlenecked by I/O on PC. Majority of games on PS5 don't even use hardware decompression properly (even some sony games).

Last of us 2 was designed for PS4 with HDD (less than 100MB/s) and 1.6GHz Jaguar.

Holy cow, I disabled Reflex and my framerate improved a lot; the game is no longer CPU limited. I need to do more testing to see how this game performs in the later levels.

Reflex enabled - 7800X3D CPU

1.jpg


Reflex disabled

2.jpg

So it's not related to rebar but to bugged reflex?
 

yamaci17

Gold Member
Yet, the best looking games this gen are not bottlenecked by I/O on PC. Majority of games on PS5 don't even use hardware decompression properly (even some sony games).

Last of us 2 was designed for PS4 with HDD (less than 100MB/s) and 1.6GHz Jaguar.



So it's not related to rebar but to bugged reflex?
it is possible it has nothing to do with any of it

it possibly happens at times and a restart of the game or changing settings fixes it. just my guess
 
Last edited:

Gaiff

SBI’s Resident Gaslighter
Yet, the best looking games this gen are not bottlenecked by I/O on PC. Majority of games on PS5 don't even use hardware decompression properly (even some sony games).

Last of us 2 was designed for PS4 with HDD (less than 100MB/s) and 1.6GHz Jaguar.



So it's not related to rebar but to bugged reflex?

it is possible it has nothing to do with any of it

it possibly happens at times and a restart of the game or changing settings fixes it. just my guess
Does turning off ReBar disable Reflex?
 
Yet, the best looking games this gen are not bottlenecked by I/O on PC. Majority of games on PS5 don't even use hardware decompression properly (even some sony games).

Last of us 2 was designed for PS4 with HDD (less than 100MB/s) and 1.6GHz Jaguar.



So it's not related to rebar but to bugged reflex?
After further testing, I think the problem is more complicated than just Reflex or ReBAR (BTW, I tried turning off ReBAR in the BIOS, but I haven't noticed any difference). I think that by changing the graphics settings (it does not necessarily have to be Reflex) the game can kill background processes that consume hardware resources, and then allocate those resources properly, resulting in a huge performance boost.

Usually DLSS Quality brings a big performance boost in this game, but when the game starts doing something in the background (shader precompilation?) DLSS only boosts the framerate by 5fps :p

The game can also sometimes push my 7800X3D into thermal throttling, and not even Cinebench stresses my CPU that much.
 

Gaiff

SBI’s Resident Gaslighter
After further testing, I think the problem is more complicated than just Reflex or ReBAR (BTW, I tried turning off ReBAR in the BIOS, but I haven't noticed any difference). I think that by changing the graphics settings (it does not necessarily have to be Reflex) the game can kill background processes that consume hardware resources, and then allocate those resources properly, resulting in a huge performance boost.

Usually DLSS Quality brings a big performance boost in this game, but when the game starts doing something in the background (shader precompilation?) DLSS only boosts the framerate by 5fps :p

The game can also sometimes push my 7800X3D into thermal throttling, and not even Cinebench stresses my CPU that much.
So same problem as AC Shadows where performance inexplicably tanks and recovers if you change the settings or relaunch the game?
 

yamaci17

Gold Member
Does turning off ReBar disable Reflex?
no, they have no relation

I disabled Resizable BAR long ago because it increases VRAM usage and causes performance problems in Cyberpunk and Forza Horizon 5 with VRAM-intensive settings. I'm not going to bother with that feature just to get a 5% performance boost; only when and if I get a 16 GB GPU would I enable it. I'd actually be scared of that setting even with a 12 GB GPU (considering how VRAM-intensive games are these days). Resizable BAR can increase VRAM usage from 0.7 GB to 1.2 GB in my tests, which is not cool. Even in games that have dynamic texture streaming, that would actually mean 1 GB less texture streaming budget just to get a 5% performance boost (so you would get the 5% boost but have poor textures, which you may or may not notice, of course).

The only games where I've benefited from the 5% performance boost without VRAM-related issues were AC Valhalla and RDR 2, games where I did not even need any performance boost to begin with.

That is probably why it is supported in so few games: it can be trouble for 8-12 GB GPUs. People with 16-24 GB GPUs should give it a try in every game to see if it actually increases performance or not.
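For what it's worth, this is also roughly how an app can tell whether ReBAR is actually active: with ReBAR on, the driver exposes device-local VRAM that is also CPU-visible in a heap bigger than the legacy 256 MiB BAR window. A sketch of that common heuristic, assuming a Vulkan context (nothing specific to this game):

```cpp
// Heuristic ReBAR check: look for a DEVICE_LOCAL + HOST_VISIBLE memory
// type whose heap is larger than the classic 256 MiB BAR aperture.
#include <vulkan/vulkan.h>

bool HasResizableBar(VkPhysicalDevice gpu) {
    VkPhysicalDeviceMemoryProperties props;
    vkGetPhysicalDeviceMemoryProperties(gpu, &props);
    const VkMemoryPropertyFlags wanted =
        VK_MEMORY_PROPERTY_DEVICE_LOCAL_BIT |
        VK_MEMORY_PROPERTY_HOST_VISIBLE_BIT;
    for (uint32_t i = 0; i < props.memoryTypeCount; ++i) {
        const VkMemoryType& type = props.memoryTypes[i];
        if ((type.propertyFlags & wanted) == wanted &&
            props.memoryHeaps[type.heapIndex].size > 256ull * 1024 * 1024)
            return true; // CPU can map more VRAM than the legacy window
    }
    return false;
}
```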
 

Gaiff

SBI’s Resident Gaslighter
no, they have no relation

I disabled Resizable BAR long ago because it increases VRAM usage and causes performance problems in Cyberpunk and Forza Horizon 5 with VRAM-intensive settings. I'm not going to bother with that feature just to get a 5% performance boost; only when and if I get a 16 GB GPU would I enable it. I'd actually be scared of that setting even with a 12 GB GPU (considering how VRAM-intensive games are these days). Resizable BAR can increase VRAM usage from 0.7 GB to 1.2 GB in my tests, which is not cool. Even in games that have dynamic texture streaming, that would actually mean 1 GB less texture streaming budget just to get a 5% performance boost (so you would get the 5% boost but have poor textures, which you may or may not notice, of course).

The only games where I've benefited from the 5% performance boost without VRAM-related issues were AC Valhalla and RDR 2, games where I did not even need any performance boost to begin with.

That is probably why it is supported in so few games: it can be trouble for 8-12 GB GPUs. People with 16-24 GB GPUs should give it a try in every game to see if it actually increases performance or not.
Yeah, I know they have no relation. I was just wondering if maybe there was some weirdness in this game that causes Reflex to turn off when ReBar is disabled.

It seems mostly fine, but there are some strange bugs that need addressing.
 

yamaci17

Gold Member
Yeah, I know they have no relation. I was just wondering if maybe there was some weirdness in this game that causes Reflex to turn off when ReBar is disabled.

It seems mostly fine, but there are some strange bugs that need addressing.
I wish it didn't have any of these issues
Sad Cat GIF
 
So same problem as AC Shadows where performance inexplicably tanks and recovers if you change the settings or relaunch the game?
I have no idea. I'm playing AC: Origins at the moment and it will be a while before I play AC Shadows.

All I can say is that performance fluctuates a lot, even in the same location, and even if you don't change any settings.

a2.jpg


A few minutes later

a1.jpg



And if you move the camera in the other direction, the frame rate jumps from 87 fps to 168 fps.


5.jpg


This game often dips into 80fps territory for whatever reason. It's impossible to tell whether that's a CPU bottleneck, shader precompilation, or maybe a GPU bottleneck. Sometimes performance can double if you change something in the graphics settings.

tlou-ii-2025-04-07-15-02-52-391.jpg



The only way to play this game on my PC without huge fps fluctuations is to set the frame rate cap to 60fps, or maybe 80fps.

But even at 80fps, the game is much more responsive than the PS4 Pro version, so overall the PC port offers a much better experience compared to the PS4. However, I recommend tweaking the gamepad settings, as the defaults are just terrible and I struggled to aim at anything. With these settings I could aim much more easily:


uOnhUlr.jpeg
 
Last edited:
The camera movement and input lag are awful. The game feels like wading through treacle.

120FPS does not feel any different from 60FPS.

Based on the system requirements it should run at 1440p/60fps.

Sharpening forced by ND is horrible; the game needs a mod to remove it:

1.png


I see it in the PS4 version played on PS5 as well (but they amplified it in the PS5/PC port)...
I've got Sharpening set to 0 but the image is already way too oversharpened (particularly with the DLSS Transformer model). Image quality looks good otherwise, but ND need to patch this.
 

AFBT88

Member
The game is actually easy to run, at 60fps anyway. My 3070 is hanging around 50% usage at 1080p max settings. It runs that well in the first part of the game anyway, in the town; not sure about the more open areas later.
It's a bit easier than the original game's remaster, for me anyway.

The only thing I don't like is how they changed the shader compilation to happen in-game as you're playing. It blasts my CPU and can cause frame drops, although only for a few seconds.

I think I would prefer the compilation to happen before the game, like in the first game. Although that game did have the longest shader compilation run I have ever seen, like 30 mins for me lol. In fact it was so long that by the time it got to about 90% done, my CPU was so hot that it was thermal throttling lol.
What about the opening where you get out of the garage with Tommy? Do you get 60+ at max with a 3070? Volumetric Effects at Very High will make your FPS go down to the low 40s in that opening section, even at 1080p. Put that setting to Medium and you get 75+ FPS.
 

keefged4

Member
I'm giving up for now and waiting for a patch; the game crashes constantly the longer you play it. It's unacceptable in its current state. I have just under 6 hours played now and it's crashed more than 10 times. Joke.

EDIT - the latest update and reverting the Nvidia drivers to 566.36 have resolved the crashing. Not sure which one solved it, or if it was both working together.
 
Last edited:

dgrdsv

Member
This looks quite reasonable to me?
The 3060 not being able to run what is effectively a PS4 game at 1440p with 2X upscaling (so a 1080p base) at 60 fps isn't very "reasonable" though.
The 3060 should be very close to the PS5's GPU in performance when used properly.
 
Last edited:


To the person asking why actual coding experience MATTERS, here's your proof.

This is actual technical analysis, not DF bullshit.

NXgamer has pointed out that you can reduce the resolution and detail and get worse performance than at max settings. This needs to get patched. For now, I will play other games and plan to return to TLOU2 when I upgrade my monitor to QD-OLED (probably the PG27UCDM).
 

Aaron07088

Neo Member
What's going on in this game with CPU frequency? Sometimes my CPU just runs at a low frequency, like 700-1500MHz (Ryzen 5 5600). When I enable FG it runs at a high frequency, but without FG it's just terrible. And don't blame PSO compilation; I know how it works in this game, but it's not about that: I waited like 3-4 minutes for the PSOs and still saw low frequency.
 
BTW, the Uncharted collection on PC is also based on the PS5 version, yet it runs a lot faster on my PC (190-240fps, as shown in my screenshots) and doesn't stutter or have issues with GPU or CPU utilisation. TLOU2 has a similar (if not the same) engine, yet it has problems on PC.
 
Last edited:

Stuart360

Member
What about the opening where you get out of the garage with Tommy? Do you get 60+ at max with a 3070? Volumetric Effects at Very High will make your FPS go down to the low 40s in that opening section, even at 1080p. Put that setting to Medium and you get 75+ FPS.
It actually hangs in the 80-100fps range if I turn off vsync. I play on a 60Hz TV so there's no point.
And yeah, the part with Tommy ran fine (the part where you exit the garage and get on the horses, right?). I turned off Nvidia Reflex though, which apparently helps a lot with CPU usage in this game (and it definitely lowered my CPU usage). In fact I haven't even noticed the shader compilation affecting the framerate now, after turning off Reflex.
The only setting that isn't on max is LOD, which is one notch down, as apparently you can save 10+fps by doing that with no visible loss in quality. So I'm not quite maxed I suppose, but everything else is on max.
It really isn't as hard to run as people are making out. Unless, again, those open areas are way worse, which I haven't got to yet (I've only been playing about 90 mins or so, just testing the game really).
 
Last edited:

Gaiff

SBI’s Resident Gaslighter


To the person asking why actual coding experience MATTERS, here's your proof.

This is actual technical analysis, not DF bullshit.

Much better analysis, and I'm glad he confirmed the ReBar bug. I also saw a poster on Beyond3D ask Alex about it and point to potential mistakes in DF's video, and Alex got defensive. The poster also asked him a few times about ReBar and Alex just ignored it. NxGamer, on the other hand, spends 80% of the video talking about it because there's obviously something wrong with it on NVIDIA cards.

He speculates that the game isn't data-streaming or I/O bound, but memory bound, because there's some fuckery going on with the allocation on PC that potentially causes stalls. He also says he's guessing, so he's not 100% certain either, but that's certainly more thorough than just going with the game being GPU-bound.

Also, I wouldn't even say coding experience matters for that. Not being a dumbass and having an open mind matters. Him confirming the ReBar stuff makes it blatant there's something going on with the memory, because that's literally what ReBar does (it's in the name).

He also flat out admits the team could have done a better job and that there are issues. I'm looking forward to his full review because it's obvious Alex won't do it again. I'm slightly annoyed at this because he needs to; otherwise, his misinformation becomes disinformation, and that's not acceptable.
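If the allocation theory is right, the textbook mitigation is the one engines have used forever: grab one big block up front and bump-allocate out of it during the frame, so nothing hits the OS heap mid-frame. A generic sketch of that pattern (illustrative only, nothing to do with ND's actual allocator):

```cpp
// Per-frame linear ("bump") arena: one malloc at startup, O(1) pointer
// bumps during the frame, one O(1) reset per frame. No mid-frame OS
// heap traffic, so no allocator-induced stalls.
#include <cassert>
#include <cstddef>
#include <cstdlib>

class FrameArena {
    std::byte* base;
    std::size_t capacity;
    std::size_t offset = 0;
public:
    explicit FrameArena(std::size_t bytes)
        : base(static_cast<std::byte*>(std::malloc(bytes))), capacity(bytes) {}
    ~FrameArena() { std::free(base); }

    // align must be a power of two.
    void* alloc(std::size_t bytes, std::size_t align = 16) {
        std::size_t p = (offset + align - 1) & ~(align - 1); // round up
        assert(p + bytes <= capacity && "arena exhausted");
        offset = p + bytes;
        return base + p;
    }
    void reset() { offset = 0; } // call once per frame
};
```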

BTW, the Uncharted collection on PC is also based on the PS5 version, yet it runs a lot faster on my PC (190-240fps, as shown in my screenshots) and doesn't stutter or have issues with GPU or CPU utilisation. TLOU2 has a similar (if not the same) engine, yet it has problems on PC.
Yeah, but TLOUII is also a lot more demanding. Uncharted 4 runs at 100fps+ in Performance Mode on PS5. TLOUII runs at ~70-80fps.
 
Last edited:

AFBT88

Member
It actually hangs in the 80-100fps range if I turn off vsync. I play on a 60Hz TV so there's no point.
And yeah, the part with Tommy ran fine (the part where you exit the garage and get on the horses, right?). I turned off Nvidia Reflex though, which apparently helps a lot with CPU usage in this game (and it definitely lowered my CPU usage). In fact I haven't even noticed the shader compilation affecting the framerate now, after turning off Reflex.
The only setting that isn't on max is LOD, which is one notch down, as apparently you can save 10+fps by doing that with no visible loss in quality. So I'm not quite maxed I suppose, but everything else is on max.
It really isn't as hard to run as people are making out. Unless, again, those open areas are way worse, which I haven't got to yet (I've only been playing about 90 mins or so, just testing the game really).
Can you make a video of that specific location run? I'm sorry, but I'm not buying that you're getting 80+ at max settings and without upscaling on a 3070. I've got a second system with a 3080/7800X3D, and my 3080 gets fully maxed out in that scene at native 1080p and drops to 55-57fps. Again, it only happened there, as the volumetric fog is very intense in that spot, but the same thing happened with my 4080S, and any other benchmark I've seen has been the same way. Lowering that setting alone makes the game run at 80+ in that area, and usually at 120+ anywhere else.
 

Gaiff

SBI’s Resident Gaslighter
Can you make a video of that specific location run? I'm sorry, but I'm not buying that you're getting 80+ at max settings and without upscaling on a 3070. I've got a second system with a 3080/7800X3D, and my 3080 gets fully maxed out in that scene at native 1080p and drops to 55-57fps. Again, it only happened there, as the volumetric fog is very intense in that spot, but the same thing happened with my 4080S, and any other benchmark I've seen has been the same way. Lowering that setting alone makes the game run at 80+ in that area, and usually at 120+ anywhere else.
ReBar on or off?
 

Lysandros

Member
Much better analysis, and I'm glad he confirmed the ReBar bug. I also saw a poster on Beyond3D ask Alex about it and point to potential mistakes in DF's video, and Alex got defensive. The poster also asked him a few times about ReBar and Alex just ignored it. NxGamer, on the other hand, spends 80% of the video talking about it because there's obviously something wrong with it on NVIDIA cards.

He speculates that the game isn't data-streaming or I/O bound, but memory bound, because there's some fuckery going on with the allocation on PC that potentially causes stalls. He also says he's guessing, so he's not 100% certain either, but that's certainly more thorough than just going with the game being GPU-bound.

Also, I wouldn't even say coding experience matters for that. Not being a dumbass and having an open mind matters. Him confirming the ReBar stuff makes it blatant there's something going on with the memory, because that's literally what ReBar does (it's in the name).

He also flat out admits the team could have done a better job and that there are issues. I'm looking forward to his full review because it's obvious Alex won't do it again. I'm slightly annoyed at this because he needs to; otherwise, his misinformation becomes disinformation, and that's not acceptable.


Yeah, but TLOUII is also a lot more demanding. Uncharted 4 runs at 100fps+ in Performance Mode on PS5. TLOUII runs at ~70-80fps.
Wow, Alex got defensive? Who would have guessed. :) To be fair, Dictator's disinformation book is quite thick; that's just another line to be added.
 
Yeah, but TLOUII is also a lot more demanding. Uncharted 4 runs at 100fps+ in Performance Mode on PS5. TLOUII runs at ~70-80fps.
Uncharted 4 on the PS5 runs at 100fps+, but according to ElAnalistaDeBits only at 1080p in the 120fps mode. That would explain why TLOU2 on the PS5 has much worse performance: it's running at a higher resolution.

fDddNlF.jpeg
Yt8Yf5Q.jpeg
 

YeulEmeralda

Linux User
The port is absolutely fine for people buying the game to play it. Runs like a champ. What reasonable system doesn’t get 60fps? I’m not seeing performance backlash outside of benchmarking conversations. Steam reviews are pretty clean
It seems that there is this agenda to make PC gaming look bad.
Like, bitch, please, can I mod games on PlayStation or disable eye cancer like depth of field? No, so fuck off.
 