
Does PSSR need to change to a transformer model?

yamaci17

Member
It should be possible with the DirectStorage API and NVMe SSDs (PCIe 3.0/4.0).

I'm not sure why they don't do it...

PCIe 4.0 mobos/SSDs are dirt cheap these days.
well I don't know about that. final fantasy 7 rebirth supposedly uses directstorage, yet I see no benefit at all: you have to play with horribly low textures just to REDUCE VRAM-related stutters, not even get rid of them.

also pcie 4.0 won't matter for the most popular 8 GB card (the 4060), since it only has a pcie 4.0 x8 link, meaning it has the same bandwidth available to it as pcie 3.0 x16 cards.
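The x8-link point can be sanity-checked with quick arithmetic. A minimal sketch, using the nominal per-lane rates from the PCIe spec (8 GT/s for gen3, 16 GT/s for gen4, 128b/130b encoding) and ignoring protocol overhead:

```python
# back-of-envelope PCIe bandwidth: nominal spec rates, no protocol overhead
def pcie_bandwidth_gbs(transfer_rate_gt, lanes):
    # 128b/130b encoding: 128 payload bits per 130 transferred bits
    return transfer_rate_gt * (128 / 130) / 8 * lanes  # GB/s

gen3_x16 = pcie_bandwidth_gbs(8, 16)   # an older x16 card in a gen3 slot
gen4_x8 = pcie_bandwidth_gbs(16, 8)    # the 4060's electrical x8 link

print(f"pcie 3.0 x16: {gen3_x16:.2f} GB/s")  # ~15.75 GB/s
print(f"pcie 4.0 x8:  {gen4_x8:.2f} GB/s")   # ~15.75 GB/s, identical
```

So on a gen4 board the 4060's x8 link matches an old gen3 x16 card exactly; drop it into a gen3 slot and it halves.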




while alex tried to portray the "medium" textures as somewhat decent (though still stuttering), they actually look horrible as well



 
also pcie 4.0 won't matter for the most popular 8 GB card (the 4060), since it only has a pcie 4.0 x8 link, meaning it has the same bandwidth available to it as pcie 3.0 x16 cards.
It does matter, because it would be slower with PCIe 3.0.

What makes you think 16 GB/s is not fast enough to fill the 8GB framebuffer in half a second? (That's assuming uncompressed textures; with GDeflate it should take even less.)

To me it seems this game lacks optimization... does it perform better on PS5?
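For reference, the half-second figure is just payload divided by link rate. A quick sketch (the 2:1 GDeflate compression ratio is an assumed, illustrative number):

```python
# time to push a full 8 GB of texture data over a 16 GB/s PCIe link
LINK_GB_PER_S = 16.0
VRAM_GB = 8.0

t_uncompressed = VRAM_GB / LINK_GB_PER_S      # raw bytes on the bus
t_gdeflate = (VRAM_GB / 2.0) / LINK_GB_PER_S  # assumed 2:1 ratio

print(f"uncompressed: {t_uncompressed:.2f} s")  # 0.50 s
print(f"2:1 GDeflate: {t_gdeflate:.2f} s")      # 0.25 s
```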
 

dgrdsv

Member
You didn't clarify whether Nvidia utilized a die shrink or not.

A die shrink justifies a price reduction... especially 20-25 years ago, when wafers were dirt cheap compared to today.

I'm not sure if OG XBOX had die shrinks, that's why I'm asking.
I honestly don't remember.
And it's not an Nvidia issue anyway - if MS had wanted a conditional price reduction on the parts they were buying from Nvidia, they should've put that into the contract.
If they didn't, why would any company decline some additional profits when it can get them?
 
well I don't know about that. final fantasy 7 rebirth supposedly uses directstorage, yet I see no benefit at all: you have to play with horribly low textures just to REDUCE VRAM-related stutters, not even get rid of them.

also pcie 4.0 won't matter for the most popular 8 GB card (the 4060), since it only has a pcie 4.0 x8 link, meaning it has the same bandwidth available to it as pcie 3.0 x16 cards.




while alex tried to portray the "medium" textures as somewhat decent (though still stuttering), they actually look horrible as well




Isn't the game pretty much a locked 60fps on PS5 (apart from drops when tons of alpha effects are onscreen)? What texture settings are used on PS5 (and Pro)?
 

yamaci17

Member
To me it seems this game lacks optimization...

there you have it

the game runs fine on 12-16 GB cards, mostly. it needs "optimization" for 8 GB cards; it doesn't need any special attention or optimization for 12+ GB cards. that's the whole point. don't expect much from developers in this regard: 8 GB GPU users are already being tested to their limits (remember the last of us part 1 with its ps2 textures). it is clear that developers are struggling massively to optimize for 8 GB cards even at a 1080p buffer.

i don't think directstorage, pcie 4.0, or fast SSDs can remedy this issue. if they could, we wouldn't really see 24-32 GB 4090s and 5090s. at some point you have to have the VRAM to store the actual data. that is also why most pc games have extremely high RAM usage: even streaming data from SSD to RAM or VRAM is just not optimized on PC, it simply doesn't function properly. most developers instead rely on using a lot of RAM for data streaming. and that's fine (RAM is cheap and most people have moved on to 32 GB), but even relying on streaming from RAM to VRAM reduces performance immensely.

it's clear that directstorage is not achieving whatever it aims to achieve. otherwise this wouldn't happen:



stupid gpu only gives you the performance you're supposed to get from it with low textures. anything higher tanks performance because the pcie bus is fully saturated (I know because the exact same thing happened in ratchet and clank, and that game runs much better on pcie 4.0).

those medium textures also look horrible in spiderman 2, and the engine eventually decides to disregard your preference and load lower quality textures anyway. this is what the engine does when you actually use medium textures at a 1440p buffer with DLSS quality:



the only way to get consistent performance and somewhat ps4-looking textures is to play at 1080p with dlss quality and ray tracing disabled. and that's okay; that's the whole point I'm arguing. 8 GB GPUs are not meant for 4K output (since he thinks the 3060 ti should target 4K output...). when even nixxes can't make these GPUs work properly with 1440p output, there's no point arguing. they're 1080p output cards through and through.

Isn't the game pretty much a locked 60fps on PS5 (apart from drops when tons of alpha effects are onscreen)? What texture settings are used on PS5 (and Pro)?

I'm sure that specific texture won't look like that on a ps5 or ps5 pro. texture settings have nothing to do with actual performance: you either have enough VRAM for them or you don't. and if you have 10-12 GB of VRAM on desktop, you get smooth gameplay with high textures. the ps5 has 12.5 GB or so of memory to work with.
 
if they could, we wouldn't really see 24-32 GB 4090s and 5090s.
No current-gen game utilizes 24-32GB VRAM, not even the most unoptimized one.

These cards are made for AI first and foremost, not gaming (gaming is a bonus).
the only way to get consistent performance and somewhat ps4-looking textures is to play at 1080p with dlss quality and ray tracing disabled. and that's okay; that's the whole point I'm arguing. 8 GB GPUs are not meant for 4K output (since he thinks the 3060 ti should target 4K output...). when even nixxes can't make these GPUs work properly with 1440p output, there's no point arguing. they're 1080p output cards through and through.
I never said they should target 1440p.

Even an RTX 4070 (a 1440p GPU) shouldn't necessarily target native 1440p in DLSS-enabled games...

Native 720p upscaled to 1440p (DLSS4 Performance) yields pretty decent quality and it reduces VRAM pressure quite a bit.

x60 GPUs should render at native 540p and upscale to 1080p via DLSS4.

With DLSS4 there's no need for Quality preset (unlike in the DLSS3 era).
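The internal resolutions implied by those presets follow from the standard DLSS per-axis render-scale factors (Quality 2/3, Performance 1/2, Ultra Performance 1/3). A small sketch:

```python
# per-axis render-scale factors for the standard DLSS presets
PRESET_SCALE = {
    "Quality": 2 / 3,
    "Performance": 1 / 2,
    "Ultra Performance": 1 / 3,
}

def render_resolution(out_w, out_h, preset):
    """Internal resolution DLSS renders at before upscaling to out_w x out_h."""
    s = PRESET_SCALE[preset]
    return round(out_w * s), round(out_h * s)

print(render_resolution(2560, 1440, "Performance"))  # (1280, 720)
print(render_resolution(1920, 1080, "Performance"))  # (960, 540)
```

That's where the "native 720p to 1440p" and "native 540p to 1080p" figures above come from.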
I'm sure that specific texture won't look like that on a ps5 or ps5 pro. texture settings have nothing to do with actual performance: you either have enough VRAM for them or you don't. and if you have 10-12 GB of VRAM on desktop, you get smooth gameplay with high textures. the ps5 has 12.5 GB or so of memory to work with.
The PS5 has 12.5GB of unified memory available to both the CPU and GPU. There's no way the GPU alone utilizes all 12.5GB - more like 8-10GB at most (the CPU also needs memory for its own data).

Also, keep in mind the OG PS5 does not support AI upscaling (let alone techniques such as ray reconstruction), which means the framebuffer will consume more memory.
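To give a feel for why the framebuffer's share scales with internal resolution, here is a rough estimate; the five-target, 8-bytes-per-pixel G-buffer layout below is an assumption for illustration, not any particular engine's real layout:

```python
# rough G-buffer memory vs. internal resolution (illustrative layout:
# five RGBA16F render targets at 8 bytes per pixel each)
BYTES_PER_PIXEL = 8 * 5

def gbuffer_mb(width, height):
    return width * height * BYTES_PER_PIXEL / (1024 ** 2)

print(f"native 4K:      {gbuffer_mb(3840, 2160):.0f} MB")  # ~316 MB
print(f"1080p internal: {gbuffer_mb(1920, 1080):.0f} MB")  # ~79 MB
```

Whatever the real layout, the cost is linear in pixel count, so rendering natively at 4K costs 4x the framebuffer memory of a 1080p internal resolution.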

Regarding DirectStorage, it was advertised by Microsoft to reduce VRAM usage. Same with Sony's proprietary SSD API.

If consoles didn't have this special API and they still utilized an HDD, they would need 32GB of RAM for caching (there's a reason the PS4 had an enormous 8GB pool back in 2013).

Believe it or not, DirectStorage makes a huge difference in reducing bottlenecks when properly optimized (remember when DX12 first appeared 10 years ago? It performed worse than DX11 at first, but it's clearly the superior API):

Ideally you want the data (textures etc.) to go straight from the NVMe SSD to the GPU via PCIe, without using CPU RAM as temporary storage/cache (and without taxing the CPU with decompression).

Imagine having a car and wanting to go from point A to point B: you want to follow the shortest route, not the longest one (more gasoline consumption)!
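The route analogy can be made concrete by counting how many times each byte crosses a bus on the two paths; a toy model (the copy counts are idealized assumptions, not measurements):

```python
# legacy path: SSD -> system RAM (staging) -> CPU decompress -> VRAM
def legacy_bytes_moved(payload_gb):
    ssd_to_ram = payload_gb   # first hop over PCIe into RAM
    ram_to_vram = payload_gb  # second hop over PCIe into VRAM
    return ssd_to_ram + ram_to_vram

# DirectStorage-style path: SSD -> VRAM, decompressed on the GPU
def direct_bytes_moved(payload_gb):
    return payload_gb

print(legacy_bytes_moved(2.0))  # 4.0 GB of bus traffic
print(direct_bytes_moved(2.0))  # 2.0 GB of bus traffic
```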
 

yamaci17

Member
No current-gen game utilizes 24-32GB VRAM, not even the most unoptimized one.

These cards are made for AI first and foremost, not gaming (that's a bonus).

I never said they should target 1440p.
i never said you did. the other guy did; my answer was to him, not to you

1080p is the output resolution, from native 540p. How is that acceptable on a 3060ti? This is Switch-level resolution here. On a desktop GPU the output resolution should be min 4K, not 1080p!
 

yamaci17

Member

The PS5 has 12.5GB of unified memory available to both the CPU and GPU. There's no way the GPU alone utilizes all 12.5GB - more like 8-10GB at most (the CPU also needs memory for its own data).

Also, keep in mind the OG PS5 does not support AI upscaling (let alone techniques such as ray reconstruction), which means the framebuffer will consume more memory.

Regarding DirectStorage, it was advertised by Microsoft to reduce VRAM usage. Same with Sony's proprietary SSD API.

If consoles didn't have this special API and they still utilized an HDD, they would need 32GB of RAM for caching (there's a reason the PS4 had an enormous 8GB pool back in 2013).

Believe it or not, DirectStorage makes a huge difference in reducing bottlenecks when properly optimized (remember when DX12 first appeared 10 years ago? It performed worse than DX11 at first, but it's clearly the superior API):

Ideally you want the data (textures etc.) to go straight from the NVMe SSD to the GPU via PCIe, without using CPU RAM as temporary storage/cache (and without taxing the CPU with decompression).

Imagine having a car and wanting to go from point A to point B: you want to follow the shortest route, not the longest one (more gasoline consumption)!
for all the other points: if and when directstorage can help me play with decent-looking textures, without stutters or huge performance slowdowns, in spiderman 2 and final fantasy 7 rebirth at 1440p output, just let me know

you can use dlss ultra performance if you want; the VRAM reduction coming from upscaling is not enough to alleviate the issue in these games. no matter what upscaling preset you choose, you have to play spiderman 2 and final fantasy 7 rebirth with low textures and accept horrible looking textures at 1440p output. and this solution already worked in the past, before directstorage was even a thing, so it is clear that directstorage is not doing anything useful for these games. and the funny part is that both games do indeed use directstorage.



just look at this video and tell me how directstorage helps. oh, you're going to say it's a broken port. so why are we even discussing anything then? if more than half the games released in 2025 are broken for 8 GB GPUs at 1440p output with decent textures, then it has become the new normal. you cannot get smooth, high performance with decent textures at 1440p output with 8 GB VRAM anymore in recent games, upscaling or not. there's no point using an 8 GB card at 1440p output if you have to use low quality textures. at least at 1080p output you get medium textures or whatever.

I can assure you the situation above never happens on PS5. and guess what, it uses ray tracing at a 4K output (internal resolution averages around 1080p). meanwhile I cannot even use medium textures, which still look worse than the ps5's, at 1440p output with dlss performance. and yes, the ray tracing settings match the ps5 (actually lower than the ps5, as the ps5 uses 7 or 8 as its ray tracing object range).

then again, you can only stream so much data. you need a decent baseline to work with, and 8 GB is not it.
 