
Digital Foundry: PlayStation 5 Pro Review: The Digital Foundry Verdict

PandaOk

Member
Lmao. GoW literally ran at 4KCB, which is 2x the pixels, plus the reconstruction cost to get to 4K.

The 60 fps mode can hit 60 fps regularly; I played it for over 200 hours. It drops because of CPU bottlenecks. If the GPU were the bottleneck, it would never hit 60 fps, or run at 4KCB.
You really are completely clueless when it comes to hardware performance. It's funny watching you embarrass yourself.
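
For reference, a minimal sketch of the pixel math behind that claim, assuming checkerboard rendering shades about half the pixels of the target frame and that the 2x comparison is against a 1080p base:

```python
# Rough pixel math for the "4KCB is 2x the pixels" claim.
# Assumption: checkerboard 4K shades ~half the pixels of native 4K per frame.
native_1080p = 1920 * 1080        # 2,073,600 pixels
native_4k = 3840 * 2160           # 8,294,400 pixels
checkerboard_4k = native_4k // 2  # ~4,147,200 pixels shaded per frame

print(f"4KCB vs 1080p: {checkerboard_4k / native_1080p:.2f}x the pixels")  # ~2.00x
```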
 

SlimySnake

Flashless at the Golden Globes
IMO, even 4K DLSS performance looks incredible when compared to PS5 games running in performance mode. I recommend viewing these screenshots in full size by opening them in a new tab.

PS5 performance mode

25d10d16247e97b0712c.jpg


PC 4K DLSS Performance

b1-Win64-Shipping-2024-09-01-00-25-53-709.jpg



b1-Win64-Shipping-2024-09-01-00-07-52-582.jpg


I think console gamers would be really happy with 4K DLSS performance image quality because it's a massive improvement over the standard upscaling consoles use. If PSSR is similar in quality to DLSS, then it's a massive win for PS5P over the standard model.

You said the "DLSS performance" image looks too soft. I think DLSS/DLAA by itself will always look soft, just like TAA. To get a sharp image, you need to add a sharpening filter (I recommend ReShade's CAS and LumaSharpen filters). Both DLSS Q and P modes look razor-sharp with the use of sharpening filters.

DLSS performance

Horizon-DLSSP.jpg


DLSS Quality

Horizon-DLSSQ.jpg


4K DLSS Performance

SHR-DLSSP.jpg


DLSS Quality


SHR-DLSSQ.jpg


DLSS Performance

cyberpunk-DLSSP.jpg


DLSS Quality

cyberpunk-DLSSQ.jpg


DLSS Performance

BMW-DLSSP.jpg


DLSS Quality

BMW-DLSSQ.jpg


On the static image, both DLSS Q and P look like a 4K image. The biggest difference between the two is motion clarity. If something is moving, you can tell that the DLSS performance image has more artifacts around hair or leaves on a tree.
Compared to PS5 games, yes, it looks great. But I was comparing it to 4K DLSS Quality. As soon as I turn it on, I can tell the image has been downgraded. It's still clean, but it definitely doesn't look native 4K anymore. There is also some far-distance detail that doesn't resolve as well as it does with DLSS Quality.

Regardless, 4K PSSR Performance is what Sony should've designed this thing around. 864p is 1440p DLSS Balanced territory, which simply isn't good enough for 4K TVs. You can see from FF7, Stellar Blade and other games that target a base resolution of 1080p-1440p that PSSR performs way better.

Unfortunately, and this is something I said over and over again in the Pro leaks thread, a lot of these next-gen-only games drop to 720p on the base PS5 to maintain 60 fps, and there just isn't enough raw GPU power to get them to a 1080p base resolution.
 
A reviewer in my country said that Alan Wake 2 in his YouTube recordings looks like a native 864p image, but on his TV the image quality looks comparable to native 4K.

Gamingtech also said that his footage does not do the PS5 Pro justice.

 

Mr Moose

Member
He had an issue with my math on 720p vs 900p taking 45% more power. What's funny is that the difference in pixels is actually 56%, so it would take 56% more GPU power to go from 720p to 900p. The fact that we are seeing 864p, which sits somewhere in the middle, means they can't even go from 720p to 900p.

Dragon Age also runs at 864p on the Pro, though they do add RTAO, which has a minimal hit on PC, so they probably thought why not.

That figure keeps coming up, and it just screams lack of raw GPU power to me. The 30% upgrade in rasterization is far worse than Sony's own conservative 45% estimate. That means the GPU is performing half as well as it should. It's an XSX-caliber fuck-up by the engineering team at Sony: paying for all that silicon only to use half of the power available.
You know that isn't how it works.
05Mm54mFBkVdHt6nWNDBhVr-10..v1630529049.png
 

Topher

Identifies as young
I think it's pretty sad that pushing back and genuinely feeling that this is a mediocre upgrade for £830 in the UK counts as trolling.

Let me explain how I see it.

I was just giving you shit but you'll turn it into endless drama anyway. Pro ain't for you. That's all you have to say. That ain't trolling.

Now going from "not for me" to this.....

MWTTwV1.png



Astro Bot all over again.

Sarcastic Sam Smith GIF by Apple Music
 

SlimySnake

Flashless at the Golden Globes
It's simple - the equation of 'pixel count' or 'framerate' = 'flops' isn't a good metric. PS4 Pro is just an example of how that doesn't pan out.
Taking a broad view, the PS4 Pro real-world performance increase was somewhere in the 50%-70% range (and I can back that with real metrics from when I worked on the hw). Lower than that if you just looked at unmodified code running (like uncapped games boosting).
Basically - if you argue the PS5 Pro increase is underperforming the specs based on that, so was the PS4 Pro (arguably more so - but that's semantically dependent on which paper spec you pick).


There were also games on PS4 Pro that went from 1080p -> native 4K. That doesn't make the 'GPU power = 4x' any more than the games that did 1080p -> 1080p make it 1x. The methodology is flawed, period.

To get anywhere near a 2x increase - it took a substantial amount of pipeline massaging and Pro-specific optimisations - and then you still had to contend with things that just didn't scale with flops at all. Flipside - there were also parts of the pipeline that could see more than a 3x increase (well beyond what was on paper) - but this was use-case dependent, and there were no guarantees.
You can blame the design - but the whole point is that compute-scaling is a short-hand for a lot of things under the hood that need to happen to accommodate said compute throughput increase (whether in hw or sw). Just like one could point to the X1 underperforming its paper spec relative to the PS4 (or the PS4 overperforming - wherever you put your stick really). There was a lot more of a difference between the two GPUs than just the compute delta.

Ultimately my point was - with the PS4 Pro - the 'rationale' people use is to take 2.3x and then try to retroactively prove it (and discard what doesn't fit). With the PS5 Pro right now it's the same, just in reverse (taking Sony's 1.4x and looking for evidence to support it). Both are ultimately confirmation bias.
Whether the increase is actually hitting design targets or not - in the end, the market will speak to that.
I just don't agree because the data doesn't point to this. I have posted this many, many times, but here it goes again: the 10.6 TFLOPS 6600 XT vs the 16.2 TFLOPS 6800.

16.2/10.6 = ~53% more GPU power.
Results in a 54% average performance increase.

TtKOpWx.jpeg


Now, the PS4 Pro was indeed bottlenecked by the VRAM bandwidth; a mistake Cerny has made yet again. It was also bottlenecked by the Jaguar CPU, which is why we saw only SOTC and GoW offer 60 fps modes. But as I listed before, there were plenty of games where the GPU performed like it should.

The X1X was a much better-designed console, since MS gave it the bandwidth it needed. You had a 4x-5x increase in pixel counts as many 900p games went on to native 4K. The Jaguar CPU was the bottleneck and kept them from upping the framerate to 60 fps, but no one here would dare say that the ~4.5x increase in raw TFLOPS didn't result in 4-5x more performance. There are always going to be exceptions, but generally, more GPU power = more performance. The entire PC industry is built around this. We buy PCs based on these same metrics. If AMD released an RDNA 4 card tomorrow with 20 pre-inflated TFLOPS that performed like a 13 TFLOPS RDNA 2 card, we would be like wtf.

Nowadays, with inflated TFLOPS numbers, it's harder to compare, but Sony is reporting pre-inflated TFLOPS numbers. They themselves have stated up to 45% more performance, and multiple games are now showing only 30% more.

TBH, the why and the how don't matter. What matters are the results. MS fucked up the XSX design despite having 44% more CUs that likely cost 44% more. Those 44% more CUs didn't result in equivalent performance because they fucked up the clock speeds. Do I care why? Nah. It's a terribly engineered product, and I'm simply applying the same logic here. The results speak for themselves. If I were Cerny and I saw these results, I would've gone back to the drawing board. He had the budget this time around; Sony didn't limit him to $399 like they did with the PS4, PS4 Pro and even PS5. For $300-400 more he should've given it more VRAM bandwidth, added Infinity Cache, and honestly done whatever it takes to make it perform like RDNA 2 cards do, because they scale just fine with CUs, clocks and TFLOPS. When you make the same mistakes as MS engineers four years AFTER MS engineers made them, it shows you learned nothing.
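
To make the CU-count vs clock-speed point concrete, here is a back-of-the-envelope sketch using the public paper specs (XSX: 52 CUs at 1.825 GHz; PS5: 36 CUs at up to 2.23 GHz). FP32 compute for RDNA 2 is CUs × 64 lanes × 2 ops per clock; treat this as an illustration of the argument, not a performance model:

```python
# Why 44% more CUs does not mean 44% more compute when clocks differ.
# Paper specs: XSX 52 CUs @ 1.825 GHz, PS5 36 CUs @ 2.23 GHz (peak).
def fp32_tflops(cus: int, ghz: float) -> float:
    return cus * 64 * 2 * ghz / 1000  # 64 lanes/CU, 2 ops per clock (FMA)

xsx = fp32_tflops(52, 1.825)  # ~12.15 TF
ps5 = fp32_tflops(36, 2.23)   # ~10.28 TF

print(f"XSX has {52 / 36 - 1:.0%} more CUs than PS5")     # ~44%
print(f"but only {xsx / ps5 - 1:.0%} more FP32 compute")  # ~18%
```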
 

SlimySnake

Flashless at the Golden Globes
You know that isn't how it works.
05Mm54mFBkVdHt6nWNDBhVr-10..v1630529049.png
?? Your post literally shows that's how it works.

1706*960 = 1.63 million pixels.
1506*847 = 1.27 million pixels.
1280*720 = 0.92 million pixels.


To go from 720p to 960p will take 77% more GPU power; to go from 720p to 847p will take 38% more. Hence my 'in between' comment. You are literally proving what I'm saying. There is a reason why you are not getting 1440p DLSS Quality: you went from 1440p DLSS Performance to 1440p DLSS Balanced, because the GPU can't do much more than that.

Had the performance increased 1:1 like it does for RDNA 2 GPUs like the 6800, you would see it get much closer to DLSS Quality.
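
Since the same pixel arithmetic keeps coming up in this exchange, here is a minimal sketch of it, assuming (as the post does) that required GPU power scales roughly linearly with pixel count:

```python
# Pixel counts for the 16:9 resolutions being argued over.
resolutions = {
    "720p": 1280 * 720,   # 921,600
    "847p": 1506 * 847,   # ~1.28M
    "900p": 1600 * 900,   # 1,440,000
    "960p": 1706 * 960,   # ~1.64M
}

base = resolutions["720p"]
for name, pixels in resolutions.items():
    # Assumes GPU cost scales ~linearly with pixels rendered.
    print(f"720p -> {name}: {pixels / base - 1:+.0%} more pixels")
# 847p: ~+38%, 900p: ~+56%, 960p: ~+78%
```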
 

Mr Moose

Member
?? Your post literally shows that's how it works.

1706*960 = 1.63 million pixels.
1506*847 = 1.27 million pixels.
1280*720 = 0.92 million pixels.


To go from 720p to 960p will take 77% more GPU power; to go from 720p to 847p will take 38% more. Hence my 'in between' comment. You are literally proving what I'm saying. There is a reason why you are not getting 1440p DLSS Quality: you went from 1440p DLSS Performance to 1440p DLSS Balanced, because the GPU can't do much more than that.

Had the performance increased 1:1 like it does for RDNA 2 GPUs like the 6800, you would see it get much closer to DLSS Quality.
No, I mean the bolded part of your comment.
The fact that we are seeing 864p, which sits somewhere in the middle, means they can't even go from 720p to 900p.
 

Senua

Gold Member
A reviewer in my country said that Alan Wake 2 in his YouTube recordings looks like a native 864p image, but on his TV the image quality looks comparable to native 4K.
Lol nah. The drop from Balanced to Performance DLSS at 4K was very noticeable for me, and they're both higher than 864p internal. Only 1440p internal really trades blows with native 4K with TAA, especially in motion. Below that it's a compromise.
 

Luipadre

Gold Member
A reviewer in my country said that Alan Wake 2 in his YouTube recordings looks like a native 864p image, but on his TV the image quality looks comparable to native 4K.

Gamingtech also said that his footage does not do the PS5 Pro justice.



It almost does. Don't judge this shit based on YouTube; here are my impressions from the other thread:

"First quick impressions of AW2 performance mode. Anyone who says this is not a good upgrade is either blind or doesn't remember how it looked on the OG. I played it two weeks ago, so I know exactly how it looked. I'm playing on a 55" OLED TV, sitting about two meters from the screen.

There is still some shimmering on objects, yes, but turning off motion blur and film grain clears that up significantly. The image quality besides that is so much better. It looks crisp in both gameplay and cutscenes; you can see the increased fidelity everywhere because of this and the higher settings. Shadows are better, lighting looks better, ground detail and texture detail look better. So far I'm really happy with how this looks and runs on a VRR screen, and tbh I'm really impressed by PSSR, because this shit's internal res is like 850p? Anyway, death to FSR finally."
 

SlimySnake

Flashless at the Golden Globes
There is no 900p in FSR 1440p; it's 960p for Quality and 847p for Balanced. I would assume it's similar with DLSS and PSSR.
Sigh. It doesn't matter; they can scale up from any resolution. They are using 864p in Dragon Age and Alan Wake. I believe Star Wars Outlaws was also scaling up from that.

Pixels are pixels; they scale with GPU power. I thought we all knew this very basic thing. I guess I was wrong.
 

GymWolf

Gold Member
Did they test PS5 games that had bad framerate drops, like Returnal during the more hectic moments? That game was as locked to 60 as Kamala was a locked win in the election...
 

Mr Moose

Member
Sigh. It doesn't matter; they can scale up from any resolution. They are using 864p in Dragon Age and Alan Wake. I believe Star Wars Outlaws was also scaling up from that.

Pixels are pixels; they scale with GPU power. I thought we all knew this very basic thing. I guess I was wrong.
But if you are using DLSS/FSR and shit, it would always scale the same, wouldn't it? If it's set to 1440p, you'd only get those resolutions (960/847/720) unless it's using DRS?
Edit: It looks like it's gone from FSR 1440p Performance on PS5 to PSSR 1440p Balanced in Dragon Age.
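
For reference, the fixed preset divisors behind those numbers, as a small sketch; these are FSR 2's published per-axis scale factors (DLSS uses very similar ones, and the thread assumes PSSR does too):

```python
# Internal render height for fixed upscaler presets.
# FSR 2 per-axis scale factors: Quality 1.5x, Balanced 1.7x, Performance 2.0x.
PRESETS = {"Quality": 1.5, "Balanced": 1.7, "Performance": 2.0}

def internal_height(output_height: int, factor: float) -> int:
    return round(output_height / factor)

for output in (1440, 2160):
    internals = {name: internal_height(output, f) for name, f in PRESETS.items()}
    print(f"{output}p output -> {internals}")
# 1440p -> Quality: 960, Balanced: 847, Performance: 720
# 2160p -> Quality: 1440, Balanced: 1271, Performance: 1080
```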
 

midnightAI

Member
It is relevant. PSSR isn't automatically added to games; it is dependent on developers. If most games don't get updated to use PSSR, or the implementation is bad enough that it's not much better than FSR, then it's not worth it. For 90% of games, the boost mode performance is what you are getting.
You said 'if' it didn't have PSSR. It does, so there's no point adding 'ifs' to the conversation.
 

SlimySnake

Flashless at the Golden Globes
But if you are using DLSS/FSR and shit, it would always scale the same, wouldn't it? If it's set to 1440p, you'd only get those resolutions (960/847/720) unless it's using DRS?
Edit: It looks like it's gone from FSR 1440p Performance on PS5 to PSSR 1440p Balanced in Dragon Age.
They all work the same way: DLSS, FSR, PSSR. I use them interchangeably.

The bottom line is that there is a reason why they weren't able to do 4K DLSS Performance. They don't have 100% more GPU; they don't even have enough to get to DLSS Quality, or 960p. That's all there is to it. It's basic math, nothing more.
 

TrebleShot

Member
They all work the same way: DLSS, FSR, PSSR. I use them interchangeably.

The bottom line is that there is a reason why they weren't able to do 4K DLSS Performance. They don't have 100% more GPU; they don't even have enough to get to DLSS Quality, or 960p. That's all there is to it. It's basic math, nothing more.
DLSS and PSSR are similar.
FSR ain't
 

Markio128

Gold Member
I couldn't resist, so I had a quick go of three games: F1 2024, Stellar Blade and FF7:R. In summary: awesome.

F1 2024. It has really nice RT effects and runs buttery smooth. The only downside is a bit of breakup here and there in the distance, but it’s only noticeable if you’re looking out for it.

Stellar Blade. I had a five minute blast from the beginning using Pro mode and it’s simply chef’s kiss.

FF7:R. Bearing in mind I played this in quality mode at 30 fps on the base PS5, the Pro setting is immediately transformative. It looks super sharp and runs super smooth. Just a little bit of pop-in is noticeable on some of the vegetation.

Season 9 Premiere GIF by Curb Your Enthusiasm
 

RJMacready73

Simps for Amouranth
I've just restarted Cyberpunk and have only just got past the intro, basically where you get to Johnny. Any mention of Pro-enhanced specifics for this game? Seems all that GPU horsepower could make this game even more gorgeous than it already is, and all you bastards talking about the Pro are giving me FOMO. FFS, I might just cave in and get one.
 

winjer

Gold Member
I just don't agree because the data doesn't point to this. I have posted this many, many times, but here it goes again: the 10.6 TFLOPS 6600 XT vs the 16.2 TFLOPS 6800.

16.2/10.6 = ~53% more GPU power.
Results in a 54% average performance increase.

TtKOpWx.jpeg


Now, the PS4 Pro was indeed bottlenecked by the VRAM bandwidth; a mistake Cerny has made yet again. It was also bottlenecked by the Jaguar CPU, which is why we saw only SOTC and GoW offer 60 fps modes. But as I listed before, there were plenty of games where the GPU performed like it should.

The X1X was a much better-designed console, since MS gave it the bandwidth it needed. You had a 4x-5x increase in pixel counts as many 900p games went on to native 4K. The Jaguar CPU was the bottleneck and kept them from upping the framerate to 60 fps, but no one here would dare say that the ~4.5x increase in raw TFLOPS didn't result in 4-5x more performance. There are always going to be exceptions, but generally, more GPU power = more performance. The entire PC industry is built around this. We buy PCs based on these same metrics. If AMD released an RDNA 4 card tomorrow with 20 pre-inflated TFLOPS that performed like a 13 TFLOPS RDNA 2 card, we would be like wtf.

Nowadays, with inflated TFLOPS numbers, it's harder to compare, but Sony is reporting pre-inflated TFLOPS numbers. They themselves have stated up to 45% more performance, and multiple games are now showing only 30% more.

TBH, the why and the how don't matter. What matters are the results. MS fucked up the XSX design despite having 44% more CUs that likely cost 44% more. Those 44% more CUs didn't result in equivalent performance because they fucked up the clock speeds. Do I care why? Nah. It's a terribly engineered product, and I'm simply applying the same logic here. The results speak for themselves. If I were Cerny and I saw these results, I would've gone back to the drawing board. He had the budget this time around; Sony didn't limit him to $399 like they did with the PS4, PS4 Pro and even PS5. For $300-400 more he should've given it more VRAM bandwidth, added Infinity Cache, and honestly done whatever it takes to make it perform like RDNA 2 cards do, because they scale just fine with CUs, clocks and TFLOPS. When you make the same mistakes as MS engineers four years AFTER MS engineers made them, it shows you learned nothing.

And the curious thing is that there is faster GDDR6 on the market. Samsung makes memory that goes up to 24 Gbps. That would mean 768 GB/s.
I can't understand why Sony went with 18 Gbps memory. It's not like it would make a $700 console unfeasible.
Now gamers get 35% extra performance out of 67% more compute.
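
Those bandwidth figures fall straight out of bus width times per-pin data rate; a minimal sketch, assuming the Pro keeps the PS5's 256-bit bus (which is what the 576 GB/s spec implies):

```python
# Peak memory bandwidth (GB/s) = bus width (bits) * data rate (Gbps/pin) / 8.
def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits * gbps_per_pin / 8

print(bandwidth_gbs(256, 18))  # 576.0 GB/s - the 18 Gbps GDDR6 Sony shipped
print(bandwidth_gbs(256, 24))  # 768.0 GB/s - Samsung's 24 Gbps GDDR6
```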
 

Markio128

Gold Member
I've just restarted Cyberpunk and have only just got past the intro, basically where you get to Johnny. Any mention of Pro-enhanced specifics for this game? Seems all that GPU horsepower could make this game even more gorgeous than it already is, and all you bastards talking about the Pro are giving me FOMO. FFS, I might just cave in and get one.
Cyberpunk hasn't been patched, unfortunately, so as far as I know only the performance has been improved, by around 10%, due to boost mode. I'm not sure if the resolution is capped or not in Cyberpunk, so it may not receive a boost in that respect.
 

Gaiff

SBI’s Resident Gaslighter
And the curious thing is that there is faster GDDR6 on the market. Samsung makes memory that goes up to 24Gbps. That would mean 768GB/s.
I can' understand why Sony went with 18Gbps memory. It's not like it would make a 700$ console unfeasible.
Now gamers get 35% extra performance, out of 67% more compute.
That's unpatched though. It could be closer to 45% patched.

Also, I'm curious how the boost mode works. With the PS5 and PS4 Pro boost modes, the newer machines still used the same number of CUs, but the higher clocks provided a nice performance boost anyway. What I'm wondering is how these games on the Pro are able to get a 30% performance increase without anyone touching anything. It seems they're already using the extra CUs, because the Pro's clocks must be similar to those of the regular PS5, so we know it's unlikely to be due to a clock speed increase.
 

SlimySnake

Flashless at the Golden Globes
And the curious thing is that there is faster GDDR6 on the market. Samsung makes memory that goes up to 24 Gbps. That would mean 768 GB/s.
That's what I have on my 3080.
I can't understand why Sony went with 18 Gbps memory. It's not like it would make a $700 console unfeasible.
Now gamers get 35% extra performance out of 67% more compute.
Profit over everything. They probably told Cerny to target $399 and then sold it for $700.
 
Compared to PS5 games, yes, it looks great. But I was comparing it to 4K DLSS Quality. As soon as I turn it on, I can tell the image has been downgraded. It's still clean, but it definitely doesn't look native 4K anymore. There is also some far-distance detail that doesn't resolve as well as it does with DLSS Quality.

Regardless, 4K PSSR Performance is what Sony should've designed this thing around. 864p is 1440p DLSS Balanced territory, which simply isn't good enough for 4K TVs. You can see from FF7, Stellar Blade and other games that target a base resolution of 1080p-1440p that PSSR performs way better.

Unfortunately, and this is something I said over and over again in the Pro leaks thread, a lot of these next-gen-only games drop to 720p on the base PS5 to maintain 60 fps, and there just isn't enough raw GPU power to get them to a 1080p base resolution.
I do not know what DLSS games you have played, but the DLSS image does not look the same in every game. In some older games (e.g. RDR2), DLSS image quality can look much worse than FSR.

The latest DLSS 3.7.2 has improved image sharpness, especially during motion. I was blown away when I saw how good DLSS Performance (1080p internal) looks in the Horizon remaster on PC, and I can't imagine console players being unhappy with such razor-sharp image quality on their TV.

There's hope that PSSR (PS5 Pro) can achieve a similarly razor-sharp image in 60 fps/performance modes. The PS5 Pro has just been released, and PSSR image quality will only improve.
 

SlimySnake

Flashless at the Golden Globes
I do not know what DLSS games you have played, but the DLSS image does not look the same in every game. In some older games (e.g. RDR2), DLSS image quality can look much worse than FSR.

The latest DLSS 3.7.2 has improved image sharpness, especially during motion. I was blown away when I saw how good DLSS Performance (1080p internal) looks in the Horizon remaster on PC, and I can't imagine console players being unhappy with such razor-sharp image quality on their TV.

There's hope that PSSR (PS5 Pro) can achieve a similarly razor-sharp image in 60 fps/performance modes. The PS5 Pro has just been released, and PSSR image quality will only improve.
I've been playing DLSS games since early 2019. DLSS Quality always looks better, no matter what game.

DLSS/PSSR 4K Performance will be better than the crap PS5 owners have had to deal with, but I'm not talking about that. I was comparing DLSS Quality to Performance.
 

GoldenEye98

posts news as their odd job
My issue is just the price. This is in essence a PS4 Pro for this gen, essentially boosting the resolution/fps of PS5 games, but at a much higher premium than the 4 Pro was.

$400 in 2016 is the equivalent of $520 now. Even if they had gone $550 or $599, I think the value of this thing would feel different.
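
A quick check of that inflation figure, as a sketch; it assumes a flat average annual rate and solves for the rate implied by the poster's own $400 to $520 numbers over 2016-2024:

```python
# Implied average annual inflation if $400 (2016) ~ $520 (2024).
years = 2024 - 2016
rate = (520 / 400) ** (1 / years) - 1
print(f"implied average inflation: {rate:.1%}/yr")          # ~3.3%/yr
print(f"$400 in 2016 -> ${400 * (1 + rate) ** years:.0f}")  # ~$520
```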
 

Fafalada

Fafracer forever
I just don't agree because the data doesn't point to this.
It totally does - you even agreed to it yourself:
Now, the PS4 Pro was indeed bottlenecked by the VRAM bandwidth.
MS fucked up the XSX design despite having 44% more CUs that likely cost 44% more. Those 44% more CUs didn't result in equivalent performance


I have posted this many, many times, but here it goes again: the 10.6 TFLOPS 6600 XT vs the 16.2 TFLOPS 6800.
But you're not comparing just TFlops here.
The 6800 has 60% more texture fillrate, 90% more ROPs, 90% more RT throughput, and double the RAM bandwidth.
That's the point - GPU performance doesn't scale linearly with TFlops - you need to scale the other parts of the processing complex too.

Same with the PS4 vs X1 comparison - where, in addition to 50% more TFlops, we had twice the geometry rate, twice the ROPs, nearly twice the bandwidth and 8x more async compute units, among other things. That's also why we literally saw some PS4 games that doubled X1 performance - albeit not as often. But the delta was also pretty consistently larger than 50% (most 900p games on X1 run substantially worse than their PS4 counterparts at 1080p, and/or have worse settings).

Or indeed the 1X, which trounced the PS4 Pro head-to-head nearly every time, despite having some on-paper disadvantages (such as only half the ROPs, or lacking 16-bit dual-issue).
Also, PC comparisons are far less interesting in terms of utilising any GPU for all it can do - so the scaling is simpler (but still not necessarily linear, as noted above).

It was also bottlenecked by the Jaguar CPU
No, it really was not. Both the Pro and 1X were designed to be 4K versions of their base-spec machines. That was by design - no one was looking to sell them as '60fps versions of the base machine', so the CPUs were exactly balanced for the job at hand. 30% more power than the base machine meant both 'Pro' consoles basically eliminated most of the CPU-related performance issues on the base consoles (that's why almost all comparisons end up with nice flat lines on Pro/1X - except for games that push the GPU a bit too hard - usually on the X trying to do native 4K where it really shouldn't have).

Now - would I, as a user, have liked a better CPU? Sure - but that's not what the manufacturers built to sell. Just like the PS2 had a fairly shit CPU even when it launched, as it was primarily there to drive an improved version of PS1/N64-era games, just like the DC. So yeah, it ended up being CPU-bottlenecked to hell for most of the gen - but so was every other console that gen except the Xbox.


They themselves have stated up to 45% more performance, and multiple games are now showing only 30% more.
But that's unpatched games - the PS4 Pro was stuck in the 30-40% range there too.
Once we see something like Spider-Man tested (which has unlocked framerates on a VRR panel and can still run the original resolution modes), we'll have a better baseline for comparison. And you may still be proven right - but let's see when the data rolls out.
 
I've been playing DLSS games since early 2019. DLSS Quality always looks better, no matter what game.

DLSS/PSSR 4K Performance will be better than the crap PS5 owners have had to deal with, but I'm not talking about that. I was comparing DLSS Quality to Performance.
Unlike you, I'm new to this DLSS technology, but I've played quite a few DLSS games since I upgraded my GPU back in August. I was very disappointed at first, because people on this forum said that DLSS Quality offered better image quality than TAA, but that was not my experience. I played a number of games where the TAA image quality was significantly better than DLSS Quality and even DLAA. The biggest problem with the DLSS image was the sharpness fluctuation during motion. The DLSS and DLAA image looked sharp (like a downsampled image) when static, but in motion there was very noticeable sharpness fluctuation. Only when I stopped moving the camera did the sharpness start building up, to the point where it looked like a 4K image to my eyes. To fix this sharpness fluctuation, it was necessary to use DLSS Balanced in combination with DLDSR 2.25x (downsampling). Only then was I fully satisfied with the DLSS image quality.

As for RDR2, the DLSS implementation in this game is so bad that even the FSR image quality looks significantly better. The DLSS image in this game not only looks softer, but also uses a strong sharpening filter during motion (unlike FSR or native TAA). I guess Rockstar tried to hide the sharpness fluctuation I was talking about, but they did a very poor job.

With the latest DLSS 3.7.2, however, this sharpness fluctuation has been greatly improved, to the point where I can finally say that DLSS Q or DLAA provides a superior image compared to TAA, even without DSR. The sharpness of the static image has also improved, to the point where even DLSS Performance looks like a 4K image. Native TAA still looks a bit better than DLSS P (DLSS P has more noise around moving hair or grass), but in terms of sharpness DLSS P looks great, and most gamers would easily be fooled into thinking they were looking at a native 4K image.
 

SKYF@ll

Member
And the curious thing is that there is faster GDDR6 on the market. Samsung makes memory that goes up to 24 Gbps. That would mean 768 GB/s.
I can't understand why Sony went with 18 Gbps memory. It's not like it would make a $700 console unfeasible.
Now gamers get 35% extra performance out of 67% more compute.
It seems that the PS5 Pro's $700 price tag is intended not only to account for inflation, but also to deter resellers.
The plan is likely to sell it for $599 a year from now and still make a profit.
I think a PS5 Slim 2 will be available for $399 around that time.
What Sony needs to do before the PS6 is released is reduce costs.
 

sachos

Member
That's because those games are upscaling from a much higher resolution. Even DLSS 4K Quality >>>> DLSS 1440p Performance, because 4K Quality reconstructs from 1440p while 1440p Performance scales up from 720p.

You NEED to up the base resolution to at least 1080p. 4K DLSS Performance is kinda soft but mostly clean; anything below that and even DLSS has issues. I mostly just reduce the settings or turn off ray tracing in order to play games at DLSS Quality or DLSS Balanced, because even 4K DLSS Performance is too blurry for me.

AI is not magic. Everyone who games on PC knows this; you need raw GPU power on top of AI upscaling. 45% more power gets you from 720p to about 900p at best. They are using 864p, which is ~44% more raw pixels. Just not enough.

AW2 also has a shimmering issue even on PC, related to the post-processing setting, which is set to Low on consoles. You need to set it to High on PC, because even at Medium it's a shimmering mess. Problem is that even going from Medium to High costs 20-25% performance. They'd have to go from Low to High on the Pro, and there just isn't enough GPU there to increase the resolution AND this setting.

To be fair, I had to play this game locked at 40 fps because that setting pretty much killed my framerate, and without it the game looked like a shimmering mess, so even on PC it's a trade-off. At least on PC I was able to adjust this setting manually; you can't do that even in the quality mode on the Pro. Again, not enough GPU power.
Yeah, I know. I think I'm just hardstuck on that AW2 TechPowerUp benchmark that seemingly shows the 7700 XT doing way better than what Remedy is showing for the PS5 Pro (although we don't know what they used for testing). I really want to see DF compare PC GPUs vs the Pro in this game and see what they find. Maybe it is underperforming, like the Pro is in Elden Ring vs that RX 6800.
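
One more back-of-the-envelope sketch tying the quoted argument together: if frame cost scales roughly linearly with pixel count, output height only grows with the square root of extra GPU power. This is a simplifying assumption (real scaling is messier, as Fafalada argues above), but it shows why ~45% more GPU lands near the 864p figure rather than 1080p:

```python
import math

# If GPU cost ~ pixel count, a power multiplier m lifts each axis by sqrt(m).
def scaled_height(base_height: int, power_multiplier: float) -> int:
    return round(base_height * math.sqrt(power_multiplier))

print(scaled_height(720, 1.45))  # ~867p from +45% power - near the 864p figure
print(scaled_height(720, 1.30))  # ~821p from +30% power
print(scaled_height(720, 2.25))  # 1080p needs +125% power
```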
 

Radical_3d

Member
Aren't we overdue for a deep dive on some of the games? Pro week has been just one impressions video so far. Am I being a slave driver for demanding more hard work?
christoph waltz nod GIF
 
1st impressions:

  • It's a lot smaller than I expected. Nice!
  • Controller feels smaller too, or certainly lighter? The D-pad feels a little cheaper; less movement there, I feel.
  • Oh shit, the visuals for DA Performance Mode are so much better! I am really pleased with what I'm seeing here.

:)
 

DenchDeckard

Moderated wildly


I was just giving you shit but you'll turn it into endless drama anyway. Pro ain't for you. That's all you have to say. That ain't trolling.

Now going from "not for me" to this.....

MWTTwV1.png



Astro Bot all over again.

Sarcastic Sam Smith GIF by Apple Music

I adored Astro Bot; I just was very upset by how they made Horizon the last game (I hate that franchise and couldn't believe they used that as the pinnacle of Sony games over the last 30 years!). But anyway, all out of my system now; I still love that game and it's my second-place game of the year.

PS5 Pro: I actually really want it, just not at the price it is. If it was 699 with everything included, I would have 100% bit.

Just waiting for a sale. Yes, I can exaggerate towards Sony when complaining sometimes, due to my disdain for their most hardcore fans, but you guys know this now and I am trying to be more detailed with my criticisms. It's not like I don't criticise other companies. I feel Sony has made a lot of slip-ups this gen, which level-headed posters highlight too. Others just shout about how much they are winning and how amazing they are, and that concerns me, as they literally accept anything out of pocket.
 