BlackTron
Member
Now they have the hardware, writing a software image reconstruction algorithm that performs better than DLSS is trivial.
Trivial? Really?
Now they have the hardware, writing a software image reconstruction algorithm that performs better than DLSS is trivial.
Don't need to wait. 45% better rendering and 28% more bandwidth won't allow anything to claw back 200% better rendering and 125% more bandwidth. Even now, you have games like TLOU Part I where the 4090 blows past 90fps at native 4K despite being horribly optimized on PC whereas the Pro needs to upscale from 1440p to hit 60.

We will see once games like Wolverine and Death Stranding 2 release on both. Of course, the PC upgrade treadmill is never ending, so eventually you will brute force 100+ frames, but the spending is endless too lol
There's no point in comparing a monster $1600-2000 GPU to a console.
Not saying it's impossible - but we're talking 300 INT8 TOPs hardware which is like - double what the Turing series did, and even double the likes of the 3070, and well above the 3080. It doesn't look like it would need inference optimization on that scale given what Nvidia got out of those ops.

Definitely looks to be doing what the patent we looked at described, meaning if DF are pixel counting, then it is the cheaper non-AI inference hole filling they are counting at those resolution numbers.
What can I say? That bespoke crate generation chip on the Pro was no exaggeration. Cerny meant business.

And yet the monster fails the very simple time-to-crate metric popularized by Old Man Murray.
What can I say?
Pay attention Gaiff, cause you're not thinking about this logically. Everyone knows 120 solid is better than 60. HITTING 120 CONSISTENTLY is the problem. Without hitting 120 consistently, the spikes in frame pacing can make it feel like Bloodborne's stutter. It's better to aim for a perfectly solid 60 than feel the struggle reaching for 120.

Another predictably dishonest take from you. Now 120fps and 60fps are a tie because of bullshit reasons you made up.
I don't get it. I'm getting the PS5 Pro because I like PS5 and want to play it in the best light, not because I seriously think it is going to compare to a GPU that is in a PC build that is a good four to five times more expensive.

Don't need to wait. 45% better rendering and 28% more bandwidth won't allow anything to claw back 200% better rendering and 125% more bandwidth. Even now, you have games like TLOU Part I where the 4090 blows past 90fps at native 4K despite being horribly optimized on PC whereas the Pro needs to upscale from 1440p to hit 60.
The PS5 Pro will do very well, possibly even better than something like an RTX 4070 in some titles and even better than an RTX 4070S in first-party ones, but the 4090 is twice the power of the 4070, which the Pro is often compared to (and the 4070 on paper at least is a chunk above). The Pro beating the 4090 would be akin to the regular PS5 beating the 3090, which is wholly impossible.
This talk is seriously asinine. There's no point in comparing a monster $1600-2000 GPU to a console. Yes, it's the shiny new toy, yes, it does cool stuff, but let's not turn into idiots and believe in fantasies.
Pay attention Gaiff, cause you're not thinking about this logically. Everyone knows 120 solid is better than 60. HITTING 120 CONSISTENTLY is the problem. Without hitting 120 consistently, the spikes in frame pacing can make it feel like Bloodborne's stutter. It's better to aim for a perfectly solid 60 than feel the struggle reaching for 120.
Depends on how stable the framerate is; wild swings still feel bad even with VRR, you can detect the unevenness of the frame pacing. It feels really bad to drop from 120 to 80, for example.

With VRR there is no problem with a frame rate between 60-120fps; it's better than locked 60.
Check out Sony games like GoW Ragnarok or Uncharted that have unlocked frame rates above 60. Less input lag and it's smoother.
This entirely depends on the game and has nothing to do with not being able to hit 120fps consistently. You can have an fps cap and hit 60 consistently without having even frame pacing. Frame rate is merely the number of frames per second, but our eyes perceive much more than that. You can have 60 frames every second, but it doesn't mean that they're delivered every 16.66ms. You could have one frame at 8.33ms and then another at 33.32ms, and it could be like that for a whole second and repeat consistently. Your frame rate would still be 60fps, but it would be stuttery as hell. With VRR, this is massively mitigated. For instance, GOWR on PS5 has an unlocked high frame rate mode that generally hovers around 70-90fps. Doesn't come close to 120, but it's still incredibly smooth with very even frame times and no massive spikes.

Pay attention Gaiff, cause you're not thinking about this logically. Everyone knows 120 solid is better than 60. HITTING 120 CONSISTENTLY is the problem. Without hitting 120 consistently, the spikes in frame pacing can make it feel like Bloodborne's stutter. It's better to aim for a perfectly solid 60 than feel the struggle reaching for 120.
This entirely depends on the game and has nothing to do with not being able to hit 120fps consistently. You can have an fps cap and hit 60 consistently without having an even frame pacing. Frame rate is merely the number of frames per second, but our eyes perceive many more than that. You can have 60 frames every second, but it doesn't mean that they're delivered every 16.66ms. You could have 1 frame at 8.33ms and then another at 33.32 and it could be like that for a whole second and repeat consistently. Your frame rate would still be 60fps, but it would be stuttery as hell. With VRR, this is massively mitigated. For instance, GOWR on PS5 has an unlocked high frame rate mode that generally hovers around 70-90fps. Doesn't come close to 120, but it's still incredibly smooth with very even frame times and no massive spikes.
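The frame-time arithmetic in that post can be sketched in a few lines of Python. The traces below use the hypothetical numbers from the post, not measurements from any game:

```python
# Two one-second frame-time traces that both count as "60fps" on an
# fps counter, yet feel completely different to play.
even_ms = [16.66] * 60           # perfectly paced: a frame every 16.66ms
uneven_ms = [8.33, 24.99] * 30   # alternating fast/slow frames, same total time

def fps(frame_times_ms):
    """Frames delivered in roughly one second of wall time."""
    return len(frame_times_ms)

def worst_spike_ms(frame_times_ms):
    """Largest jump between consecutive frame times (perceived stutter)."""
    return max(abs(a - b) for a, b in zip(frame_times_ms, frame_times_ms[1:]))

print(fps(even_ms), fps(uneven_ms))   # both report 60 "fps"
print(worst_spike_ms(even_ms))        # 0.0 -> smooth
print(worst_spike_ms(uneven_ms))      # ~16.66 -> stuttery despite "60fps"
```

Both lists sum to roughly 1000ms, so a frame-rate counter can't tell them apart; only the frame-time deltas expose the stutter.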
Bloodborne hits 30 consistently but has awful frame pacing, so it feels terrible anyway. Not being able to hit 120 consistently in no way means you will have bad frame pacing, and it's certainly not better to lock your fps to 60 if you have a VRR display.
Edit: Hell, Rift Apart actually has an unlocked frame rate mode where it uses DRS and sits between 1080p-1440p and generally around 80fps. Still plays well without stutters or frame time spikes.
Pay attention Gaiff, cause you're not thinking about this logically. Everyone knows 120 solid is better than 60. HITTING 120 CONSISTENTLY is the problem. Without hitting 120 consistently, the spikes in frame pacing can make it feel like Bloodborne's stutter. It's better to aim for a perfectly solid 60 than feel the struggle reaching for 120.
Wrong again. Here's DF on why VRR doesn't save the day in the exact scenario I'm describing: inconsistently delivered frames.
You keep taking things I say and making the dumbest generalizations about them. When I say 60fps solid, OF FUCKING COURSE I mean frames perfectly timed apart from one another. Why would I mean anything different?
Then stop saying nonsense. The video you linked specifically mentions that if the game has bad frame pacing to begin with, VRR won't save you from that, which is exactly what I described. If the game has bad frame pacing, a 60fps cap also won't save you. In this specific debate, Rift Apart doesn't have bad frame pacing, so there is absolutely no issue with going above 60fps without hitting 120. However, if you try that without VRR, you will get screen tearing and stutters.

Do I look like a bitch to you?
Then stop trying to fuck me like one.
If the frame pacing is good, yes. In a game like Jedi Survivor, it would still suck.

a framerate from 90 to 120 fps with VRR is perceptibly nearly perfectly smooth
If the frame pacing is good, yes. In a game like Jedi Survivor, it would still suck.
This entirely depends on the game and has nothing to do with not being able to hit 120fps consistently. You can have an fps cap and hit 60 consistently without having an even frame pacing. Frame rate is merely the number of frames per second, but our eyes perceive many more than that. You can have 60 frames every second, but it doesn't mean that they're delivered every 16.66ms. You could have 1 frame at 8.33ms and then another at 33.32 and it could be like that for a whole second and repeat consistently. Your frame rate would still be 60fps, but it would be stuttery as hell. With VRR, this is massively mitigated. For instance, GOWR on PS5 has an unlocked high frame rate mode that generally hovers around 70-90fps. Doesn't come close to 120, but it's still incredibly smooth with very even frame times and no massive spikes.
Bloodborne hits 30 consistently but has awful frame pacing, so it feels terrible anyway. Not being able to hit 120 consistently in no way means you will have bad frame pacing, and it's certainly not better to lock your fps to 60 if you have a VRR display.
Edit: Hell, Rift Apart actually has an unlocked frame rate mode where it uses DRS and sits between 1080p-1440p and generally around 80fps. Still plays well without stutters or frame time spikes.
Most PS5 exclusives have a high frame rate mode on VRR displays at 120Hz that hits above 60fps but way below 120, and it is often the smoothest experience.
This.

VRR does not help alleviate frame time spikes. It only eliminates screen tearing, which reduces the perceptual window where inconsistent frametimes become bothersome. A steady 60fps is more pleasant than constantly bouncing between 80-120 fps, and a steady 120fps is better than a steady 60fps.
I should mention that going from 80-120fps is better than a steady 60fps if the change is gradual. But you can really feel it when the frametimes vary by more than 2-3ms between frames, even with VRR enabled.

This.
I think the one thing PS5 Pro might have going for it is a better optimised experience in some games but yeah, comparing a Pro to a 4090 on any level is ridiculous.

Don't need to wait. 45% better rendering and 28% more bandwidth won't allow anything to claw back 200% better rendering and 125% more bandwidth. Even now, you have games like TLOU Part I where the 4090 blows past 90fps at native 4K despite being horribly optimized on PC whereas the Pro needs to upscale from 1440p to hit 60.
The PS5 Pro will do very well, possibly even better than something like an RTX 4070 in some titles and even better than an RTX 4070S in first-party ones, but the 4090 is twice the power of the 4070, which the Pro is often compared to (and the 4070 on paper at least is a chunk above). The Pro beating the 4090 would be akin to the regular PS5 beating the 3090, which is wholly impossible.
This talk is seriously asinine. There's no point in comparing a monster $1600-2000 GPU to a console. Yes, it's the shiny new toy, yes, it does cool stuff, but let's not turn into idiots and believe in fantasies.
Sssshh, that's how these companies make money.

Man, between the PS5 Pro and Switch 2 threads, you can really tell who has no idea whatsoever how hardware and software work.
Comparing graphics with the 4090 is a fool's errand. Most posters here largely mix up graphics settings with upscaling characteristics. Fur, AF, ray tracing, crowd size, etc. are graphics settings.

Taking a closer look at this comparison shot, the PS5 Pro version has an additional ray-traced shadow to the right of the frame (green arrow), as well as additional foliage in the foreground (yellow arrow). The anti-aliasing on the 4090 (purple arrows) is also of lower quality in comparison to the Pro.
So Nvidia, who are powering the data centres of the world, whose deep learning knowledge runs through their engineers, software and tech, who kicked off the whole AI upscale boom, and who have been solely about graphics since their inception, have somehow been upstaged by hitherto unseen, untested and, for now, publicly unknown AMD magic sauce for Sony.
Give your head a shake. Nvidia and their technology, software and engineers are powering the AI revolution, you muppet; what algorithms and deep learning models don't use tensors? You know as much about how PSSR works and performs as Kermit the Frog, and so does everyone else here on the forum, including me. Nada. But we do know Nvidia are out there with proven tech we can test and see now, tech they've been updating and advancing for several years.
Will PSSR be better than DLSS? We don't know, but it's highly unlikely given the time lead of DLSS, with powerful help from the tensor cores and several years of their engineers improving the tech.
Motion flow and checkerboard knowledge and some wizard engineers will help, but you seriously think they will usurp a technology that's several iterations deep and has the most powerful AI upscale technologies behind it? Lol.
I have a PS5 and will be buying a Pro. I also have a 4090. I love games and hope PSSR is fantastic, but I also don't go for childish warring. PSSR will not beat DLSS on all the evidence available to you and me.
Trivial? Really?
What base resolution are they using here on the 4090? We know AI upscaling quality is defined by the native resolution (like the other upscaling techniques). We'll have definite answers when we know for sure both are using the same native resolution, if we want to judge the quality of their AI upscaling objectively. If you want to judge the power of the GPU, just use a native resolution on PC with settings set to max, and a 4090 will obviously beat a 4070-like GPU.

Here, there is a clear difference.
DLSS is better here. The lines on the floor around the feet of the Goon-4-Less soldier are just gone on the Pro, but this could be due to the much higher AF on PC (4x on PS5). Otherwise, DLSS is simply way better at reconstructing details. Ratchet's body is much more defined, the strands of fur on his tail are more visible, and everything just looks cleaner and higher res with DLSS. I honestly thought this was DLAA, but given the context of the video, it's more likely DLSS Quality.
Yeah, so exactly like I said? However, if you try that without VRR, you will get screen tearing and stutters.

VRR does not help alleviate frame time spikes. It only eliminates screen tearing which reduces the perceptual window where inconsistent frametimes become bothersome.
No, this is completely false. PlayStation has many games with unlocked fps that often bounce around between 70-90, and they are often described as the best performing and feeling mode. What year is this, 2008? You have people on this very site playing at high frame rates in tons of games and almost none of them lock the fps to 60. They let it go above without necessarily hitting 120. Not hitting 120fps does NOT mean you will get massive frame time spikes that will result in terrible stutters and a bad experience. As I said, bad frame times will happen regardless of whether or not a given cap is consistently hit. Jedi Survivor and Bloodborne are prime examples of this.

A steady 60fps is more pleasant than constantly bouncing between 80-120 fps, and a steady 120fps is better than a steady 60fps.
The reality, however, is that we're mostly looking at a performance between 80 to 90 frames per second in most scenarios. It can occasionally jump above and below this point, of course, but by and large, this is what you'll get during gameplay. Compared to 60fps, it's a significant jump, allowing more responsive and fluid gameplay. However, there's a catch - while you can enable this mode on any 120hz capable display, if you cannot utilize VRR, I would strongly suggest sticking with 60fps instead due to judder. Of course, if you can use VRR, this quickly becomes my preferred graphical mode. It really shines at such a high frame-rate and it's unlikely we'd have had access to this option if not for the fact that it's a cross-gen release.
The 4090 would have a base resolution of 1440p and upscale to 4K. And of course, it's better than DLSS1 and current FSR for that matter.

What base resolution are they using here on the 4090? We know AI upscaling quality is defined by the native resolution (like the other upscaling techniques). We'll have definite answers when we know for sure both are using the same native resolution, if we want to judge the quality of their AI upscaling objectively. If you want to judge the power of the GPU, just use a native resolution on PC with settings set to max, and a 4090 will obviously beat a 4070-like GPU.
But one thing is sure: if we start needing 4x zooms to see a difference, it means PSSR is doing pretty well and is likely much better than DLSS1.
I just checked the Ratchet comparisons. They are comparing DLAA at native 4K vs PSSR at about native 1440p and said DLSS is still "king". Oh, you think they were not going to do that kind of dishonest comparison?...
The 4090 would have a base resolution of 1440p and upscale to 4K. And of course, it's better than DLSS1 and current FSR for that matter.
I think (with the poster's later clarification) that 80-120fps doesn't feel great if the FPS is bouncing between those two extremes with regularity, as in within seconds of each other. But something like that is quite rare; usually the fps shouldn't fluctuate that badly and the change in fps should be far more gradual.

No, this is completely false. PlayStation has many games with unlocked fps that often bounce around between 70-90 and are often described as the best performing and feeling mode. What year is this, 2008? You have people on this very site playing at high frame rates in tons of games and almost none of them lock the fps to 60. They let it go above without necessarily hitting 120. Not hitting 120fps does NOT mean you will get massive frame time spikes that will result in terrible stutters and a bad experience. As I said, bad frame times will happen regardless of whether or not a given cap is consistently hit. Jedi Survivor and Bloodborne are prime examples of this.
This is how John describes the high frame rate mode in GOWR
The original DF comparison was DLSS Quality vs PSSR, and PSSR was actually rendering at the higher resolution thanks to DRS. I think the comparison there was done to death, with the conclusion being that DLSS is slightly better but the image quality is close overall.

I just checked the Ratchet comparisons. They are comparing DLAA at native 4K vs PSSR at native 1440p and said DLSS is still "king". Oh, you think they were not going to do that kind of dishonest comparison?
In the IGN video? It’s definitely DLSS. If it’s DLAA, they say as much. 22:35.

I just checked the Ratchet comparisons. They are comparing DLAA at native 4K vs PSSR at native 1440p and said DLSS is still "king". Oh, you think they were not going to do that kind of dishonest comparison?
This would make more sense, but most games do not have such massive load shifts where the frame rate suddenly and constantly goes up to 120 and then down to 80. Your frame rates will usually fall within a certain window.

I think (with the poster's later clarification) that 80-120fps doesn't feel great if the FPS is bouncing between those two extremes with regularity, as in within seconds of each other. But something like that is quite rare; usually the fps shouldn't fluctuate that badly and the change in fps should be far more gradual.
That is one of the most Dunning-Kruger-screaming pieces of text I've come across in quite a while...

DLSS is an algorithm. How it works is not hidden from anyone. All the major technologists in this field working on image reconstruction tech know how it works, just as they know how FSR works, and they'll eventually know how PSSR works once it's out in the wild and the documentation is in the hands of devs.
There's nothing specifically special about looking at an algorithm, understanding how it works and therefore how to make it better. Any engineer worth their socks is able to do this. What has held back FSR up until now hasn't been the software algorithm part of things, rather the lack of dedicated hardware support in silicon, e.g. tensor cores.
With PS5 Pro, AMD has closed the gap with dedicated hardware support for AI computation (i.e. low-precision matrix math arrays with large registers and a reasonable amount of on-die cache). Now they have the hardware, writing a software image reconstruction algorithm that performs better than DLSS is trivial.
It would not surprise me if DLSS is beaten by PSSR, and then the next iteration of DLSS comes out soon after and beats PSSR. Technology is always evolving, and what is implemented in actual processors in the wild is nothing close to the cutting edge in the research domain. It ALWAYS lags behind.
In the IGN video? It’s definitely DLSS. If it’s DLAA, they say as much. 22:35.
There was another video where they indeed compared DLAA with PSSR in the OP.
This would make more sense, but most games do not have such massive load shifts where the frame rate suddenly and constantly goes up to 120 and then down to 80. Your frame rates will usually fall within a certain window.
Not sure about the difference between DLAA 4K and DLSS using a native 4K. But Ratchet is running at native 4K on the 4090 in the DF x IGN video (while it's about 1440p native on PS5 Pro), which easily explains why there are more details on the leaves, on Ratchet, and elsewhere. Native 4K is 125% more pixels than 1440p. This is the 4090's sheer power talking here, not the quality of AI upscaling. And I bet they also used DLSS with a native resolution of 4K in all the other comparisons, like in Horizon.
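For what it's worth, the "125% more pixels" figure above is just pixel-count arithmetic, easy to check in a line of Python:

```python
# Native 4K vs 1440p pixel counts -- the "125% more pixels" claim above.
px_4k = 3840 * 2160      # 8,294,400 pixels
px_1440p = 2560 * 1440   # 3,686,400 pixels

ratio = px_4k / px_1440p
print(ratio)             # 2.25, i.e. 4K has 125% more pixels than 1440p
```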
Forget DLAA. I see you have been tricked too by all their editorial ploys.

He repeatedly says DLSS though, and the tag says DLSS. They never use DLAA and DLSS interchangeably. DLAA is native resolution with AA applied on top; it's not upscaling, which is what DLSS is. They're comparing the different solutions, and DLAA and DLSS aren't the same thing.
It would be a first if they tried to pass off DLAA as DLSS.
Then it’s not DLSS and is a blatant lie, assuming you are correct. DLAA is DLSS without the upscaling part, y’know, the part where they reduce the input resolution. That’d be quite shocking, as DF often uses DLAA in comparison shots and always says it’s DLAA.

Forget DLAA. I see you have been tricked too by all their editorial ploys.
I just pixel counted the Ratchet footage and they somehow used a native resolution of 4K with DLSS, based on the Ratchet comparison I timestamped. Do you understand the trickery? They cleverly never say DLSS Quality (so from 1440p) here, of course. They set DLSS with a native resolution of 4K so they could say "DLSS" and make people believe the comparison was honest. But it's obviously not.
Correction here: DLSS is completely black-boxed. Nobody outside Nvidia knows how it works exactly. While Nvidia has explained the general principles of how DLSS works, and it is known to utilise temporal upscaling, the specific algorithms, training data, and model details are proprietary.

DLSS is an algorithm. How it works is not hidden from anyone.
Are you sure you're counting the input and not the output resolution? You would only be able to get the input resolution from elements that are not upscaled.

I just pixel counted the Ratchet footage and they somehow used a native resolution of 4K with DLSS, based on the Ratchet comparison I timestamped. Do you understand the trickery? They cleverly never say DLSS Quality (so from 1440p) here, of course. They set DLSS with a native resolution of 4K so they could say "DLSS" and make people believe the comparison was honest. But it's obviously not.
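For anyone following along: the pixel counting being argued about estimates a game's internal render resolution from the length of aliasing stair-steps along a geometric edge, and it only works on elements the upscaler hasn't reconstructed, which is exactly the catch raised above. A toy sketch of the idea, with made-up numbers (not DF's actual tooling):

```python
# Toy sketch of edge pixel counting (hypothetical numbers, not DF's method).
# On an aliased edge, the stair-step pattern repeats roughly every
# (output_width / internal_width) pixels, so measuring the average step
# length in a screenshot lets you back out the internal resolution.
def estimate_internal_width(output_width_px, avg_step_len_px):
    return round(output_width_px / avg_step_len_px)

print(estimate_internal_width(3840, 1.0))   # 3840 -> native 4K-wide image
print(estimate_internal_width(3840, 1.5))   # 2560 -> a 1440p-class internal res
```

If the edge you measure was already cleaned up by the upscaler, the steps you count reflect the output resolution, not the input, which is why counting "native 4K" on DLSS output proves little by itself.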
Technically, DLAA does use upscaling to reach a higher-than-native resolution, which then gets downsampled. It's like SSAA with an upscaled image as the base.

Then it’s not DLSS and is a blatant lie, assuming you are correct. DLAA is DLSS without the upscaling part, y’know, the part where they reduce the input resolution. That’d be quite shocking, as DF often uses DLAA in comparison shots and always says it’s DLAA.
Edit: What about Horizon?
Isn't that technically downscaling?

Technically DLAA does use upscaling to reach the higher than native resolution, which then gets downsampled. It's like SSAA with an upscaled image as the base.
The patent I read explicitly mentioned VR (i.e. high frame-rate, low-latency local real-time graphics), and the visualization @ 33:30 in the State of Play is intentionally brief and rolling off-axis, IMHO, to visualize accurately but also continue to obscure the exact inner workings of the algorithm to non-PSSR engineers. The visualization also appears to be an in-place technique rather than a scaler like the hole filling patent describes, and IIRC no one from Sony or the leaks has described it as an ML upscaler.

Not saying it's impossible - but we're talking 300 INT8 TOPs hardware which is like - double what Turing series did, and even double the likes of 3070, and well above 3080. It doesn't look like it would need inference optimization on that scale given what NVidia got out of those ops.
Besides - I still say that patent was describing reconstruction of remote-rendered video-streams, not locally rendered content, ie. it would be aimed at very compute constrained devices (not just AI, but also general compute).
3080 has around 500 INT8 TOPS. The Pro, based on RDNA4, is utilising sparsity for that TOPS figure. The Pro is close to a 3060 Ti in terms of TOPS.

Not saying it's impossible - but we're talking 300 INT8 TOPs hardware which is like - double what Turing series did, and even double the likes of 3070, and well above 3080. It doesn't look like it would need inference optimization on that scale given what NVidia got out of those ops.
Besides - I still say that patent was describing reconstruction of remote-rendered video-streams, not locally rendered content, ie. it would be aimed at very compute constrained devices (not just AI, but also general compute).
As far as I know, DLAA is basically regular TAA with an ML model on top of it to enhance image quality, hence the cost being only a few percent higher than regular TAA.

Technically DLAA does use upscaling to reach the higher than native resolution, which then gets downsampled. It's like SSAA with an upscaled image as the base.
You're right, I was confusing DLDSR with DLAA. DLAA is DLSS with the input and output resolutions the same, while DLDSR upscales to higher-than-native output resolutions.

As far as I know, DLAA is basically regular TAA with an ML model on top of it to enhance image quality, hence the cost being only a few percent higher than regular TAA.
It would be more fair to say "this version of PSSR" instead of "this chip", since we can assume PSSR will continue to evolve much like DLSS, even on this chip.

Possibly, but based on this clip at least, DLSS is a cut above. It's specifically here, however. In other games, PSSR seemed really good.
But that 500 TOPS in terms of game use is theoretical, AFAIK. The minute the card needs to do game rendering and RT, that TOPS number becomes far, far smaller and less efficient, as I understood it from the split of SMT to a slower bus setup for RT - whereas the Ragnarok AI/ML solution looks completely asynchronously integrated with gaming workloads on RDNA.

3080 has around 500 INT8 TOPS. The Pro, based on RDNA4, is utilising sparsity for that TOPS figure. The Pro is close to a 3060Ti in terms of TOPS.
It would be exactly double, so 476 TOPS using sparsity, which was only introduced with Ampere. The PS5 Pro also uses sparsity?

3080 has around 500 INT8 TOPS. The Pro, based on RDNA4, is utilising sparsity for that TOPS figure. The Pro is close to a 3060Ti in terms of TOPS.
Yes.
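The doubling being debated is just the 2:4 structured-sparsity multiplier on spec sheets. A quick sanity check on the figures quoted in the posts above (marketing numbers, not measured throughput):

```python
# 2:4 structured sparsity lets the hardware skip half the multiplies,
# so vendors quote double the dense TOPS figure on paper.
def sparse_tops(dense_tops):
    return dense_tops * 2

rtx_3080_dense_int8 = 238   # dense INT8 TOPS implied by the "exactly double" post
ps5_pro_sparse_int8 = 300   # quoted PS5 Pro figure, which includes sparsity

print(sparse_tops(rtx_3080_dense_int8))   # 476 -> the "exactly double" number
print(ps5_pro_sparse_int8 / 2)            # 150 -> the Pro's dense-equivalent TOPS
```

Which is why comparing the Pro's 300 (sparse) against Turing-era dense figures overstates the gap: apples to apples, it's 150 dense TOPS versus whatever the other card does without sparsity.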
You'd be surprised how simple it is to improve an algorithm when you know how it works and you're specifically designing the underlying hardware to advance it.
Lay people on NeoGAF think this shit is rocket science. It's surprisingly simple when you understand the math behind it.