
Graphical Fidelity I Expect This Gen

Edder1

Member
I beg to differ. You need path-tracing for a good approximation of light sources, especially with HDR environment textures that need to be evaluated across a differential area. If you only ray-trace, then you are using standard light loops with one ray cast. But that's getting away from the initial argument.

Firing any ray is going to require memory and bandwidth - which the consoles and PC GPUs don't have at the moment. Games will require an implementation like Nvidia's Marbles demo (which looks better than that Matrix demo) in order to look on par with the CG of today. CG from 10 years ago isn't saying much, really.
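For reference, here is a minimal sketch of the distinction the quote draws between "evaluating an HDR environment across a differential area" and "a standard light loop with one ray cast". It is purely illustrative: the env_radiance function is a made-up stand-in for a real HDR environment map, and no actual renderer works exactly like this.

```python
# Illustrative only: integrate a (fake) HDR environment over the hemisphere
# vs. firing a single ray toward the dominant light direction.
import math, random

def env_radiance(direction):
    # Made-up stand-in for an HDR environment map: a very bright "sun" patch
    # near straight up, plus a dim uniform sky.
    _, y, _ = direction
    return (50.0 if y > 0.95 else 0.0) + 0.2

def cosine_weighted_dir():
    # Cosine-weighted sample on the hemisphere around the normal (0, 1, 0).
    u1, u2 = random.random(), random.random()
    r, phi = math.sqrt(u1), 2.0 * math.pi * u2
    return (r * math.cos(phi), math.sqrt(1.0 - u1), r * math.sin(phi))

def irradiance_monte_carlo(n_samples=4096):
    # E = integral of L(w) * cos(theta) dw; with cosine-weighted samples the
    # estimator reduces to pi * average(L).
    total = sum(env_radiance(cosine_weighted_dir()) for _ in range(n_samples))
    return math.pi * total / n_samples

def irradiance_single_cast():
    # One ray toward the brightest direction, scaled the same way for comparison:
    # roughly what a one-cast light loop amounts to, and it wildly overestimates
    # the contribution of a small bright region.
    return math.pi * env_radiance((0.0, 1.0, 0.0))

if __name__ == "__main__":
    print("hemisphere integral:", round(irradiance_monte_carlo(), 1))
    print("single cast        :", round(irradiance_single_cast(), 1))
```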
Disagree. Movies like Toy Story 3 and Rango among many others don't use path tracing and they look absolutely stellar even today. Path tracing is better of course, but RT GI does a good enough job.
 

VFXVeteran

Banned
Disagree. Movies like Toy Story 3 and Rango among many others don't use path tracing and they look absolutely stellar even today. Path tracing is better of course, but RT GI does a good enough job.
You are boasting about RT GI but have only Metro to represent the consoles. It's such a crude approximation, with so few samples taken, that its benefit is largely negated. Comparing an incredibly well-lit movie like Rango to any of these games is actually an insult. That Matrix demo doesn't come anywhere near the quality of Rango.

You have to keep in mind all the things that go into offline rendering that are just not happening on any future console hardware in the next 20 years.

Significant rendering of curve primitives with vertex counts of 20 or more (you have to evaluate the light loop for every vertex).
You need to take multiple casts to a single light source for soft shadows (see the sketch after this list).
You absolutely need a consistent framebuffer (FB) resolution throughout the entire pipeline.
FX is simply way, way off.
Geometry detail needs to be significantly increased ALONG WITH normal maps on top of the geo.
Textures need to be 8K, and you need one for each part of a person: for example, an 8K texture for the head, one for the chest, one for the legs, etc.
Transparency is a big problem in gaming, and it has to be solved if you want good FX such as water.
Materials need to be more robust than just the bare-minimum evaluations. Hair shaders are missing at least two specular lobes needed to be convincing.
Layered textures and materials on a single asset are prohibitive for realtime GPUs; the bandwidth just isn't there.
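As a rough illustration of the soft-shadow point, here is a minimal sketch (a toy scene with made-up geometry, not code from any actual renderer): a single cast to the light centre gives a binary hard shadow, while many casts distributed over an area light give the fractional visibility that forms a penumbra.

```python
# Toy soft-shadow illustration: 1 shadow cast vs. many casts over an area light.
import math, random

def ray_hits_sphere(origin, direction, center=(0.0, 5.0, 0.0), radius=1.0):
    # Standard ray-sphere test: True if the shadow ray is blocked by the occluder.
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c
    return disc >= 0.0 and (-b - math.sqrt(disc)) > 1e-4

def normalize(v):
    length = math.sqrt(sum(x * x for x in v))
    return tuple(x / length for x in v)

def visibility(point, light_center=(1.2, 10.0, 0.0), light_radius=2.0, casts=1):
    # casts=1 -> one ray to the light centre (hard 0-or-1 shadow).
    # casts=N -> N rays to random points on a disk light (fractional penumbra).
    unblocked = 0
    for _ in range(casts):
        if casts == 1:
            target = light_center
        else:
            r = light_radius * math.sqrt(random.random())
            phi = 2.0 * math.pi * random.random()
            target = (light_center[0] + r * math.cos(phi), light_center[1],
                      light_center[2] + r * math.sin(phi))
        d = normalize(tuple(target[i] - point[i] for i in range(3)))
        unblocked += 0 if ray_hits_sphere(point, d) else 1
    return unblocked / casts

if __name__ == "__main__":
    p = (0.3, 0.0, 0.0)  # a shading point sitting in the penumbra region
    print("single cast:", visibility(p, casts=1))    # 0.0 or 1.0, no penumbra
    print("256 casts  :", visibility(p, casts=256))  # fractional, soft edge
```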

I could go on and on, but the gist is that consoles/GPUs just don't evolve fast enough, or with enough power, to approach the offline rendering of today. And if it ever happens, it will happen with PC GPUs before consoles.
 

Edder1

Member
aaaaand we're back to square one...

You've understood literally nothing I've said.
Except you keep repeating what I said and try to use it against me. You keep saying "extremely limited scope" when I already said from the start "in limited scenes".

If you think next gen consoles won't do what current gen consoles can already do under limitations, then you're fooling yourself. One just has to look at the original Last of Us reveal on PS3, which was rendered offline (not even real time); the PS4 sequel looked a whole generation better than that in real time. But like I said, you can pretend like there won't be a significant technological leap anymore.
 

Edder1

Member
You are boasting about RT GI but have only Metro to represent the consoles. It's such a crude approximation, with so few samples taken, that its benefit is largely negated. Comparing an incredibly well-lit movie like Rango to any of these games is actually an insult. That Matrix demo doesn't come anywhere near the quality of Rango.

You have to keep in mind all the things that go into offline rendering that are just not happening on any future console hardware in the next 20 years.

Significant rendering of curve primitives with vertex counts of 20 or more (you have to evaluate the light loop for every vertex).
You need to take multiple casts to a single light source for soft shadows.
You absolutely need a consistent framebuffer (FB) resolution throughout the entire pipeline.
FX is simply way, way off.
Geometry detail needs to be significantly increased ALONG WITH normal maps on top of the geo.
Textures need to be 8K, and you need one for each part of a person: for example, an 8K texture for the head, one for the chest, one for the legs, etc.
Transparency is a big problem in gaming, and it has to be solved if you want good FX such as water.
Materials need to be more robust than just the bare-minimum evaluations. Hair shaders are missing at least two specular lobes needed to be convincing.
Layered textures and materials on a single asset are prohibitive for realtime GPUs; the bandwidth just isn't there.
In gaming, only Metro, but when it comes to CGI there have been dozens of examples that used RT GI before path tracing was adopted, and they still look awesome today. You're making it sound like CGI cannot look respectable without path tracing, while there are tons of examples to prove otherwise.
 

Haggard

Banned
Except you keep repeating what I said and try to use it against me. You keep saying "extremely limited scope" when I already said from the start "in limited scenes".

If you think next gen consoles won't do what current gen consoles can already do under limitations, then you're fooling yourself. One just has to look at the original Last of Us reveal on PS3, which was rendered offline (not even real time); the PS4 sequel looked a whole generation better than that in real time. But like I said, you can pretend like there won't be a significant technological leap anymore.
yep, you really didn't understand a thing....
all you do is look at a pretty picture and say "yep, I'm getting that next gen" without the slightest understanding of what you're looking at.
 

Hunnybun

Banned
I don't know about that. The first UE5 demo and the Valley of the Ancient demo feature a lot of cliffs and rocks, and they look a gen ahead of Horizon's cliffs and rocks. Look at the draw distance and asset quality vs what sacrifices GG had to make. So much fog. So blurry. And this is from a native 4K trailer.



It currently cannot do foliage but they said it will eventually be supported.

I think you must've misread my post. I was saying that those environments are exactly what Unreal 5 is suited for.
 

Edder1

Member
The Matrix demo is far from photorealism.
Lol, by that logic the real-life shots from the movie that were recreated on current gen consoles were not photorealistic enough. If you followed the discussion then you'd know I was talking about a couple of recreated shots from the movie and not the whole demo.
 

Haggard

Banned
And you don't know a thing.
The usual rebuttal of people who don't know shit and refuse to learn because that would put them out of their comfort zone.
Just stay ignorant as that seems to be your target anyways.

If you wish to educate yourself, a good start would be Epic's own engine deep dives.
 

Haggard

Banned
Says someone who couldn't prove a thing so he started insulting others. How utterly pathetic.
Jesus....you are too dumb to realise just how dumb you are. You haven't understood a single sentence of those last 2 pages. Well done going from "better than CGI" to "better than 10 year old CGI" btw. Really shows your deep understanding and discussion capabilities.....

Oh well, we're actually going into insult territory now. Ignored.
 

Edder1

Member
Jesus....you are too dumb to realise just how dumb you are. You haven't understood a single sentence of those last 2 pages.

Oh well, we're actually going into insult territory now. Ignored.
More insults from a clown like you. Should have known that you're an idiot the moment you started using emojis when you couldn't counter argue.
 

Lethal01

Member
Lol, by that logic the real-life shots from the movie that were recreated on current gen consoles were not photorealistic enough. If you followed the discussion then you'd know I was talking about a couple of recreated shots from the movie and not the whole demo.
Yes, that's what I was referring to; the recreated shots are all clearly fake-looking in comparison to reality.
 

Edder1

Member
Yes, that's what I was referring to; the recreated shots are all clearly fake-looking in comparison to reality.
Show them to someone who doesn't spend all day on internet forums and they won't be able to tell that it's not from a movie. Side by side there is a difference, but that difference is negligible, and it's good enough to fool an average viewer.
 

VFXVeteran

Banned
In gaming, only Metro, but when it comes to CGI there have been dozens of examples that used RT GI before path tracing was adopted, and they still look awesome today. You're making it sound like CGI cannot look respectable without path tracing, while there are tons of examples to prove otherwise.
I'm not saying that at all. I'm saying that even RT at the level of CGI requires enormous bandwidth, which no GPU has today. And no, the PS6 won't be close either. The consoles lag behind the PC GPUs by a significant margin, so we should be looking at GPUs from Nvidia/AMD instead of consoles. I'm looking for VRAM of at least 64 GB and bandwidth plentiful enough that these games can be run easily at native 4K/60 FPS with all the options turned up high and no reconstruction techniques. We won't be there with the next iteration of the consoles for sure. We'd be lucky to get a PS6 with the capability of a 3090 today. And a 3090 (with its bandwidth) is way off from CGI, requiring DLSS to even run at a reasonable FPS.
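To give a rough sense of the bandwidth being talked about, here is a back-of-the-envelope sketch. Every figure in it is an assumption chosen for illustration (rays per pixel, bytes touched per ray), not a measurement from any game or renderer.

```python
# Hypothetical numbers, purely illustrative: raw ray-traversal traffic at native 4K/60.
pixels = 3840 * 2160       # native 4K frame
rays_per_pixel = 4         # assumed modest budget; film renderers use far more
fps = 60
bytes_per_ray = 1024       # assumed BVH + geometry + texture data touched per ray

rays_per_second = pixels * rays_per_pixel * fps
traffic_gb_per_s = rays_per_second * bytes_per_ray / 1e9

print(f"{rays_per_second / 1e9:.1f} billion rays per second")   # about 2.0
print(f"roughly {traffic_gb_per_s:.0f} GB/s of raw traversal traffic, before caching")
```

Even with that modest ray budget, the raw figure lands around 2 TB/s, well above the roughly 936 GB/s of memory bandwidth a 3090 has; caching and reduced sample counts are what make current real-time RT workable at all.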
 

SlimySnake

Flashless at the Golden Globes
Jesus....you are too dumb to realise just how dumb you are. You haven't understood a single sentence of those last 2 pages. Well done going from "better than CGI" to "better than 10 year old CGI" btw. Really shows your deep understanding and discussion capabilities.....

Oh well, we're actually going into insult territory now. Ignored.
More insults from a clown like you. Should have known that you're an idiot the moment you started using emojis when you couldn't counter argue.
This is not a good look for either of you. Take a deep breath and let's move on.

You just might find other things to agree on. No point in burning bridges over a difference of opinion.
 

Edder1

Member
I'm not saying that at all. I'm saying that even RT at the level of CGI requires enormous bandwidth, which no GPU has today. And no, the PS6 won't be close either.
This is really hard to believe when a 4 TFLOP Series S with paltry bandwidth can do RT GI in Metro Exodus at 60 fps.

Maybe RT GI is hard to do as things stand now, but once the whole framework shifts to RT rendering then hardware should be much better optimised for it. Don't forget, next gen consoles should also have way beefier RT hardware.
 

Haggard

Banned
No point in burning bridges over a difference of opinion.
No point investing any time in pointless internet discussions either (even though I forget that time and time again). Extensive use of Ignore lists is a good thing to do.
 

SlimySnake

Flashless at the Golden Globes
No point investing any time in pointless internet discussions either (even though I forget that time and time again). Extensive use of Ignore lists is a good thing to do.
Yeah, I use the ignore function a lot. But I have found that a guy I'm sharing insults with one day might actually not be the worst person in the world on a different day.

It's hard to remember that when you are angry and have resorted to calling each other pieces of shit lol, but I just wanted to offer an outsider's perspective and cool things down a bit.

It's all good.
 

VFXVeteran

Banned
This is really hard to believe when a 4 TFLOP Series S with paltry bandwidth can do RT GI in Metro Exodus at 60 fps.

Maybe RT GI is hard to do now, but once the whole framework shifts to RT rendering then hardware should be much better optimised for it.
You think that's hard to believe because you don't know what's going on under the hood.

I can easily cast a single reflection ray, go through the intersection tests and get a color from an object - put it in a framebuffer and blur it so I can boast about having blurred reflections. But that's not how film does it. In film, I would cast several rays spread according to the specular lobe that I'm mimicking, then take those multiple rays and apply an AA algorithm to the result to prevent noise. That costs significantly more compute power but looks way better and more accurate.
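Here is a minimal sketch of that difference, using toy, assumed numbers (the scene_radiance stripe and the Phong-style lobe are made up for illustration; this is not code from any engine or film renderer): one mirror ray versus many rays importance-sampled around the specular lobe and averaged.

```python
# Toy illustration: single mirror ray vs. rays importance-sampled from a cos^n lobe.
import math, random

def normalize(v):
    length = math.sqrt(sum(x * x for x in v))
    return tuple(x / length for x in v)

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def scene_radiance(direction):
    # Made-up stand-in for "what the reflection ray sees": a thin bright stripe.
    return 10.0 if abs(direction[0]) < 0.05 else 0.1

def sample_phong_lobe(center_dir, exponent=200.0):
    # Importance-sample a direction around center_dir with a cos^n (Phong) lobe.
    u1, u2 = random.random(), random.random()
    cos_a = u1 ** (1.0 / (exponent + 1.0))
    sin_a = math.sqrt(max(0.0, 1.0 - cos_a * cos_a))
    phi = 2.0 * math.pi * u2
    w = center_dir
    a = (0.0, 1.0, 0.0) if abs(w[0]) > 0.9 else (1.0, 0.0, 0.0)
    u = normalize(cross(a, w))
    v = cross(w, u)
    return tuple(u[i] * math.cos(phi) * sin_a +
                 v[i] * math.sin(phi) * sin_a +
                 w[i] * cos_a for i in range(3))

if __name__ == "__main__":
    reflect_dir = normalize((0.1, 0.0, 1.0))  # mirror direction at this pixel

    # "Game style": one cast along the mirror direction, blur applied later.
    single = scene_radiance(reflect_dir)

    # "Film style": many casts spread over the lobe, averaged (then denoised/AA'd).
    n = 4096
    lobe_avg = sum(scene_radiance(sample_phong_lobe(reflect_dir)) for _ in range(n)) / n

    print("single mirror ray:", single)              # misses the bright stripe entirely
    print("lobe-sampled avg :", round(lobe_avg, 2))  # picks it up, at n times the cost
```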

My point is that the RT GI used by Metro for the lower end consoles is a bare minimum evaluation and simply will never look good on the scale of a CGI movie.
 

Edder1

Member
You think that's hard to believe because you don't know what's going on under the hood.

I can easily cast a single reflection ray, go through the intersection tests and get a color from an object - put it in a framebuffer and blur it so I can boast about having blurred reflections. But that's not how film does it. In film, I would cast several rays spread according to the specular lobe that I'm mimicking, then take those multiple rays and apply an AA algorithm to the result to prevent noise. That costs significantly more compute power but looks way better and more accurate.

My point is that the RT GI used by Metro for the lower end consoles is a bare minimum evaluation and simply will never look good on the scale of a CGI movie.
But we're also talking about RT hardware a whole generation from now; you can't look at RT in the current GPU market and use that as a measure for how things will be 6-7 years from now. RT hardware should significantly improve by the time next gen consoles arrive. Heck, even first gen RT hardware from AMD that's a whole generation behind Nvidia is able to do RT GI in a AAA game. It's hard to believe that in 2028 consoles won't be able to do what Pixar did back in 2007 (an RT GI implementation).
 

VFXVeteran

Banned
But we're also talking about RT a whole generation from now; you can't look at RT in the current GPU market and use that as a measure for how things will be 6-7 years from now. RT hardware should significantly improve by the time next gen consoles arrive. Heck, even first gen RT hardware from AMD that's a whole generation behind Nvidia is able to do RT GI in a AAA game. It's also hard to believe that in 2028 consoles won't be able to do what Pixar did back in 2007 (an RT GI implementation).
Two years ago, when I told people that my sources said the PS5 would be in the 2080 tier, everyone barked at me and trolled, until the results were in and my source was 100% accurate. These machines don't jump significantly from generation to generation. It's better to look at the latest graphics card to see where the trajectory for tech will be moving forward.

I don't know what to tell you other than that there are a lot of reasons why consoles or PCs doing what CG does (even the CG of 10 years ago) is not possible. Epic is tackling part of the issue with geometry and textures in UE5, but there are so many other elements that are important, like lighting. Even Nanite can't handle deforming meshes, and that is key to having CG visuals.

We'll just have to disagree and I can sit back and wait over the years to see if what I say is true or not.
 

Edder1

Member
This is not a good look for either of you. Take a deep breath and let's move on.

You just might find other things to agree on. No point in burning bridges over a difference of opinion.
I have no problem with anyone who disagrees with me, but the moment a person turns to insults is where I draw the line. If a person doesn't wanna be respectful then they don't deserve respect.
 

Edder1

Member
Two years ago, when I told people that my sources said the PS5 would be in the 2080 tier, everyone barked at me and trolled, until the results were in and my source was 100% accurate. These machines don't jump significantly from generation to generation. It's better to look at the latest graphics card to see where the trajectory for tech will be moving forward.

I don't know what to tell you other than that there are a lot of reasons why consoles or PCs doing what CG does (even the CG of 10 years ago) is not possible. Epic is tackling part of the issue with geometry and textures in UE5, but there are so many other elements that are important, like lighting. Even Nanite can't handle deforming meshes, and that is key to having CG visuals.

We'll just have to disagree and I can sit back and wait over the years to see if what I say is true or not.
I think 2080 level performance is a huge jump from last gen consoles.

I think we can all agree that the PS5 and Series X can easily better any CGI movie/animation done 20 years before them, which is why it's hard to believe that next gen consoles (PS6/Xbox5) won't be able to do what CGI did 20 years before them. If anything, the gap is closing, as CGI is at diminishing returns and console hardware still has a ways to go before it gets there.
 

Edder1

Member
VFXVeteran, forgot to mention this. You said next gen RT hardware in consoles won't be powerful enough to do RT GI in games, but 4A Games already announced that their current gen project will be moving to RT GI only. If 4A can do RT GI in a native PS5/SX game on poor first gen RT hardware from AMD, then I don't see why RT GI should not become common on PS6/Xbox5, which should have a way, way beefier RT solution. I think an engine being built around RT makes a huge difference, as is the case with 4A Games.
 

VFXVeteran

Banned
VFXVeteran, forgot to mention this. You said next gen RT hardware in consoles won't be powerful enough to do RT GI in games, but 4A Games already announced that their current gen project will be moving to RT GI only. If 4A can do RT GI in a native PS5/SX game on poor first gen RT hardware from AMD, then I don't see why RT GI should not become common on PS6/Xbox5, which should have a way, way beefier RT solution. I think an engine being built around RT makes a huge difference, as is the case with 4A Games.
When evaluating what a console will do in the future, you have nowhere to look other than the PC. What does a PC 3090 do with RT GI? Their solution is good, but it's a crude one, and they don't do specular RT GI - only diffuse. There are just so many things in the rendering pipeline to go over that it's really a waste of time to cover them in detail here. Yes, we will get RT in games, but the quality is subpar compared to film, and there are other things that need to be implemented that aren't in games yet.
 

Edder1

Member
When evaluating what a console will do in the future, you have nowhere to look other than the PC. What does a PC 3090 do with RT GI? Their solution is good, but it's a crude one, and they don't do specular RT GI - only diffuse. There are just so many things in the rendering pipeline to go over that it's really a waste of time to cover them in detail here. Yes, we will get RT in games, but the quality is subpar compared to film, and there are other things that need to be implemented that aren't in games yet.
Let's see what RT solutions look like 5-6 years from now when next gen hardware is around the corner; I have a feeling the current competition between Nvidia and AMD will push for significant gains in that regard. Intel is also joining the GPU market, so we'll probably get a lot more innovation than we did last gen when Nvidia was unchallenged. I will be surprised if we don't get decent gains in RT as soon as this fall, when new GPUs arrive.
 

Kenpachii

Member
We have no clue what the PS7 will be about at this point. Tech is moving forward at lightning pace; RT wasn't even a thing in 2013-2014, but it became a thing in 2018.

It could very well be that 4 years from now, when we are three GPU generations further along, we will have far, far better hardware that deals with RT or even goes beyond it, especially with AI enhancements. We could also see massive stagnation, but with the competition from Intel and now AMD I doubt that's going to happen.

I honestly think AI will change the game scene entirely in next gen consoles/PCs, where the demands to get things going will be far lower.
 

Edder1

Member
We have no clue what the PS7 will be about at this point. Tech is moving forward at lightning pace; RT wasn't even a thing in 2013-2014, but it became a thing in 2018.

It could very well be that 4 years from now, when we are three GPU generations further along, we will have far, far better hardware that deals with RT or even goes beyond it, especially with AI enhancements. We could also see massive stagnation, but with the competition from Intel and now AMD I doubt that's going to happen.

I honestly think AI will change the game scene entirely in next gen consoles/PCs, where the demands to get things going will be far lower.
Yes, stagnation is highly unlikely as the tech is still in its infancy; plus, as you mentioned, competition in the GPU market has never been as strong as it's about to get, at least not in the last 10 years.

There's also the case of Nvidia openly claiming that they expect rendering to fully move to path tracing by 2035, when PS7/Xbox6 launch (if consoles are still a thing). This means there have to be regular, significant gains in RT if we're gonna get there within two console generations. If path tracing is expected in two console generations, then RT GI should be a standard on next gen consoles (PS6/Xbox5).

 
You keep saying that when we have actual realtime footage showing otherwise.



We are there in cutscenes already. We are very close in gameplay. On PC, we will have better ray tracing, better resolution, better framerates and better fidelity than the sub-30 fps 1080p we got on consoles.

Saying we are planets away makes no sense when we are getting realtime graphics like this.
Bingo!
 

PUNKem733

Member
I'm not saying that at all. I'm saying that even RT at the level of CGI requires enormous bandwidth, which no GPU has today. And no, the PS6 won't be close either. The consoles lag behind the PC GPUs by a significant margin, so we should be looking at GPUs from Nvidia/AMD instead of consoles. I'm looking for VRAM of at least 64 GB and bandwidth plentiful enough that these games can be run easily at native 4K/60 FPS with all the options turned up high and no reconstruction techniques. We won't be there with the next iteration of the consoles for sure. We'd be lucky to get a PS6 with the capability of a 3090 today. And a 3090 (with its bandwidth) is way off from CGI, requiring DLSS to even run at a reasonable FPS.
I'm sorry, but thinking the PS6 in like 2028 or 2029 will only reach 3090 level is ridiculous and hilarious. The PS6 should be 60-80 TF at least, since PC GPUs over 100 TF are coming within the next year.
 

01011001

Banned
I'm sorry, but thinking the PS6 in like 2028 or 2029 will only reach 3090 level is ridiculous and hilarious. The PS6 should be 60-80 TF at least, since PC GPUs over 100 TF are coming within the next year.

I mean, take away the RT hardware and the PS5 and Series X are barely above a GTX 1070... maybe at around or slightly above 1070 Ti level thanks to low level APIs. Those GPUs launched in 2016 and 2017 respectively, so 4 to 5 years before the current gen systems.

Now imagine the 3090 equivalent of that time, the Titan X or Titan XP... yeah... so that Titan X, which launched 5 years before the current consoles, has more GPU grunt than either of them.





Edit: here is a Titan XP running Metro Exodus with ray tracing set to Ultra, even though it has no RT hardware...
 

PUNKem733

Member
I mean, take away the RT hardware and the PS5 and Series X are barely above a GTX 1070... maybe at around or slightly above 1070 Ti level thanks to low level APIs. Those GPUs launched in 2016 and 2017 respectively, so 4 to 5 years before the current gen systems.

Now imagine the 3090 equivalent of that time, the Titan X or Titan XP... yeah... so that Titan X, which launched 5 years before the current consoles, has more GPU grunt than either of them.





Edit: here is a Titan XP running Metro Exodus with ray tracing set to Ultra, even though it has no RT hardware...

OK, this is the first I've heard that RDNA2 without RT is around 1070 level. Sounds pretty ridiculous.
 

01011001

Banned
OK, this is the first I've heard that RDNA2 without RT is around 1070 level. Sounds pretty ridiculous.

RDNA2 is not a GPU... RDNA2 is an architecture, my dude.
And yes, the RTX 3050 is at an even lower level! CRAZY, I KNOW!

And it depends. Some games legit run better on a 1070 than on PS5, especially if you have one of the many partner cards that run at up to 2 GHz core clocks.

In the long run the consoles will leave these cards in the dust.
But not the Titan XP, which is also a 10 series Nvidia card (the RTX 3090 of its time, basically) that is more than 5 years old and can run ray tracing in games at surprisingly high performance levels even though it has no RT acceleration hardware.
So that comment here is what is ridiculous 👇 as it is absolutely possible that the PS6 GPU will not be more powerful than a 3090. Even if it is better, it will not be much better, and it will be comparable in many games for sure.
I'm sorry, but thinking the PS6 in like 2028 or 2029 will only reach 3090 level is ridiculous and hilarious

I couldn't find any benchmarks on YouTube, but I would even bet that the Titan XP could run Control at almost PS5-level RT settings and performance.
 
In addition to that: if someone posted a picture of that without the context, he would actually think it's a pic from the movie.
He won't go, "Oh, that's a game render." It's funny how biased we are when we know the context of something that challenges our world view. Reminds me of those blind taste tests where the cheapest wine/drink is actually picked over the stupidly expensive ones. Or the fake luxury store with cheap products, with people lining up to get in and praising the products as they leave after unknowingly spending $5k on something that cost $50. Lmao. It's crazy the tricks our mind plays on us.

If Lethal01 were shown that pic in passing, he would 100% believe it was from the film. Not only would he believe that, he would also believe it's realistic if someone told him it was CGI generated by 1,000 supercomputers. But the minute someone says it's running real-time on an econobox....all of a sudden it's "not even close to being realistic".

The funny thing is that the character in that CGI Horizon video doesn't even look better than the real-time Hellblade 2 reveal character, which wasn't CGI.







All of this is correct.
 

01011001

Banned
Anyone thinking the entire render farms and techniques used for movies are going to fit into a single at-home console running a real-time game is kidding themselves. There are lots of tricks to lessen said gap, but you'll always pick the difference.

so you are saying multiple rays per pixel + per frame for the whole screen is unrealistic?! PFFT! pessimist!
 
so you are saying multiple rays per pixel + per frame for the whole screen is unrealistic?! PFFT! pessimist!
They still need to use techniques such as DLSS, etc.; there are always lighting shortcuts, and there's the overhead of running a real-time game simulation vs what they can fake when making a movie. I do think it's getting closer, but I've yet to see anything from the top-tier games rival the top-tier movies in terms of detail, realism, etc. Lighting is a massive one, and it's very performance-hungry or trick-based when talking about games. The improvements from AI upscaling are profound and may close some of the performance gap between movies and games, but I doubt games will surpass movies.

Seeing productions using UE, such as The Mandalorian, is likely to really push what games can do in the coming years through shared techniques and iteration. However, anything gained can also be used by the movie industry, along with larger budgets for effects, manpower and offline processing.

Personally, I'm far more interested in raising the bar on quality while driving down the manpower required to deliver such results. I'm very curious what devs, studios and indies will achieve over, say, the next 1-10 years. Another golden age is likely upon us.

One of the big areas with a huge processing cost is VR; there is nothing in VR that comes close to the visual quality of movies, and it's a marked decrease even from current gen top-tier games. Sure, the experience is wildly unique and immersive, but the gains from closing the gap between movies and games, combined with the added processing power of tech/GPUs/consoles, will likely drive the realism of VR way up over the same period. It's the old adage of "most improved" versus the "lesser difference" of the already consistent top performers.
 

01011001

Banned
How everyone sees the Matrix Awakens demo:

That is more accurate than you intended it to be, I think, since Matrix Awakens, while impressive, still has many flaws, including the super low resolution that results in really bad artifacting from the upsampling while traversing the open city.

DLSS would be a godsend in that demo.
 

Edder1

Member
I mean, take away the RT hardware and the PS5 and Series X are barely above a GTX 1070...
Either you're trolling or making a very ignorant comment. It's been proven time and time again through comparisons by the likes of NX Gamer and DF that current gen consoles are at 2080 (PS5) and 2080S (SX) levels of performance without RT. Even the RDNA1 RX 5700 XT craps all over a 1070, and that's quite a bit weaker than what the consoles got.
 

01011001

Banned
Either you're trolling or making a very ignorant comment. It's been proven time and time again through comparisons by the likes of NX Gamer and DF that current gen consoles are at 2080 (PS5) and 2080S (SX) levels of performance without RT. Even the RDNA1 RX 5700 XT craps all over a 1070, and that's quite a bit weaker than what the consoles got.

A 2080 is also just above a 1070. The jump from the 10 to the 20 series cards was famously disappointing at the time; the focus of the upgrade back then was the RT and Tensor cores.

The fact still stands that there are games that run barely better than on a 1070. You can of course excuse this with "bad console ports", but it's still true.

And the main argument was about the 3090. There is a good chance that the PS6 will have a GPU on the level of a 3090; that is pretty realistic given that the current consoles fall behind the flagship of the 10 series cards from 5 years ago, the Titan XP, if you don't use ray tracing. And even with ray tracing, the difference is smaller than one would think given the missing RT cores.
 

Edder1

Member
A 2080 is also just above a 1070. The jump from the 10 to the 20 series cards was famously disappointing at the time.
Lol, the RTX 2080 is 37-40% (depending on the benchmark) more powerful than a 1070. This is without taking into account the architectural improvements in RDNA2 compared to Pascal.
 

01011001

Banned
Lol, the RTX 2080 is 37-40% (depending on the benchmark) more powerful than a 1070. This is without taking into account the architectural improvements in RDNA2 compared to Pascal.

And most games so far do not perform like a 2080 at all on current consoles.

We have Control and Guardians of the Galaxy that, as I said, barely run better than on a 1070.

Control on PS5 is 1440p/60 on low settings. That is between 1070 and 1080 levels of performance.

Guardians of the Galaxy is 1080p/60 with a mix of medium and high settings, and that also is very close to how a 1070 runs the game.

Dying Light 2 at 1080p medium is very close to PS5 performance on an overclocked 1070 or a 1070 Ti at stock clocks: not quite PS5 levels of performance, but not far off either.

There are games that favour the RDNA2 GPUs in the consoles, but some fall short and are not as impressive. And like I said, as time goes on and games get less and less optimised for older cards this gap will widen, but right now it's not far off.
 