
[Digital Foundry] Are Current Generation Graphics A Bit Of A Let Down?

viveks86

Member
I'm a pessimist by nature, so I'm not gonna believe it until I can play it on my OLED.

The game looks too far beyond what we saw on these consoles.

Also I'm pretty sure that Rockstar already had to deal with downgrades on one of their games before; can't remember if it was GTA 4 or 5.

Like c'mon, this scene in the trailer looked like real life:
[trailer screenshot]

Or this scene:

[trailer gif]
Downgrades typically happen when the game is announced too early and they have to scale things back because they either got way too ambitious with the trailer and couldn't keep up with the in-engine target or the reveal was some BS target render that didn't actually exist. The trailer came after several years of development, so I'm not as skeptical. Rockstar hasn't fallen into this trap so far the way Watch Dogs or The Witcher 3 did. Worst case scenario would be some minor cutbacks, like TLOU 2. It will be pretty damn close. For RDR 2, I believe it even got better.
 

GymWolf

Member
Downgrades typically happen when the game is announced too early and they have to scale things back because they either got way too ambitious with the trailer and couldn't keep up with the in-engine target or the reveal was some BS target render that didn't actually exist. The trailer came after several years of development, so I'm not as skeptical. Rockstar hasn't fallen into this trap so far the way Watch Dogs or The Witcher 3 did. Worst case scenario would be some minor cutbacks, like TLOU 2. It will be pretty damn close. For RDR 2, I believe it even got better.
TLOU2 had a huge downgrade compared to the reveal trailer; if it's the same level of downgrade, we are fucked.

Worse enemy models
Worse animations
Worse AI
Worse gore effects
Worse lighting system
Etc.

I think someone did an analysis; ND always does a bullshit reveal and then downgrades: TLOU1, U4 and TLOU2. Intergalactic doesn't look anything special compared to the best we've seen, so there is a chance it's gonna end up like that.


I still remember this absolute bullshit


 

rm082e

Member
Has it changed much since the PS4 era outside of resolution and frame rate? No. Am I disappointed by that? Nope.

The biggest disappointment for me last generation was seeing more of a focus on higher resolution with the Pro and not frame rate. I spent too much time playing games that ran below 30fps. As someone who doesn't want to play any games running at less than 60, the PS5 generation has been a step up.

But I don't see how this is on the console manufacturers when I'm also not seeing anything with crazy new visual fidelity on PC.
 

adamsapple

Or is it just one of Phil's balls in my throat?
I still remember this absolute bullshit



This was still less egregious than announcing Uncharted 4 with a 60fps trailer and saying that's the target


That’s Nathan Drake rendered in full 1080p glory, using the power of our PS4 engine. All footage you see in the trailer was captured completely in engine. We’re targeting 60fps for Uncharted 4: A Thief’s End, and as you can see, the visual fidelity for our character models will reach new heights. In fact, thanks to the power of PS4, right now Drake’s Uncharted 4 model is over double the polygons of Joel from The Last of Us PS3.
 

yogaflame

Member
Ray tracing is good, but it's too demanding for the systems right now, and you need a very expensive PC to get a decent implementation. I hope developers will focus more on a 4K and 60fps target and also on better physics.
 

Zacfoldor

Member
Only to people who want to play MH:W, apparently. The rest of us are absolutely fine. The PC players make fun of us and say that we would eat dogshit if they served it up. Look at MHW, PC players claim it runs better on their PC vs best console(PS5 Pro) yet the user scores for the console game are amazing. The user scores on PC are abysmal. Pro critics agree with the console gamers.

I think when you invest so heavily and submerge yourself so deeply into PC gaming that you can't see the game for the performance, you have lost something that made gaming so special. Back in the day every game seemed to have slowdown. WoW was CPU limited and could not hit 60 in raid for over a decade. The graphics sucked but it was defended for artstyle. PC people were not always like this. I was one.

I think as some get older they naturally begin to dislike gaming but they are afraid of getting older so they get peter pan syndrome and invest heavily in gaming even though they no longer enjoy it and can barely bring themselves to play it. Yet they don't want to lose such a strong tie to their youth so they are gamers in name only. GINO. If you think about it, this HAS to be a thing right? We are just arguing about the number of people it affects.
 

GymWolf

Member
This was still less egregious than announcing Uncharted 4 with a 60fps trailer and saying that's the target

Nah, the U4 one was just a small teaser with Nate waking up on the beach; TLOU2 was a long gameplay reveal's worth of bullshit.

Final Nate in cutscenes is extremely similar to reveal Nate.
AI, enemy models and some other things in TLOU2 took a huge hit from reveal to final.

The point doesn't change: ND are bullshit promisers.
(And I don't even know if "promisers" is an actual word or not)
 

kevboard

Member
Only to people who want to play MH:W, apparently. The rest of us are absolutely fine. The PC players make fun of us and say that we would eat dogshit if they served it up. Look at MHW, PC players claim it runs better on their PC vs best console(PS5 Pro) yet the user scores for the console game are amazing. The user scores on PC are abysmal. Pro critics agree with the console gamers.

all this shows is that console players have lower standards.


I think when you invest so heavily and submerge yourself so deeply into PC gaming that you can't see the game for the performance, you have lost something that made gaming so special. Back in the day every game seemed to have slowdown. WoW was CPU limited and could not hit 60 in raid for over a decade. The graphics sucked but it was defended for artstyle. PC people were not always like this. I was one.

slow performance 20 years ago isn't comparable to the shit we see today.

back then games that had very hardcore requirements and had framerate issues on high end hardware were ambitious titles that pushed the medium forward.

now the games that run like ass and have ridiculous requirements A: barely look better than games that ran fine on GPUs from 2013, and B: have basically no ambition behind them to push the industry forward in terms of design or scope.


let's take Prey 2017 as an example. that is a game with very high object density. objects that are dynamic, have physics, can be used meaningfully during gameplay for different strategies, and of course normal pickups.
it still looks good today as well thanks to a strong art design and good technology running it.

that game ran at a flawless 1440p 60fps at near max settings without any reconstruction or upscaling on my old PC.

fast forward 7 years and my current PC, which is on paper 2.5x as powerful on the GPU and 2x as powerful on the CPU, can barely run Monster Hunter at upscaled 1440p 60fps, all while the game looks like literal pixel diarrhea in motion and has slowdown while looking around an empty brown and pixelated desert.


same in Silent Hill 2 remake. in order to hit an unstable 30fps on console, that game has to run at 1200p while being small scale, barely interactive and in no way pushes game design in any ambitious direction. instead it uses Lumen as a crutch and shortcut to cheap out on hand authored lighting, at the expense of performance and image quality.

Prey ran at 1440p on a console half as powerful and is more ambitious gameplay and interaction wise.


we have never been in a situation like this before. never have I seen an objective regression in image quality, graphics quality and performance to the extent we see now.
the original Silent Hill 2 on PS2 was ambitious with tons of dynamic shadows that made excellent use of the PS2's hardware features. it was a generational jump in performance, image quality and effects quality.
the remake on the other hand looks worse than games that released on consoles with only ⅒ the GPU grunt...
 

Gaiff

SBI’s Resident Gaslighter
all this shows is that console players have lower standards.




slow performance 20 years ago isn't comparable to the shit we see today.

back then games that had very hardcore requirements and had framerate issues on high end hardware were ambitious titles that pushed the medium forward.

now the games that run like ass and have ridiculous requirements A: barely look better than games that ran fine on GPUs from 2013, and B: have basically no ambition behind them to push the industry forward in terms of design or scope.


let's take Prey 2017 as an example. that is a game with very high object density. objects that are dynamic, have physics, can be used meaningfully during gameplay for different strategies, and of course normal pickups.
it still looks good today as well thanks to a strong art design and good technology running it.

that game ran at a flawless 1440p 60fps at near max settings without any reconstruction or upscaling on my old PC.

fast forward 7 years and my current PC, which is on paper 2.5x as powerful on the GPU and 2x as powerful on the CPU, can barely run Monster Hunter at upscaled 1440p 60fps, all while the game looks like literal pixel diarrhea in motion and has slowdown while looking around an empty brown and pixelated desert.


same in Silent Hill 2 remake. in order to hit an unstable 30fps on console, that game has to run at 1200p while being small scale, barely interactive and in no way pushes game design in any ambitious direction. instead it uses Lumen as a crutch and shortcut to cheap out on hand authored lighting, at the expense of performance and image quality.

Prey ran at 1440p on a console half as powerful and is more ambitious gameplay and interaction wise.


we have never been in a situation like this before. never have I seen an objective regression in image quality, graphics quality and performance to the extent we see now.
the original Silent Hill 2 on PS2 was ambitious with tons of dynamic shadows that made excellent use of the PS2's hardware features. it was a generational jump in performance, image quality and effects quality.
the remake on the other hand looks worse than games that released on consoles with only ⅒ the GPU grunt...
And you took the bait.
 

viveks86

Member
TLOU2 had a huge downgrade; if it's the same level of downgrade, we are fucked.


I still remember this absolute bullshit



That's how characters looked in cutscenes in the final version. For that trailer, they seem to have set the gameplay lighting to match cutscene lighting, where every angle is perfectly lit (as it is a 100% scripted vertical slice), there is no light bleeding on characters, high quality sub surface scattering is on, "hero lights" etc. So yes, in comparison, the AI can look relatively bad at times in gameplay. There were other cutbacks as well, like AO and how the fire or the cars react when you bump against them (which they added back in part 1 for cars). I hope they didn't make the same mistake with the short gameplay snippet for Intergalactic, as it looked phenomenal and I'd be sad if the final game doesn't match that.

I wouldn't call it a huge downgrade though. The game still looks mind-blowing, the cutscenes match or even exceed the trailers at times. Witcher 3 was a huge downgrade, imo. We literally couldn't achieve that look even after adding in ray tracing and maxing out on a PC to date, 10 years later. They swapped out the entire lighting system and vegetation to support dynamic ToD, weather and massive landscapes, so it was like trying to get apple juice out of oranges. Expectations and reality were completely misaligned towards launch with every subsequent trailer cutting back further. You can get pretty close to the trailer now and it looks pretty damn good (playing it and loving it), but it's still not quite there. So many effects like volumetric smoke and fire, high poly meshes, handling of transparencies, distant LoD and high quality DoF are simply not there in the final game.

We haven't seen any final gameplay for GTA outside of dev leaks, but at the very least, I won't expect cutscenes to look much different than the trailer.
 
Absolutely, anyone saying otherwise is blind or hasn't experienced a true generational jump in their lives.
I'm not blind and I've been gaming since '83, and to say the gen hasn't done stuff that is way beyond the last console gen is for the birds.

Sadly it is the number of titles that show off the current gen that's the issue, rather than the tech.
 

viveks86

Member
This was still less egregious than announcing Uncharted 4 with a 60fps trailer and saying that's the target

Agree 100%. This, in my opinion, was where they fell wayyy short of their ambitions. They barely got halfway to their goal between reveal and launch. Whole new hardware and remasters needed to be released before they could approach their initial goal.
 

GymWolf

Member
That's how characters looked in cutscenes in the final version. For that trailer, they seem to have set the gameplay lighting to match cutscene lighting, where every angle is perfectly lit (as it is a 100% scripted vertical slice), there is no light bleeding on characters, high quality sub surface scattering is on, "hero lights" etc. So yes, in comparison, the AI can look relatively bad at times in gameplay. There were other cutbacks as well, like AO and how the fire or the cars react when you bump against them (which they added back in part 1 for cars). I hope they didn't make the same mistake with the short gameplay snippet for Intergalactic, as it looked phenomenal and I'd be sad if the final game doesn't match that.

I wouldn't call it a huge downgrade though. The game still looks mind-blowing, the cutscenes match or even exceed the trailers at times. Witcher 3 was a huge downgrade, imo. We literally couldn't achieve that look even after adding in ray tracing and maxing out on a PC to date, 10 years later. They swapped out the entire lighting system and vegetation to support dynamic ToD, weather and massive landscapes, so it was like trying to get apple juice out of oranges. Expectations and reality were completely misaligned towards launch with every subsequent trailer cutting back further. You can get pretty close to the trailer now and it looks pretty damn good (playing it and loving it), but it's still not quite there. So many effects like volumetric smoke and fire, high poly meshes, handling of transparencies, distant LoD and high quality DoF are simply not there in the final game.

We haven't seen any final gameplay for GTA outside of dev leaks, but at the very least, I won't expect cutscenes to look much different than the trailer.
That gif is not a cutscene; they tried to pass that shit off as gameplay.

The game was severely downgraded in various areas, definitely not the example for Rockstar to follow if our hope is to have a game that looks 1:1 with the reveal down to the smallest details.

A game that looks 90% like the reveal is not a game that looks 100% like the reveal, and usually that missing 10% is the incredible detail that made the trailer special to begin with.
 

tr1p1ex

Member
I wouldn't say letdown. I would say: Doh! Completely expected. The diminishing returns have been happening for a while.

Also look at what has happened in the marketplace. So many big games don't have cutting-edge graphics.
I don't think that was the case 20+ years ago.
 

FoxMcChief

Gold Member
I'm a pessimist by nature, so I'm not gonna believe it until I can play it on my OLED.

The game looks too far beyond what we saw on these consoles.

Also I'm pretty sure that Rockstar already had to deal with downgrades on one of their games before; can't remember if it was GTA 4 or 5.

Like c'mon, this scene in the trailer looked like real life:
[trailer screenshot]

Or this scene:

[trailer gif]
I love that dude's videos.

 

viveks86

Member
That gif is not a cutscene; they tried to pass that shit off as gameplay.
Yeah I know. That's what I'm saying as well. They thought they could achieve cutscene-quality lighting in gameplay and fell short. They still achieved it in cutscenes, which is why I don't consider it as big a letdown.

not the example for Rockstar to follow if our hope is to have a game that looks 1:1 with the reveal down to the smallest details.
We can agree on that. Ideally there should be no downgrades at all, obviously. Anything short of that is false advertising after all. Unfortunately, the nature of game development is such that they don't always hit their goals. We can only hope that they do. Anyway, we will probably know soon enough. Really hope they show something by July at least.
 

GymWolf

Member
Yeah I know. That's what I'm saying as well. They thought they could achieve cutscene-quality lighting in gameplay and fell short. They still achieved it in cutscenes, which is why I don't consider it as big a letdown.


We can agree on that. Ideally there should be no downgrades at all, obviously. Anything short of that is false advertising after all. Unfortunately, the nature of game development is such that they don't always hit their goals. We can only hope that they do. Anyway, we will probably know soon enough. Really hope they show something by July at least.
Showing cutscene models in gameplay and then not delivering is a huge downgrade if you ask me, but ok. I'm particularly fixated on models and animations, so that shit was a huge letdown for me; I used that gif ad nauseam to promote the game before launch, so I also made a joke of myself...

I still clearly remember the PS4 version of TLOU1 making a huge deal about having cutscene models during gameplay...

Let's hope Rockstar can pull off a miracle, but I prefer to be a pessimist and be surprised than the opposite.
 
people obsessed with graphics aren't real gamers


get where you're coming from, but graphics affect gameplay and the whole shebang.
if we never got past 8-bit graphics I'd kill myself.

and 4K is super underrated--increased res improves so many things, not just jaggies.
go emulate a PS2 game set to an internal res of 8K--it's like seeing the game's textures for the first time.

but graphics at the cost of gameplay?
trash.
 

viveks86

Member
I used that gif ad nauseam to promote the game before launch, so I also made a joke of myself...
Awww. Probably why Witcher 3 pains me so much, as I did the same with that game. I even went through post-launch denial where I'd max it out after tweaking INI files, add ReShade filters and declare there was no actual downgrade and GAF was just imagining things :messenger_grinning_sweat:
 
I was thinking this yesterday playing Monster Hunter; pretty sure the graphics are worse than World lol. Or the improvements are negligible. How it looks in the pre-rendered cutscenes is how I would've expected this gen to look during gameplay. Then it switches to gameplay and I'm like, oh yeah, right.

Fun game, though.
 

Viruz

Member
It's true, but only because "The next generation doesn't start until we say it does" (2006). It was in another context, but when the Sony first-party games arrive, everything will be alright.

 

tr1p1ex

Member


get where you're coming from, but graphics affect gameplay and the whole shebang.
if we never got past 8-bit graphics I'd kill myself.

and 4K is super underrated--increased res improves so many things, not just jaggies.
go emulate a PS2 game set to an internal res of 8K--it's like seeing the game's textures for the first time.

but graphics at the cost of gameplay?
trash.
many leaps in graphics tech make gameplay worse as much as anything.

the leap from 2.5D sprites to 3D made games worse for a while. The interactivity went way down. How many objects you could have on the screen went way down. I saw it in shooters and RTS games. RTS kind of died out when they moved to 3D.

The move to 4k is producing a similar effect. What more they could do in games if they didn't have to move 4x as many pixels. The gameplay takes a step back.
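(Quick napkin math on that 4x figure: 3840 x 2160 = 8,294,400 pixels vs 1920 x 1080 = 2,073,600, so native 4K really is exactly four times the pixels of 1080p.)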

Eventually some of these things are overcome with more computing power. But the fact would still remain that going back in graphics tech can benefit gameplay and games. You can make content faster, do more content, more interactivity, more objects on the screen and/or have lesser hardware requirements among other things.
 

viveks86

Member
The move to 4k is producing a similar effect. What more they could do in games if they didn't have to move 4x as many pixels. The gameplay takes a step back.

Eventually some of these things are overcome with more computing power. But the fact would still remain that going back in graphics tech can benefit gameplay. And a few other facts remain like the time to make games with cutting edge graphics takes so much longer today than it did.
I feel like this is becoming a bit of a strawman argument in the current age of upscaling tech. Who even plays at native 4K anymore so that everything else needs to be compromised? Gameplay design and graphics are two different departments and they can push the boundaries as much as they are capable of. The issue with gameplay not evolving is either a lack of creativity (what gameplay ideas are left unexplored?), competency (do we have devs that can pull it off?) or a lack of tech evolution in that space (such as believable AI NPC behavior). Things like ray tracing and dynamic lighting should actually pave the way for even more creative gameplay (such as fully destructible environments), but they don't because people are still figuring out how to make such an experience that is fun and scalable across an entire game.

Gameplay and graphics go hand in hand, but the days when one needs to cannibalize the other are slowly coming to an end with improvements in game engines. Look at Clair Obscur: a no-name studio pulling that off with their first game is proof that these things are now sufficiently decoupled.
 

tr1p1ex

Member
I feel like this is becoming a bit of a strawman argument in the current age of upscaling tech. Who even plays at native 4K anymore so that everything else needs to be compromised? Gameplay design and graphics are two different departments and they can push the boundaries as much as they are capable of. The issue with gameplay not evolving is either a lack of creativity (what gameplay ideas are left unexplored?), competency (do we have devs that can pull it off?) or a lack of tech evolution in that space (such as believable AI NPC behavior). Things like ray tracing and dynamic lighting should actually pave the way for even more creative gameplay (such as fully destructible environments), but they don't because people are still figuring out how to make such an experience that is fun and scalable across an entire game.

Gameplay and graphics go hand in hand, but the days when one needs to cannibalize the other are slowly coming to an end with improvements in game engines. Look at Clair Obscur: a no-name studio pulling that off with their first game is proof that these things are now sufficiently decoupled.

we've long had upscaling. When we went to HD... most games weren't true HD... they hit some lower target and were upscaled to HD. It doesn't change that higher resolution (real resolution at least) takes more horsepower... horsepower that could often be used elsewhere for the better, or that doesn't require as expensive hardware...


BotW faked ray tracing and was a beautiful looking game with beautiful lighting.

Battlefield had fully destructible environments 17 years ago. No ray tracing.

I'd say the focus on graphics actually only sets back something like fully destructible environments. Your game is likely to look worse compared to other games. Hence the reluctance of studios to jump off the ever more prettier screenshot bandwagon.

Another example of this is Metroid Prime 25 years ago. It did 60 fps. Most games did 30. 60 was so much more fluid. But it didn't make for a prettier screenshot. So most games stuck to 30. Better graphics got in the way of better gameplay.

And this by definition can't come to an end. Because the computing power at any point in time of a console for example is finite. The more you're pushing the cutting edge graphics, the less power available for everything else ie for gameplay.
 

viveks86

Member
we've long had upscaling. When we went to HD... most games weren't true HD... they hit some lower target and were upscaled to HD. It doesn't change that higher resolution (real resolution at least) takes more horsepower... horsepower that could often be used elsewhere for the better, or that doesn't require as expensive hardware...


BotW faked ray tracing and was a beautiful looking game with beautiful lighting.

Battlefield had fully destructible environments 17 years ago. No ray tracing.

I'd say the focus on graphics actually only sets back something like fully destructible environments. Your game is likely to look worse compared to other games. Hence the reluctance of studios to jump off the ever more prettier screenshot bandwagon.

Another example of this is Metroid Prime 25 years ago. It did 60 fps. Most games did 30. 60 was so much more fluid. But it didn't make for a prettier screenshot. So most games stuck to 30. Better graphics got in the way of better gameplay.

And this by definition can't come to an end. Because the computing power at any point in time of a console for example is finite. The more you're pushing the cutting edge graphics, the less power available for everything else ie for gameplay.
All fair points. What I'm saying is these points are getting outdated real fast with every new breakthrough in rendering. Most, if not all, games are now pushing 60 (much to SlimySnake's chagrin). Your game doesn't have to look worse anymore with fully destructible environments if all the lighting is dynamic anyway.

Games can already look great using cutting edge tech without needing to compromise on gameplay. Once path tracing becomes a reality on console level hardware, the only thing holding devs back graphically would be creativity and art direction. This is likely 2 console generations away from reality and much sooner on PCs. Other major grunt work (like textures, mocap etc) will all get accelerated by AI driven workflows, which are also already happening. Things that MetaHuman can do were things an indie studio could never imagine having access to 10 years ago. Now they just need to be optimized further to become viable for full scale projects.

We are almost there. We just need to be patient (if we care about the industry's trajectory). Of course every department has a budget and devs would prioritize based on their vision for a game. But soon, I believe, it won't be the case that graphics would have to compromise gameplay. It is highly likely that the new bottleneck will become dev competence as it has always been. Like you said, BOTW didn't even need any of this tech and it still looked great and didn't compromise on gameplay. These advancements will simply remove any and all excuses for the rest of the industry.

Like I said earlier in this thread, as a consumer, I definitely feel let down this generation (outside of may be 10 games that are exceptions). We seem to be stuck in a plateau for a majority of the games. But as an industry enthusiast, I can see that the future looks really bright (for graphics).
 
The obvious answer is that devs no longer have any time to optimize their games pre-release.

Pubs have squeezed dev schedules and budgets so hard that optimization now means turn down/off visual features to get a stable frame-rate (or not in the case of many PC ports).

Remember on PS3 when SSM spent 6 months optimising GOW3. The game went from pretty cool looking early pre-release footage to "OMGWTFBBQ... HOW DID THEY DO THAT ON A PS3?!?!"

The hardware technology is there to make games shine today. But with devs no longer having any time to optimize their code around it, they can't squeeze the best out of it. This is only exacerbated by having to target additional platforms, e.g. first parties having to target PC as well, and everyone having to also target Pro versions of the console.

I mean, we're how many years into the gen already and almost no new games are taking advantage of mesh shaders or the advanced I/O and decompression; even first party devs!!!!
 

tr1p1ex

Member
All fair points. What I'm saying is these points are getting outdated real fast with every new breakthrough in rendering. Most, if not all, games are now pushing 60 (much to SlimySnake's chagrin). Your game doesn't have to look worse anymore with fully destructible environments if all the lighting is dynamic anyway.

Games can already look great using cutting edge tech without needing to compromise on gameplay. Once path tracing becomes a reality on console level hardware, the only thing holding devs back graphically would be creativity and art direction. This is likely 2 console generations away from reality and much sooner on PCs. Other major grunt work (like textures, mocap etc) will all get accelerated by AI driven workflows, which are also already happening. Things that MetaHuman can do were things an indie studio could never imagine having access to 10 years ago. Now they just need to be optimized further to become viable for full scale projects.

We are almost there. We just need to be patient (if we care about the industry's trajectory). Of course every department has a budget and devs would prioritize based on their vision for a game. But soon, I believe, it won't be the case that graphics would have to compromise gameplay. It is highly likely that the new bottleneck will become dev competence as it has always been. Like you said, BOTW didn't even need any of this tech and it still looked great and didn't compromise on gameplay. These advancements will simply remove any and all excuses for the rest of the industry.

Like I said earlier in this thread, as a consumer, I definitely feel let down this generation (outside of may be 10 games that are exceptions). We seem to be stuck in a plateau for a majority of the games. But as an industry enthusiast, I can see that the future looks really bright (for graphics).

It doesn't work like that because in 10 years the graphics bleeding edge will only be higher. And people will want those graphics.

And the graphics vs gameplay tradeoff is largely due to the hardware being static at any point in time. The more processing power devoted to graphics the less you can devote to gameplay.

There's also the budget limit. While AI perhaps can help with that, it isn't like there hasn't been constant improvement and great automation in what one artist can do over the years. That hasn't stopped the army of artists needed for a big AAA game from increasing.
 
Only to people who want to play MH:W, apparently. The rest of us are absolutely fine. The PC players make fun of us and say that we would eat dogshit if they served it up. Look at MHW, PC players claim it runs better on their PC vs best console(PS5 Pro) yet the user scores for the console game are amazing. The user scores on PC are abysmal. Pro critics agree with the console gamers.

I think when you invest so heavily and submerge yourself so deeply into PC gaming that you can't see the game for the performance, you have lost something that made gaming so special. Back in the day every game seemed to have slowdown. WoW was CPU limited and could not hit 60 in raid for over a decade. The graphics sucked but it was defended for artstyle. PC people were not always like this. I was one.

I think as some get older they naturally begin to dislike gaming but they are afraid of getting older so they get peter pan syndrome and invest heavily in gaming even though they no longer enjoy it and can barely bring themselves to play it. Yet they don't want to lose such a strong tie to their youth so they are gamers in name only. GINO. If you think about it, this HAS to be a thing right? We are just arguing about the number of people it affects.
There is a Mark Twain story about his time as a steamboat pilot apprentice. It talks about how the pilot's ability to read the water is like magic, and Twain is filled with awe at the river.

It ends with Twain being able to read the water, and all the magic is gone.

This has been my experience after learning more about how video game graphics work. I'm less immersed in the game's world, more caught up in thinking about the technical side of it and wanting to learn how it was pulled off.

I still love games, but I wonder if I'd love them more if I were ignorant of how the sausage is made.
 

viveks86

Member
It doesn't work like that because in 10 years the graphics bleeding edge will only be higher. And people will want graphics that look that way.

Until people stop caring about that... there will remain a tradeoff between gameplay and graphics.
Most games these days that look good but aren't fun or innovative are just made by people who don't know how to make the game fun or innovative in the first place. So they are, essentially, putting lipstick on a pig.

Of course people will always want better graphics. People have been asked to stop caring about that since 2D to 3D and it has never happened. We still have a long way to go to approach current CGI or actual photo realism, before even that becomes a non-issue. This is in our very nature, the desire to mimic (or even improve) reality in everything we do. So that applies to graphics as well.

But I'm only talking about the gameplay dependencies on graphics and sacrifices being made as a result. You think there will always be a tradeoff that devs make, where gameplay systems need to be sacrificed even if they are fun and viable, because they won't look as good. And I think that won't be the case after everything becomes real-time and dynamic and it'll come down to the talent needed to design gameplay systems that are fun and viable.

The pursuit for photorealism will simply continue independently, in my opinion. But we will probably not agree on that point.
 

tr1p1ex

Member
Most games these days that look good but aren't fun or innovative are simply made by people who don't know how to make the game fun or innovative in the first place. So they are, essentially, putting lipstick on a pig.

Of course people will always want better graphics. People have been asked to stop caring about that since 2D to 3D and it has never happened. We still have a long way to go to approach current CGI or actual photo realism, before even that becomes a non-issue. This is in our very nature, the desire to mimic (or even improve) reality in everything we do. So that applies to graphics as well.

But I'm only talking about the gameplay dependencies on graphics and sacrifices being made as a result. You think there will always be a tradeoff that devs make, where gameplay systems need to be sacrificed even if they are fun and viable, because they won't look as good. And I think that won't be the case after everything becomes real-time and dynamic and it'll simply come down to the talent needed to design gameplay systems that are fun and viable.

The pursuit for photorealism will simply continue independently, in my opinion. But we will probably not agree on that point.
Not sure it's a belief. It's how things work. You only have so much computing power. The more of it you need for gameplay, the less you can use for graphics, and vice versa. There's a reason racing and fighting games have traditionally had the best graphics.

That tradeoff remains even if 1 artist is all that is needed for a AAA game in the AI future.
 
Keep demanding 60 and 120fps and graphics will always “underwhelm”
This is an oversimplification. GTA6 is aiming for 30fps and it doesn't look like a generational leap beyond RDR2. We're experiencing diminishing returns in graphics, and at this point cutting polygon count for 60fps isn't going to cost you a visual leap, aside from image "quality". The fact that many games can't even match The Order: 1886 in graphics shows that either the talent isn't there or "current gen" has hit the Moore's Law limit.
 

nnytk

Member
I expected a lot more draw distance and "wow" moments.

I don't care about native 4K and raytracing is nearly always implemented poorly or partially.

MH Wilds and RE4 look incredible at times, and I loved Astro Bot from a technical perspective. But comparing these to Xenoblade Chronicles X for example, one wonders... Where's the talent and vision for current gen games that "do more with less" like Xenoblade does...

Instead of "do more with less" I feel like we've been getting a lot of "do less with more" from developers.
 

viveks86

Member
Not sure it's a belief. It's how things work. You only have so much computing power.
Gameplay logic primarily requires the CPU. Graphics primarily require the GPU. The CPU is rarely ever maxed out, and even when it is, it's because of said gameplay logic (and often unoptimized, as in DD2), not graphics. This claim that gameplay and graphics are competing for the same compute power is factually inaccurate.

Fighting games used to look better than the average game because there is nothing else that needs to be rendered outside of the fixed camera. Racing games are primarily about the cars and, like real life, the visual appeal of cars is a major selling point. So they just put a ton of resources towards the cars, and secondary aspects like crowds and trackside details get lower priority. It has nothing to do with gameplay logic or your claimed lack of it. Sim racers actually have a ton of gameplay logic, and all of that happens on the CPU, independent of rendering. They still manage to look phenomenal.

Hellblade 2, a graphical benchmark, has barebones gameplay. Not because all the graphics took up the CPU; it doesn't even need a high-end CPU. The reason is the dev wanted a cinematic experience and gameplay was not as much of a priority. The issue was a creative/financial/leadership choice, not a technical one.
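If it helps to see that spelled out, here's a toy frame-budget sketch in Python (my own simplification with made-up numbers, not taken from any real engine): in a pipelined renderer the CPU prepares the next frame while the GPU draws the current one, so frame time is roughly whichever side is slower, not the sum of the two.

def frame_ms(sim_ms, submit_ms, gpu_ms):
    # CPU side: gameplay simulation (AI, physics, scripting) plus draw-call submission
    cpu_ms = sim_ms + submit_ms
    # In a pipelined frame, the slower of the CPU and GPU sides sets the frame time
    return max(cpu_ms, gpu_ms)

print(frame_ms(8, 3, 14))  # 14 -> GPU-bound: heavier AI/physics is effectively free up to ~14 ms of CPU work
print(frame_ms(8, 3, 9))   # 11 -> CPU-bound: now it's the extra GPU eye candy that's effectively free

In the GPU-bound case you can pile more simulation onto the CPU without losing a single frame, and vice versa, which is why the two budgets mostly don't eat into each other until one side becomes the bottleneck.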
 

SlimySnake

Flashless at the Golden Globes
The obvious answer is that devs no longer have any time to optimize their games pre-release.

Pubs have squeezed dev schedules and budgets so hard that optimization now means turn down/off visual features to get a stable frame-rate (or not in the case of many PC ports).

Remember on PS3 when SSM spent 6 months optimising GOW3. The game went from pretty cool looking early pre-release footage to "OMGWTFBBQ... HOW DID THEY DO THAT ON A PS3?!?!"

The hardware technology is there to make games shine today. But with devs no longer having any time to optimize their code around it, they can't squeeze the best out of it. This is only exacerbated by having to target additional platforms, e.g. first parties having to target PC as well, and everyone having to also target Pro versions of the console.

I mean, we're how many years into the gen already and almost no new games are taking advantage of mesh shaders or the advanced I/O and decompression; even first party devs!!!!
Speaking of GOW3, it was next gen as fuck because back then those devs cared about pushing the bar. This time around, they chose to make it last gen only and went out of their way to keep the visuals the same across both consoles, unlike, say, GG, who actually had a fairly significant leap going from HZD to HFW because they didn't mind HFW dropping to 720p (literally half of HZD on PS4). SSM devs proudly claimed that they kept the visuals the same. Well, fuck them lazy bastards.

Sony gave them 4.5 years to make a copy-paste sequel and they didn't even improve the graphics. Another 6 months optimizing it wouldn't have changed the core visual design.

Same goes for Spider-Man 2. The Insomniac leak showed that they purposefully kept the visuals the same because they think improving graphics won't be noticed by the masses.

That said, I agree. The hardware is there. The tech is there. They just need to use it. Aside from Insomniac, not a single Sony first-party studio has used ray tracing. Not a single one. No one has used mesh shaders. Aside from Insomniac, no one has used the IO to do anything other than fancy loading times.

At least MS studios have wised up and are using UE5, which utilizes ray tracing and mesh shaders as part of its Lumen and Nanite tech, but they had a rough few years of cross-gen-only games too.

Third parties have fared much better, but it was Sony studios that set the bar last gen, and with them being MIA for the first four years outside of HFW, which itself was cross-gen, people are starved for the truly next gen content only Sony studios can provide. Looking at their upcoming titles like Ghost of Yotei, Death Stranding 2, Wolverine (leaked footage looks mid), and Intergalactic, I think those people will remain hungry.

I personally have found solace in Massive's two amazing-looking games. I thought Callisto looked insane. I was very impressed by Starfield's interiors, though those procedurally generated worlds look like dogshit. Star Wars Jedi, Wukong, Alan Wake 2, Silent Hill 2, and a couple of other games have given me hope for the rest of the gen. Honestly, I'm not as depressed as I was at the start of 2023. Graphics look amazing in most games I play.
 

SlimySnake

Flashless at the Golden Globes
DF doesn't go hard enough when it comes to that dogshit engine imo. They still blindly praise RT GI just for being present, even if it looks like diarrhea in basically all UE5 games.
when you have to install mods and do manual .ini tweaks to get a game like Silent Hill 2 to not look like absolute garbage, then there's nothing worth praising.


and the post you quoted about how great games would look if they pushed graphics at 1080p 30fps is quite hilarious in combination with trying to say UE5 games get too much criticism.

UE5, at its core, completely breaks apart at low resolutions like 1080p.
nearly everything in UE5 is dithered. as in, basically imagine if every single shadow, every single cloud you see in the sky, every single strand of hair, is missing half the pixels in a checkerboard pattern... that's how UE5 works by default.

UE5 is designed entirely around being upsampled and reconstructed, as a clean, non-smeared image would make games look broken. the smearing from reconstruction and/or aggressive TAA is needed, while also making everything look blurred and adding ghosting.
so low resolution games in UE5 rely entirely on denoising and upscaling to create anything resembling a remotely coherent looking image... and often fail at doing so.

TLDR: 1080p 30fps in UE5 in its default form looks disgusting, as most effects it uses would run at basically checkerboard 1080p, then get smeared over multiple frames to hide that fact.
the only type of UE5 game that would look good at 1080p 30fps is one that basically uses none of its features and runs with only mid-last-gen UE4 level features like Gears 5 or Gears 4 for example... that's the only way to make a 1080p UE5 game look presentable, at which point you can just as well run that game at dynamic 4k 60fps... like Gears 5...
Matrix runs at 1080p reconstructed to 4K using TSR and is still the best looking thing made this gen. I am currently playing Professional Baseball Spirits, which runs at 1080p on the PS5 and looks absolutely amazing, especially compared to that garbage that's MLB The Show running at native 4K 60fps.

To be clear, I don't want games running at 1080p 30fps. I don't think I said that, but if I did, I would like to retract it. I think games should be targeting a minimum of 1440p 30fps reconstructed to 4K. Basically 4K DLSS/TSR/FSR2 Quality. Just like the very first UE5 demo, which everyone thought looked 4K until Epic confirmed that it was actually being reconstructed using TSR. Then the 60fps mode can drop to 1080p and be reconstructed. Anything below that and it's garbage. And that's what I was trying to say. Do NOT go below 1080p if your game can't handle it. Do what Ninja Theory did and just release a 30fps game instead.
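And to make the "checkerboard dithering plus temporal resolve" idea above concrete, here's a toy numpy sketch (my own illustration, not UE5 or TSR code): shade half the pixels one frame, the other half the next, then merge the two. It reconstructs a static image perfectly and starts smearing the moment anything moves.

import numpy as np

h, w = 4, 8
full = np.arange(h * w, dtype=float).reshape(h, w)   # stand-in for a fully shaded effect

# Checkerboard mask: even frames shade one half of the pixels, odd frames the other half
yy, xx = np.indices((h, w))
even = (yy + xx) % 2 == 0

frame0 = np.where(even, full, np.nan)    # frame 0 only shades the "even" pixels
frame1 = np.where(~even, full, np.nan)   # frame 1 shades the other half

# Temporal resolve: fill this frame's holes with last frame's pixels
resolved = np.where(np.isnan(frame1), frame0, frame1)
assert np.array_equal(resolved, full)    # perfect, but only because nothing moved

# Shift the scene one pixel between frames and reuse the stale pixels anyway
moved = np.roll(full, 1, axis=1)
frame1_moving = np.where(~even, moved, np.nan)
ghosted = np.where(np.isnan(frame1_moving), frame0, frame1_moving)
print(np.abs(ghosted - moved).mean())    # > 0: the merged image no longer matches the true frame

Roughly speaking, that stale-pixel reuse is the ghosting; real TAA/TSR reproject the history along motion vectors and reject samples that no longer match, but at low internal resolutions there aren't many real samples left to work with in the first place.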
 

FewRope

Member
Graphics don't matter a single fuck if you need to play games at PS1 resolutions. Lords of the Fallen on base PS5, for example, feels like 480p 90% of the time.
 

tr1p1ex

Member
Gameplay logic primarily requires the CPU. Graphics primarily require the GPU. The CPU is rarely ever maxed out, and even when it is, it's because of said gameplay logic (and often unoptimized, as in DD2), not graphics. This claim that gameplay and graphics are competing for the same compute power is factually inaccurate.

Fighting games used to look better than the average game because there is nothing else that needs to be rendered outside of the fixed camera. Racing games are primarily about the cars and, like real life, the visual appeal of cars is a major selling point. So they just put a ton of resources towards the cars, and secondary aspects like crowds and trackside details get lower priority. It has nothing to do with gameplay logic or your claimed lack of it. Sim racers actually have a ton of gameplay logic, and all of that happens on the CPU, independent of rendering. They still manage to look phenomenal.

Hellblade 2, a graphical benchmark, has barebones gameplay. Not because all the graphics took up the CPU; it doesn't even need a high-end CPU. The reason is the dev wanted a cinematic experience and gameplay was not as much of a priority. The issue was a creative/financial/leadership choice, not a technical one.
Inaccurate.

You need to match the CPU to the GPU. You can bottleneck a GPU by not having enough CPU power left to drive it, and using more CPU power for rendering leaves less for gameplay logic.

GPUs can do more than graphics.

And we can even go further back, to the design of the hardware, and say that the current design of the big consoles focuses more on graphics than on gameplay than it has to. They could go heavier on the CPU and lighter on the GPU, for example.

The examples of fighting and racing games show that if you hold steady the gameplay stuff you need to do, can do, or want to do, you can do more on the graphics front.

And no matter what GPU you have, there's the frame rate (gameplay) vs graphical detail tradeoff. There's the more-objects-on-screen (gameplay, at least in many cases) vs fewer-objects-with-more-graphical-detail tradeoff... the list goes on.


and again, compared to back in the day, the content that used to be created in days to weeks is now, at higher graphical detail... created in months or longer. The tradeoff has either been less content, more expensive games, higher budgets and/or much longer time to make a game. That's also part of the gameplay/graphics tradeoff.

Back in the day, one artist could make a weapon in a few days. Eventually it turned into months to make a weapon. That's something that hurts gameplay. It's been mitigated in large part by hiring armies of artists, raising the prices of games, microtransactions and bigger, louder, more expensive hardware. But... we're still left with far fewer game releases now.

Is that a gameplay tradeoff? I file it under that heading as well.
 