
Why do some people look at frame generation as a bad thing?

SNG32

Member
A good chunk of PS5 games use it, like Ratchet & Clank: Rift Apart, Spider-Man 2, Black Myth: Wukong, etc. It's usually used in the performance/RT or performance modes. I mean, if the game runs smoother, what's the big deal with using the technology? I would rather have so-called fake frames than have it run like trash.
 

SNG32

Member
Most console gamers (including myself) don't know what the fuck that is, and wouldn't care. I hate it that PC plebs have invaded the discussion and turned it to spec talk *spits* instead of game play talk.
I agree, I think some people are just too elitist. If the game runs better and smoother, that should be a good thing.
 

Skifi28

Member
A good chunk of PS5 games use it, like Ratchet & Clank: Rift Apart, Spider-Man 2, Black Myth: Wukong
Black Myth does, with pretty bad results both visually and in latency. The other two don't, as far as I'm aware. There are very few console games with frame gen.
 
You kind of said it already: fake frames at the core, plus added input lag and artifacts, and it's obvious why people don't like it.

When you have a $2000 GPU only hitting 30fps without DLSS, it makes sense that people would be annoyed.

Most console gamers (including myself) don't know what the fuck that is, and wouldn't care. I hate it that PC plebs have invaded the discussion and turned it to spec talk *spits* instead of game play talk.
It’s not strictly PC people…

The "tech talk" leaked into the console space in a big way when outlets like Digital Foundry brought more and more focus to the 1080p PS4 vs. 900p XBO conversation. Then you had the mid-gen refresh consoles, which cemented these tech-focused conversations in the console space. Now it's commonplace, but that's not the fault of PC gamers…
 

Puscifer

Member
A good chunk of PS5 games use it, like Ratchet & Clank: Rift Apart, Spider-Man 2, Black Myth: Wukong, etc. It's usually used in the performance/RT or performance modes. I mean, if the game runs smoother, what's the big deal with using the technology? I would rather have so-called fake frames than have it run like trash.
Got any proof those games use frame generation? I've not heard that before now
 

Heimdall_Xtreme

Hermen Hulst Fanclub's #1 Member
They'll call me strange, but I sometimes prefer to use Quality mode in some games; it gives them a cinematic feel and high graphical quality on PS5.

Only if warranted do I use performance mode.
 

MiguelItUp

Member
Because a lot of PC gamers want "the best", and if you're not getting those frames natively, then it just doesn't seem right.

Really though, I know some have said it affects performance and visuals more than they'd like.
 

Gojiira

Member
A good chunk of PS5 games use it, like Ratchet & Clank: Rift Apart, Spider-Man 2, Black Myth: Wukong, etc. It's usually used in the performance/RT or performance modes. I mean, if the game runs smoother, what's the big deal with using the technology? I would rather have so-called fake frames than have it run like trash.
I mean, you say this, but they don't. Maybe the Pro-specific modes do, but the base PS5 versions do not.
PC versions however, sure.
The reason people dislike it is frame generation introduces ghosting and increases input lag.
 

Gaiff

SBI’s Resident Gaslighter
Most console gamers (including myself) don't know what the fuck that is, and wouldn't care. I hate it that PC plebs have invaded the discussion and turned it to spec talk *spits* instead of game play talk.
What a low-effort post. Tech talk is simply another aspect of gaming discussion. Do you also bitch that people aren't talking about gameplay when they're talking about the music or story?
A good chunk of PS5 games use it, like Ratchet & Clank: Rift Apart, Spider-Man 2, Black Myth: Wukong, etc. It's usually used in the performance/RT or performance modes. I mean, if the game runs smoother, what's the big deal with using the technology? I would rather have so-called fake frames than have it run like trash.
Rift Apart and Spider-Man 2 don’t use frame generation. The reason frame generation is controversial is twofold.

1. It increases the frame rate but does nothing for the latency. The result is a game that looks smoother but doesn't respond accordingly. Even worse, it adds latency. Without latency-reduction technologies such as Reflex, it's quite bad at lower frame rates.

2. NVIDIA's marketing is trying to muddy the waters and pretend generated frames are the same as regular frames. A higher frame rate will almost always result in lower latency; frame generation is the opposite. Jensen misled people by claiming the 4090 and 5070 are equal; it's bullshit. While your 5070 might give a smoother frame rate, the 4090 will be much more responsive. Input latency matters more than frame rate, especially at lower frame rates. Once latency gets extremely low, diminishing returns kick in and a higher frame rate becomes preferable.
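If it helps, here's a rough, throwaway sketch of point 1 in Python. The doubling and the "one held frame" assumption are simplifications for illustration, not how any particular driver or game actually schedules frames:

```python
# Illustrative sketch only: assumes frame gen doubles the displayed rate,
# that input is still sampled at the rendered rate, and that interpolation
# holds back roughly one rendered frame. Reflex, driver queues, etc. will
# shift these numbers in practice.

def frame_time_ms(fps: float) -> float:
    """Milliseconds per frame at a given frame rate."""
    return 1000.0 / fps

def rough_frame_gen_estimate(rendered_fps: float) -> dict:
    render_ms = frame_time_ms(rendered_fps)
    return {
        "displayed_fps": rendered_fps * 2,            # looks twice as smooth...
        "input_sampled_fps": rendered_fps,            # ...but responds at the old rate
        "approx_added_latency_ms": round(render_ms, 1),  # plus ~one held frame of delay
    }

print(rough_frame_gen_estimate(60))  # 120 displayed, ~16.7ms extra
print(rough_frame_gen_estimate(30))  # 60 displayed, ~33.3ms extra
```

The lower the real frame rate, the bigger the held-frame penalty, which is why frame gen feels worst exactly where people are tempted to lean on it.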
 

rm082e

Member
It's a new technology that's improving as time goes on, but it needs more work. If they can get the added latency down to 5ms or less, I think most of the argument against it goes away. But they're not at that point yet.

Taking a game that runs at 90fps and bumping it to 180 would be fine if it didn't add lag. The problem/worry is that developers will use it as a crutch to get games that run at 30 up to 60+.
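To put rough numbers on that worry (plain frame-time arithmetic, assuming the added delay is on the order of one rendered frame; actual figures vary by game and driver):

```python
# One rendered frame of extra delay, expressed in milliseconds.
# Purely illustrative; real added latency depends on the game, driver, and Reflex.
for base_fps in (90, 60, 30):
    held_frame_ms = 1000.0 / base_fps
    print(f"{base_fps}fps base -> roughly {held_frame_ms:.1f}ms of extra delay per held frame")
# 90fps -> ~11.1ms, 60fps -> ~16.7ms, 30fps -> ~33.3ms: the 30-to-60 case is the worst offender
```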
 

Jigsaah

Member
You gotta take it on a game-by-game basis. Some games are OK with frame gen if you can stand the latency it currently causes. There's also some visual quality that you'll lose by going with FG. Ghosting is often present in a lot of games with frame gen. It's up to the user to decide whether that's acceptable or not.

Personally, I'll use frame gen if I'm playing at 4K and I can't get the frames above 60fps. I'm so used to higher framerates that 60fps looks remarkably worse. Luckily, most games I play don't require this with my setup.
 
Frame generation is a crutch and a compromise that does not reach parity in picture quality or responsiveness compared to real frames, and it often gets abused by devs.

Stupid question.
 

nkarafo

Member
DLSS, FSR or LS? I'm not into FG myself, but I find the DLSS solution visually spotless.
I don't have as much of an issue with DLSS upscaling, other than it's really blurry in some games.

I just don't like fake frames. It's almost like those "motion plus" filters on TVs. I want my frames to be raw.
 

Fbh

Member
Maybe on high-end PCs with the latest hardware it's good.

I don't want this shit on consoles and lower-end rigs, though. Image quality has already gone down the drain with all the resolution upscaling adding notorious artifacting and smearing as it struggles to deal with PS3-era resolutions. I don't want even more artifacting from fake frames on top of that. Not to mention fake 60fps doesn't feel as responsive as real 60fps, so you're taking away a big reason to even target 60fps to begin with.

I want games with nice IQ running at a real, responsive 60fps. Not fake resolution with fake framerates that look OK in still images, then start breaking apart and smearing everything as soon as you actually move.
 
It's laggy and smudgy and adds to the visual artifacts created by FXAA and various other performance-saving bullshit. Play a game from the early 2000s at 4K maxed out, then play a game from 2020, and you will see.
 
A good chunk of PS5 games use it, like Ratchet & Clank: Rift Apart, Spider-Man 2, Black Myth: Wukong, etc. It's usually used in the performance/RT or performance modes. I mean, if the game runs smoother, what's the big deal with using the technology? I would rather have so-called fake frames than have it run like trash.
You just answered your own question... some people don't want fake frames. Just a preference
 

TIGERCOOL

Member
When you have a $2000 GPU only hitting 30fps without DLSS, it makes sense that people would be annoyed.
You can top out at 30fps if you want to enable path-tracing without framegen or upscaling AI... I guess? Raster performance is simply hitting the point of diminishing returns when it comes to raytracing, unless you wanted GPUs to be even more prohibitively expensive. If you want to forgo those features completely, most budget cards perform great with limited/no RT (handful of exceptions - as has always been the case).

FG is pretty new tech. It isn't perfect yet. We'll see how things shape up with Reflex 2.0 and the 5000-series NVIDIA cards. DLSS 1.0 was once shit on from a great height, and DLSS is now an industry standard for performance and IQ.
 

Holammer

Member
IMO framegen has no place in competitive FPS games, 16-bit-era games with frame-perfect timing, or anything controlled with a mouse. Maybe Reflex works, but I haven't seen it yet.
But everything else? It works fine, great even. With consumer televisions increasingly offering 120Hz and higher, I think we're going to see 60x2 or 40x3 options in next-gen products. Maybe even on the Switch 2.
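For what it's worth, the 60x2 / 40x3 idea is just picking a multiplier that lands on the display's refresh rate. A hypothetical helper (the function name and its defaults are made up here purely for illustration):

```python
def pick_fg_multiplier(base_fps: int, display_hz: int, max_multiplier: int = 4) -> int:
    """Largest whole multiplier that keeps base_fps * m at or under the display refresh."""
    best = 1
    for m in range(2, max_multiplier + 1):
        if base_fps * m <= display_hz:
            best = m
    return best

print(pick_fg_multiplier(60, 120))  # 2 -> 60x2 = 120
print(pick_fg_multiplier(40, 120))  # 3 -> 40x3 = 120
print(pick_fg_multiplier(30, 120))  # 4 -> 30x4 = 120
```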
 

Mr.Phoenix

Member
I feel it's one of those things people just bitch about.

Yes, there are a lot of little random subjective things that gamers bitch about, and you should learn to ignore most of that. Yup... gamers are petty like that.

E.g.

Game A releases with a native input lag of 80ms. Everyone plays the game, it's fine, no one complains.

Game B releases with a native input lag of 30ms, then tacks on FG and becomes 65ms. Gamers go up in arms and start calling for heads.

The funny thing is that Game B still has a better response feel than Game A. Even with framegen.

The crazy thing is that input lag as a whole is measured in ms... and we've got people here telling you that when a game goes from 2 frames of input lag to 3 frames, their game has become broken.
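Spelling out that same comparison as throwaway arithmetic (the 80ms/30ms/65ms figures are the hypothetical numbers from the example above, not measurements from any real game):

```python
def frames_of_lag_to_ms(frames: float, fps: float) -> float:
    """Convert 'frames of input lag' into milliseconds at a given frame rate."""
    return frames * 1000.0 / fps

game_a_ms = 80      # Game A: native, no frame gen
game_b_fg_ms = 65   # Game B: 30ms native, ~65ms after enabling frame gen

print(game_b_fg_ms < game_a_ms)     # True: Game B with frame gen still responds faster
print(frames_of_lag_to_ms(2, 60))   # 2 frames at 60fps ≈ 33.3ms
print(frames_of_lag_to_ms(3, 60))   # 3 frames at 60fps = 50.0ms
```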
 

Raelgun

Member
A good chunk of PS5 games use it, like Ratchet & Clank: Rift Apart, Spider-Man 2, Black Myth: Wukong, etc. It's usually used in the performance/RT or performance modes. I mean, if the game runs smoother, what's the big deal with using the technology? I would rather have so-called fake frames than have it run like trash.
I've had an RTX 4090 since launch and have used frame generation in numerous games for nearly two and a half years... and for the most part it's fantastic!

The picture quality is generally great, and the massively boosted frame rate feels like magic! In real use I don't even find the added latency that bad - the only not-so-great experience I had was with Silent Hill 2 Remake, where the FG artifacts were quite noticeable.

But I've kind of realised that the majority of people who hate frame generation haven't even used it - and hold onto latency as the biggest problem...

The real biggest issue, I think, that most people are overlooking - especially with the RTX 50 series launch - is that hardly any games support frame generation!


The vast majority of DLSS-enabled games don't have frame generation - I think in all this time I've only played about 30 games that natively supported FG, out of the ~300 games that support DLSS (upscaling).

With the RTX 50 series cards, NVIDIA confirmed that only 75 games at launch will have DLSS 4 multi-frame-gen support - that's an almost non-existent fraction compared to all the available games out there!


So you could have a fancy new, expensive RTX 5090 - but of the vast, vast majority of games you could play, hardly any will have frame-gen support, let alone multi-frame gen...

It kind of feels like a lot of people are deluded into thinking frame generation will work on "every" game automatically...
 

JackMcGunns

Member
On the RTX 40 series, it adds input lag, defeating the purpose of high framerates, but with the RTX 50 series, it's way better. Remember when gamers were "native res or die"? Now everyone accepts DLSS; I predict the same will happen when frame generation is nailed.
 

Zannegan

Member
Anything that increases latency is going to be controversial, to say the least. It's like streaming games: some find it perfectly acceptable, no different than playing natively; others find it laggy to the point of being "unplayable." The former think the latter are being dramatic and/or closed-minded, and the latter think the former are in denial.

Your reaction depends on your sensitivity, I guess, and no one is going to be more sensitive to lag than an M/KB player.
 

Minsc

Gold Member
Anything that increases latency is going to be controversial, to say the least. It's like streaming games: some find it perfectly acceptable, no different than playing natively; others find it laggy to the point of being "unplayable." The former think the latter are being dramatic and/or closed-minded, and the latter think the former are in denial.

Your reaction depends on your sensitivity, I guess, and no one is going to be more sensitive to lag than an M/KB player.

So does playing in quality mode - that increases latency too. Yet many people prefer it over a higher framerate/lower latency performance mode.

Then again, there's almost nothing that's not split on PC. Some prefer buying a 4090 to have RT; others get the same card and turn all effects off just to have 240fps, or even drop to 1080p with all effects off for the same reason. The fewer graphics options you enable, the better for multiplayer!
 

Nankatsu

Gold Member
Because they are selling you a lie.

 

cireza

Member
Because it sucks; it's a solution to a problem that doesn't exist, and it's only there because developers strayed from the right path and/or have the wrong priorities.

At which point exactly does it make sense when you read this?

My game doesn't run well. Rather than optimizing it with all the levers I have available (better code, simpler assets, removing useless features, etc.), I am instead going to invest even more power into generating the missing frames, probably at the cost of further lowering the level of quality I've reached right now.

Answer: it never does. We lost track of what common sense is.
 

cormack12

Gold Member
If you're the type of gamer who pushes gaming on a high-end rig and notices every frame, then you notice DLSS and Frame Gen almost immediately - I don't see how you don't, tbh. If you can tolerate either of those, then you can tolerate console. Game native brothers 🤘

If you're a gaming tourist who has money to plough into an expensive gaming rig, but you're the type to open settings, max out what you can, and then play without really seeing the difference between high and medium settings, then yeah, these things let you still push max settings, and you'll feel OK using framegen and DLSS, etc.

At least on PC you have a choice. On console it's kind of imposed, and likely will be from now on.
 