
The lack of repulsion for this "frame gen" thing is truly concerning

NomenNescio

Gold Member
Seriously guys? You might as well turn motion interpolation on your chitty low-end LCD. I find it absolutely incredible the enthusiast community isn't appalled by this "tech". It's not only fake as hell, but introduces input lag and visual artifacts all over the place, as expected. I can tolerate these guys toying around with res upscaling 720p to 16K but you DON'T F**K WITH MOTION.

Can't you see the ramifications of this? PS6 will likely have it. My god. Sony will be publishing games internally running at 20 FPS and "upscaled" to make it seem like 100. They will market and parade the fake frame rate numbers all over the place, and the casuals won't even care. Game optimization will be officially dead, now also on consoles.

I find it absolutely hilarious Nvidia even mentions this thing when marketing their "super high end, astronaut level" card. It shouldn't even be in the same sentence as this bullshit. But I guess BEING REAL is too hard nowadays.
 

LiquidMetal14

hide your water-based mammals
I can agree with you, but I appreciate the help that comes out of all these advancements.

I think the positives outweigh the negatives. The big negative for most people is price. Most can't or won't spend that kind of money. I was there at one point.
 
What's more curious is seeing console gamers talking about input lag now. I mean, if you worry that much about it, just don't get a console.

I've been using framegen lately to play at 144Hz and it's very smooth. You might lose a bit of image quality due to artifacts or a slightly softer image, but it's hardly noticeable if you start from 60-70 FPS, and the same goes for input lag, which goes from 30 to 38ms. I wouldn't use it in some specific games, but it's very playable in pretty much every game, first-person shooters included.

At the same time you gain a huge amount of motion clarity; going from 70 fps to 140 fps is another world in smoothness.
 

T4keD0wN

Member
I don't notice the artifacts, and I don't care about input lag because I don't play Counter-Strike, nor the remaining two games where it matters. I do, however, care about framerate.

I personally mind rendering at a lower res (upscaling), since it fucks up the image compared to DLAA, far more than frame gen, which makes everything smoother without any perceivable visual degradation, at least to me. Upscaling is far more detrimental than frame gen, since Reflex or other solutions almost make up the latency difference.
 
Last edited:
The basic issue with framegen is that it's being marketed as performance, which is simply a lie.

Framegen is an add-on that can make already good framerates better or even great, but if you try to replace base performance with it, you're gonna have a miserable, unresponsive, artifact-ridden time.

The 5070 = 4090 slide was such a blatant middle finger to the "stupid average customer" that it's hard not to hold it against Nvidia.
 
Last edited:

Pejo

Member
You might as well turn motion interpolation on your chitty low-end LCD.
This has been my thought about it since I first found out how it works. We're all gonna be running around with the soap opera effect. 60 FPS should be the minimum devs shoot for, with options for frame gen if you want to go higher. But I'm kind of wary that we'll get sub-30 FPS and "fix it" with frame gen instead. Particularly on consoles.
 

viveks86

Member
It should be repulsive if it’s objectively bad and for no other reason. Is DLSS 4 framegen objectively bad? (I actually don’t know the answer to that, so if anyone posts evidence instead of feelings, that would be highly appreciated)
 

Prekk

Member
I agree. It's embarrassing.

But I feel the same way about upscaling as well.
A decade of 4k marketing bullshit and now 1000 euro graphics cards can't render at native 4k.

We shouldn't be talking about frame generation; we should have super low latency high frame rate gaming.
We shouldn't be talking about upscaling either; we should be able to supersample everything.
 
Last edited:
The 5070 = 4090 slide was such a blatant middle finger to the "stupid average customer" that it's hard not to hold it against Nvidia.
Someone who is aware of and excited about the 50-series, yet uninformed enough to have missed all the revealing info that's been coming out about it, must be a subgroup of a subgroup of a subgroup.
 

64gigabyteram

Reverse groomer.
Vote with your wallet if you feel strongly about it; if the market agrees, Nvidia will adjust.
It's not about Nvidia or the market. Framegen as a technology is fine IMO, and I think it should be implemented.

The problem is the devs who will take this as leeway to skimp on optimization in later games. We're already seeing it with Indiana Jones now. Imagine how much more emboldened they'll become when they see a technology capable of making frames out of thin air.
 
Last edited:

Panajev2001a

GAF's Pleasant Genius
Seriously guys? You might as well turn motion interpolation on your chitty low-end LCD. I find it absolutely incredible the enthusiast community isn't appalled by this "tech". It's not only fake as hell, but introduces input lag and visual artifacts all over the place, as expected. I can tolerate these guys toying around with res upscaling 720p to 16K but you DON'T F**K WITH MOTION.

Can't you see the ramifications of this? PS6 will likely have it. My god. Sony will be publishing games internally running at 20 FPS and "upscaled" to make it seem like 100. They will market and parade the fake frame rate numbers all over the place, and the casuals won't even care. Game optimization will be officially dead, now also on consoles.

I find it absolutely hilarious Nvidia even mentions this thing when marketing their "super high end, astronaut level" card. It shouldn't even be in the same sentence as this bullshit. But I guess BEING REAL is too hard nowadays.
Not scary but a bit snake oily…

 

rm082e

Member
I just heard Will Smith from PC World say he hated how frame gen felt on the 4090, but he tried the updated version as part of their 5090 review and said it was really impressive. He mentioned if you're used to playing with KBM, you're still going to be put off by the added lag. But with a controller, a lot of people aren't going to notice it. He was specifically pointing to Star Wars Outlaws.

This seems like DLSS, where the first iteration wasn't great, but as the AI improved it got a lot better. Seems like the value will be in taking a game that can run at 60 and bringing it up to the max refresh rate of the monitor.

I'm all for having the option. If you don't like it, don't turn it on. I don't turn on TrueMotion on any of the TVs I have in the house.
 

ResurrectedContrarian

Suffers with mild autism
I can tolerate these guys toying around with res upscaling 720p to 16K but you DON'T F**K WITH MOTION

I agree. Motion interpolation is bad tech, but I don't as much mind various kinds of upscaling.

I remember when early flatscreens started introducing that horrible frame interpolation feature. It's hands down the worst feature I've ever seen for TV or video; it just absolutely ruins everything. I'd much rather play all games at 30 FPS for the rest of my life than ever touch that tech, even if it magically pushed things to 120 FPS or something.

It has also been used in VR, and it's equally annoying there ("reprojection" or whatever they call it now). While it allows some games with higher requirements to just barely work on lower specs -- and that might be worth it when your rig is behind, I've done it before -- the degradation is noticeable and extremely annoying. Intentionally designing games around that kind of feature -- instead of it being a backup when your computer is below spec -- is insane.
 

Stuart360

Member
Framegen saved Star Wars Outlaws for me, which had ridiculously high CPU requirements (in certain parts of the game). It was a locked 60fps with framegen though, even at max settings with full RT.

In the long run it could be a problem though, as upscaling has already had a major effect on game optimization, and throw in framegen now and it could be even worse. We're already seeing recommended specs on Steam mention upscaling and framegen.

Plus, new versions of these techs are being locked behind new GPU generations, especially from Nvidia.

It kind of sucks really.
 

Crayon

Member
Take a performance-mode AI upscaling factor and work out how many of the pixels are traditionally rendered, then add three fully generated frames on top... what percentage of your pixels are AI now?
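A rough back-of-the-envelope sketch in Python (the ratios are illustrative assumptions, e.g. performance-mode upscaling rendering a quarter of the output pixels and three generated frames for every rendered one):

# Rough estimate of how much of what you see comes from traditional rendering
# when combining upscaling with multi-frame generation (illustrative numbers only).
upscale_pixel_ratio = 1 / 4   # assume performance-mode upscaling renders 1/4 of output pixels
rendered_frames = 1           # one traditionally rendered frame...
generated_frames = 3          # ...followed by three generated frames

native_share = upscale_pixel_ratio * rendered_frames / (rendered_frames + generated_frames)
print(f"traditionally rendered share of displayed pixels: {native_share:.1%}")  # 6.2%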

I've tried fg and if people like it, that's fine, but it seems pretty fucking far away from anything you'd call enhancing performance. I suppose if you have a high-refresh monitor and really, really want to use it... It just doesn't make a lot of sense to me.
 

Gaiff

SBI’s Resident Gaslighter
Console gamers played at 30fps with 150ms+ of input lag for two decades. No problem.

Frame gen introduces 10-20ms of input lag. This is serious business!

Just test it yourself and if you don’t like it, turn it off.
 
Last edited:

damidu

Member
Yeah, it's pretty much a scam. I haven't seen a flawless implementation of it on the 4000 series, and I doubt it'll fare any better with 3x more fake frames.
The way it's shilled by the likes of DF is sickening.
 

Larxia

Member
Man, if only it was optional.
Seems like you didn't read anything in this thread. Everyone is fine with framegen and upscaling existing as options.
The problem is how it's marketed and how it will most likely (surely) become the norm in the future, where games will use all of this tech to reach their target framerate, like 60 fps, instead of trying to do any native optimization in the engine. At that point it won't be an option anymore; the base framerate will be reached using all of these techs.
 

Topher

Identifies as young
I really can't tell any difference in input latency when I use frame gen. I just notice the frames are a lot higher. At the same time, I can't say it makes a marked improvement on my gaming experience either. Perhaps if it was taking a game that was struggling in the 30s to get to 50+, but I haven't encountered that scenario yet.

Either way, I'm not seeing why I should feel "repulsed" by it at this point.
 
Last edited:

64gigabyteram

Reverse groomer.
At 30fps input lag is insanely high. Easily well over 150ms in many games.

At 60fps, it's far better, but you still got plenty of games with 100ms+ of input latency.
It did not feel that bad back when I was playing on my Xbox Series S
 

Danny Dudekisser

I paid good money for this Dynex!
I get people being okay with it if they're a casual gamer and just want their annual Madden and Call of Duty fix. But it's embarrassing that so-called enthusiasts frequently don't know or don't demand better. This, along with the usual scaling magic, will 100% make gaming worse as we move forward. The tech might get better, but relying on band-aids to this extent is not a good thing.
 

Buggy Loop

Member
Frame gen introduces 10-20ms of input lag. This is serious business!



Even AMD cards natively get more latency than Nvidia, without reflex / anti-lag, just native latency.

[Chart: fortnite-latency-4070-ti-perf.png]


A 4070, despite lower framerates than a 7900 XT, has 36.8ms less latency, or 22ms less against a 7900 XTX.


Or with Reflex / Anti-Lag on (the first versions, going by the article's date):

A 3070 Ti @ 4K at 109.2 fps has 17.1ms latency, versus a 6700 XT @ 1080p at 211 fps with 21.9ms.

So if the ~12ms frame gen adds, on top of the latency savings Reflex already provides over native, is unacceptable, then AMD is unacceptable, and consoles are VERY unacceptable.
 

a'la mode

Member
do they really????? Input lag can't be that high on console.

No, not really, although in some bad configurations it technically could be true. At 30 fps the frame time is 33 ms, and both Xbox and PS have sub-10 ms polling rates, so you'll get at least that, but there are so many additional variables that go into the final input lag.
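For a sense of where the time goes, here is a purely illustrative breakdown in Python (every stage number is made up, not a measurement; the real chain varies per game, controller, and display):

# Purely illustrative end-to-end latency chain at 30 fps.
# Every number here is an assumption, not a measurement.
stages_ms = {
    "controller polling": 8,
    "game simulation (one 30 fps frame)": 33,
    "render / GPU queue (roughly one more frame)": 33,
    "display processing and scanout": 20,
}
print(f"example total: {sum(stages_ms.values())} ms")  # ~94 ms in this made-up case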
 

Miyazaki’s Slave

Gold Member
Frame gen is a tool that allows developers to release games for the masses that look "close enough" to what they are working with in house. Realistically, 98% of the "player base" could not afford the hardware required to run the software at a performant level natively; the devs do not have time to spend on "platform optimization", since the expectation is that they ship on anything that has an internet connection; and no one (the players, the investors, or the industry) has patience any longer.

My hope is that the negative side effects of the tech will get better as it ages (great quote from earlier in the thread: "you have to spend enough time guessing into the future to render the past," or something close to that). Or they won't, and the majority of players will not care and will continue to demand higher and higher fidelity/immersion while expecting the buy-in cost to be negligible compared to previous generations of hardware.

Those damn clouds REALLY need to be yelled at today!
 
Last edited:

Fafalada

Fafracer forever
Frame gen introduces 10-20ms of input lag. This is serious business!
Frame gen lag is proportional to the source frame time (and latency). If you frame gen that 150ms game, it'll be a lot worse than 170ms.
Not to mention that latency always ends up at least N times worse than if you were natively hitting the target framerate (for N-times frame multiplication), even if frame gen itself added zero latency.
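A quick sketch of the first point, assuming the generator has to hold back one real frame to interpolate between two (how interpolation-based frame gen generally works); the function name is just for illustration:

# Interpolation-based frame gen buffers at least one real frame before it can
# place generated frames between it and the next one, so the minimum added
# delay is one source frame time.
def min_added_latency_ms(source_fps: float) -> float:
    return 1000.0 / source_fps  # one held-back source frame

print(round(min_added_latency_ms(70), 1))  # ~14.3 ms added on a 70 fps base
print(round(min_added_latency_ms(30), 1))  # ~33.3 ms added on a 30 fps base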
 

Buggy Loop

Member
No, not really, although in some bad configurations it technically could be true. At 30 fps the frame time is 33 ms, and both Xbox and PS have sub-10 ms polling rates, so you'll get at least that, but there are so many additional variables that go into the final input lag.

oh boy

This is exactly the kind of misinformation that makes these threads pop up.

You're calculating display latency, not the rendering pipeline end to end, from input to display.




Digital Foundry measured latency from input to frame movement on PS5, and with LDAT on PC.



They even used a low-latency mouse for Call of Duty: Cold War and Fortnite, since those support mouse on PS5. The same mouse was used on PC.
 

Hot5pur

Member
I don't mind stuff like framegen, but not as a selling point, and it has to be optional (which it is).
The way Nvidia is marketing their card is as though framegen is a replacement for real frames, and that's completely BS: 1) artifacts in many games, 2) latency.
They strategically work with CDPR to get a fancy game like Cyberpunk running with it as the best example of the tech, but in reality most other devs can't even get it to work right a lot of the time.
They do the same thing with raytracing, which hardly makes a difference in most games and even then only in certain scenarios. Again, Cyberpunk is intentionally chosen and collaborated on to show this - it's all marketing.
The only thing I'll give them props on is DLSS reconstruction. That shit works well and I usually turn it on, even though there are occasional artifacts, like hair or mesh-like structures with thin features looking wonky.

TL;DR
Don't fall for FG marketing, a lot of the time it doesn't work right
Don't fall for Raytracing marketing, more often than not it's hardly noticeable
DLSS image reconstruction is nice
 
Framegen gon' be lit af for pushing already high frames up to 1000Hz. I get to decide myself when to use it, just like DLSS. Don't really care what trickles down to the console ecosystems.
 

baphomet

Member
The only thing this should be used for is maxing high refresh rate monitors from an already high frame rate.

You want 1000hz monitors? This is how you get there.

Boosting an already low starting frame rate (under ~75fps) with it is total shit though. Sadly that's without a doubt what you'll be getting with next-gen consoles.
 
I've never experienced playing a game with frame generation. Does it really look like motion interpolation on a TV, with the soap opera effect?

It can't possibly look that horrendously bad, can it?
 
I've never experienced playing a game with frame generation. Does it really look like motion interpolation on a TV, with the soap opera effect?

It can't possibly look that horrendously bad, can it?
Of course it doesn't look like that. That shit was the most naive implementation. Do you know what bilinear interpolation is (for resolution scaling)? That's basically what those early smoothing features on HDTVs are, just applied across frames instead of pixels. Nvidia's frame gen, by contrast, is analogous to what DLSS scaling does.
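A crude way to picture the difference (a sketch only, with hypothetical function names, not anyone's actual implementation): TV-style smoothing just blends neighboring frames, which ghosts anything that moves, while motion-aware frame generation shifts pixels along estimated motion vectors.

import numpy as np

# TV-style "smoothing": a plain blend of two frames, which ghosts moving
# objects instead of tracking their motion.
def naive_blend(prev_frame, next_frame, t=0.5):
    return (1.0 - t) * prev_frame + t * next_frame

# Motion-compensated sketch: sample the previous frame against per-pixel motion
# (real frame gen uses optical-flow hardware plus an ML model on top of this idea).
def motion_warp(prev_frame, motion_vectors, t=0.5):
    # motion_vectors[..., 0] / [..., 1]: per-pixel x/y motion from prev to next frame, in pixels
    h, w = prev_frame.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    src_x = np.clip(np.round(xs - t * motion_vectors[..., 0]).astype(int), 0, w - 1)
    src_y = np.clip(np.round(ys - t * motion_vectors[..., 1]).astype(int), 0, h - 1)
    return prev_frame[src_y, src_x]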
 
Last edited: