
The lack of repulsion for this "frame gen" thing is truly concerning

Durin

Member
Stopped reading there

No, that won’t happen

I don't think it will happen either. For any game running under 60fps, native or upscaled, the latency hit will make it feel like garbage to play. Frame gen is only useful when you're targeting 80+fps in games where the latency hit doesn't matter, and the competitive games where it does matter can hit higher framerates anyway because they're optimized to be lightweight on a system.
 

RaySoft

Member
[image: cerny.jpg]

Most bang for the buck is in ML now.
We've had "faking" in graphics since the start: baked lighting, mip/bump mapping, anti-aliasing, screen-space reflections, etc, etc, etc. Frame gen is just keeping up the trend.
 

artsi

Member
I always turn on DLSS even if I can run native, because the image just looks so good.

I'm sure FG will be better over time just like the scaling.
 

Holammer

Member
At 30fps input lag is insanely high. Easily well over 150ms in many games.

At 60fps it's far better, but there are still plenty of games with 100ms+ of input latency.
Most fighting games, which are usually better than average, still have 4-6 frames of input lag (66.4 to 99.6 ms).
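That's just frames of lag times the frame time. Rough sketch of the arithmetic, assuming a ~16.6 ms frame at 60 Hz (Python):

# Rough frames-to-milliseconds arithmetic, assuming a 60 Hz update (~16.6 ms per frame).
FRAME_TIME_MS = 16.6

for frames_of_lag in (4, 5, 6):
    print(f"{frames_of_lag} frames of input lag ≈ {frames_of_lag * FRAME_TIME_MS:.1f} ms")
# 4 frames ≈ 66.4 ms, 6 frames ≈ 99.6 ms -- the range quoted above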
 

ReBurn

Gold Member
In this digital industrial revolution we're hitting the limits of the affordable frame capacity that raw power can provide, just like we reached the limits of the affordable capacity of human strength in manufacturing. AI upscaling and frame gen are the "work smarter, not harder" of resolution and framerate for graphics technology. Might as well get used to it, because it's where GPUs are going to be focused for the foreseeable future. Outrage and repulsion are irrelevant.
 

rm082e

Member
Stopped reading there

No, that won’t happen

Nvidia's own marketing is already encouraging it:

[Nvidia marketing slide comparing framerates with DLSS + frame generation enabled]


At no point did they ever say you should only use it to boost games that are 60fps or higher. They want as many gamers as possible to use upscaling and frame generation in tandem. They want people hooked on their products.
 

TintoConCasera

I bought a sex doll, but I keep it inflated 100% of the time and use it like a regular wife
Honestly I don't get the hate. I used it in Dogma 2 to go from 70 or so frames to a constant 120 and that shit felt fantastic to me. Could swear I used it in Wukong as well, and the game felt great to me and even a bit too easy lol.

Nvidia's own marketing is already encouraging it:

[Nvidia marketing slide comparing framerates with DLSS + frame generation enabled]


At no point did they ever say you should only use it to boost games that are 60fps or higher. They want as many gamers as possible to use upscaling and frame generation in tandem. They want people hooked on their products.
Should be pointed out that that bump is from DLSS + frame gen, not just frame gen.
 

Buggy Loop

Member
I don't think it will happen either. For any game running under 60fps, native or upscaled, the latency hit will make it feel like garbage to play. Frame gen is only useful when you're targeting 80+fps in games where the latency hit doesn't matter, and the competitive games where it does matter can hit higher framerates anyway because they're optimized to be lightweight on a system.

The only way this would work is if a technology like Reflex 2 can hijack the frame rendering queue at any time to keep input latency low. Say that while they're making the generated frame you provide an input; Reflex 2 hijacks the frame, warps it to the new input, and then fills in the rest with AI for low latency.

They kind of have the pieces of the puzzle right now, but there's no indication as of now that they can combine both; nothing in the Reflex 2 presentation indicates it'll combine with frame gen. But who knows in the future. I think adding AI compute time to a 30 fps game would be too ridiculous anyway; better to spend that on upscaling, which will already bring you to a good framerate.
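Roughly the idea, as a purely speculative sketch in Python (every function here is a made-up stub, not any real Nvidia API; nothing public says Reflex 2 and frame gen combine like this):

# Speculative only: warp a generated frame with input that arrived after the
# real frames were rendered, so the presented image tracks the newest camera pose.

def render_real_frame(camera):
    # Stub for the game's normal renderer.
    return {"image": "rendered", "camera": camera}

def generate_in_between(prev_frame, next_frame):
    # Stub for frame generation: an AI frame between two real frames.
    return {"image": "generated", "camera": next_frame["camera"]}

def late_warp(frame, latest_camera):
    # Stub for a Reflex-2-style late reprojection: shift the image toward the
    # newest camera pose and let AI inpaint the uncovered edges.
    return {**frame, "camera": latest_camera, "warped": True}

# One iteration of the imagined loop:
prev_real = render_real_frame(camera=(0.0, 0.0))
next_real = render_real_frame(camera=(1.0, 0.0))
generated = generate_in_between(prev_real, next_real)

latest_camera = (1.2, 0.1)  # input sampled just before the generated frame is shown
print(late_warp(generated, latest_camera))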

Nvidia's own marketing is already encouraging it:

[Nvidia marketing slide comparing framerates with DLSS + frame generation enabled]


At no point did they ever say you should only use it to boost games that are 60fps or higher. They want as many gamers as possible to use upscaling and frame generation in tandem. They want people hooked on their products.

It doesn't start at 27 fps.

It has upscaling, which will bring it to >40-60fps, and THEN frame gen is a possibility. Even Nvidia's documentation says not to bother until you hit 40 fps, and even that is a low bar for entry. AMD recommends 60 fps.
 

hinch7

Member
Not sure what the drama is. You need a solid base framerate for frame gen to be a good experience (preferably 50 plus, with 60 and above being optimal). It doesn't increase performance; it's a frame-smoothing technology that builds on an already solid framerate. It benefits those with high refresh rate monitors. It's not magically going to turn 30fps into 60/120 and be a playable experience. That's going to be a laggy mess with a lot of artifacts.

With how DLSS scales these days with the transformer model its amazing when it all comes together working with all bells and whistles and playing on a HFR OLED display.

I'm going to bet the next generation of consoles is going to push these technologies (AMD/Sony's equivalent of FG/MFG) to achieve full RT, just like the RTX stuff we have today.
 

Newari

Member
Most of the people complaining that frame gen adds latency didn't even know what latency was before they heard about frame gen. Right now Cyberpunk with 4x frame gen has lower latency than it had at launch without frame gen and I don't remember people complaining that it is unplayable because of input latency.
And personal experience: back in my poor days, I played on a 60Hz fixed-refresh screen with V-sync enabled (tearing is the worst visual artefact ever) and the framerate regularly dropping below 60. Compared to that, Cyberpunk with frame gen at 65-70 FPS (so barely above a 30 fps source) felt good on a 120Hz VRR screen.
 
I don't notice the artifacts, I don't care about input lag because I don't play Counter-Strike, nor the remaining 2 games where it matters. I do however care about framerate.

??????????????

A lot of gaming illiteracy around here these days. Reminds me of how supposed "enthusiasts" also think gaming on the cloud is great. My casual gamer friends seem to care more about gameplay these days.
 

Soodanim

Member
As ever, the answer is more nuanced than the binary statements people easily fall into making.

If you're jumping up from a solid base of 60, it's free frames unless you're on M&KB where you're going to feel everything. Base 30 at least gives you visual smoothness but is far from ideal.

The problem isn't frame gen, the problem is how it's currently being used. It's supposed to be a supplement that enhances what you already have. But devs are trying to min-max and lean on it, which only ever leads to a worse experience. It doesn't help that Nvidia are using it to flat-out lie about 50-series performance. It's like trying to live on multivitamin tablets instead of having a healthy diet.

If it hasn't happened already I wouldn't be surprised to see games be a locked 20 with framegen up to 60. Someone's daft enough to try it.
 

Kataploom

Gold Member
Seriously guys? You might as well turn on motion interpolation on your chitty low-end LCD. I find it absolutely incredible the enthusiast community isn't appalled by this "tech". It's not only fake as hell, but it introduces input lag and visual artifacts all over the place, as expected. I can tolerate these guys toying around with res upscaling 720p to 16K, but you DON'T F*CK WITH MOTION.

Can't you see the ramifications of this? PS6 will likely have it. My god. Sony will be publishing games internally running at 20 FPS and "upscaled" to make it seem 100. They will market and parade the fake frame rate numbers all over the place, and the casuals won't even care. Game optimization will be officially dead, now also on consoles.

I find it absolutely hilarious Nvidia even mentions this thing when marketing their "super high end, astronaut level" card. It shouldn't even be in the same sentence as this bullshit. But I guess BEING REAL is too hard nowadays.
How not? Have you been on the internet this past month? Me and others have been blasting Nvidia for trying to fool us by selling it as "performance", which it isn't, because it's an animation smoother, not a performance tool. It's like someone saying they're a sex beast just because they enjoy touching themselves at night; one thing is not like the other, and it's been stated to hell and back already.
 

GreatnessRD

Member
Death to fake frames!

It was cool when it was helping older hardware keep up, but now it's just getting way too stupid. And it's crazy people are going for this nonsense. But I don't expect anything less when folks are gonna fight each other to buy $3,000 5090s tomorrow for what's like a 15% uplift, lol
 

Sylonious

Member
What I have experienced so far is not quite good enough. Yes, when going from 70 to 120 or so fps it does look less choppy, but it also adds a slight blur that distinguishes the experience from a real high framerate gaming experience.

Supposedly Nvidia's transformer-based DLSS 4 solves this problem when using frame gen. I'm hoping to see someone eventually do a video comparing CNN vs. transformer frame gen (both with upscaling and at native resolution).
 

Knightime_X

Member
This would be a HUGE concern if I were a superhero who could actually react to such extremely small time differences.
It could be 5x larger and 99.999% of people wouldn't be able to react to it on the luckiest day of their lives.

The "optimization is dead" meme has to stop, it's so silly.
One day you'll be laughing at 4x frame gen games while playing your 256x FG titles.
It'll be all right!
 

Rudius

Member
Of course it doesn't look like that. That shit was the most naive implementation. Do you know what bilinear interpolation is (for resolution scaling)? That's basically what those early smoothing features on HDTVs are. Whereas Nvidia's FrameGen is analogous to what DLSS scaling does.
Regular DLSS will give you better performance and lower latency. Frame Gen is only a visual improvement.
 

buenoblue

Member
If used right, frame gen is great. I try to have a minimum of 65 but prefer a 75-80 base FPS. Frame-genning that up to 90-120 works great for me.

Remember your real FPS is half of the frame gen FPS. So if your frame gen frame rate is 70fps then your real frame rate is 35. That is too low and will feel bad.
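If it helps, the arithmetic for 2x frame gen (Python):

# 2x frame gen: the real (rendered) framerate is half the displayed one,
# and input latency still tracks the real frame time.
def real_fps(displayed_fps, gen_factor=2):
    return displayed_fps / gen_factor

for displayed in (70, 90, 120):
    base = real_fps(displayed)
    print(f"{displayed} fps displayed -> {base:.0f} fps real (~{1000 / base:.1f} ms per real frame)")
# 70 fps displayed -> 35 fps real (~28.6 ms per real frame), which is why it feels bad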
 

Moochi

Member
Raster is going to be prioritized less and less. Inference is the new hotness. Expect AI architecture to take over more of the PCB with each generation. The good news is, when games are developed with distilled models, you will have handhelds running at 4k 120 fps just sipping on the battery.
 
Nvidia's own marketing is already encouraging it:

[Nvidia marketing slide comparing framerates with DLSS + frame generation enabled]


At no point did they ever say you should only use it to boost games that are 60fps or higher. They want as many gamers as possible to use upscaling and frame generation in tandem. They want people hooked on their products.
I don't think they ever came out and said that but I could be wrong.

BUT it's also true. Frame gen feels like shit unless the original FPS is high already. A 30fps game turning into 90 with FG feels worse than a game simply running at 60, IMO.
 

Rudius

Member
Poor reading comprehension? Don't understand the analogy?
TV interpolation vs. frame gen is not analogous to bilinear upscaling vs. DLSS. DLSS can improve how a game feels to play, with higher real performance while looking the same (sometimes better); regular upscalers do nothing for performance. Frame gen, however, from the user's perspective doesn't do anything that TV interpolation doesn't do: it just puts new frames between regular frames at some latency cost, only with much better quality and a smaller latency penalty.
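Where that latency cost comes from, roughly: an interpolated frame can only be built once the next real frame exists, so the newest real frame is held back. A simplified timing sketch, assuming 2x interpolation on a 60 fps base and ignoring the cost of generation itself (Python):

# Simplified model: with 2x interpolation the generated frame occupies the first
# half of the interval, so the newest real frame reaches the screen roughly half
# a real frame later than it otherwise would.
base_fps = 60
real_frame_ms = 1000 / base_fps        # ~16.7 ms between real frames

delay_without_fg_ms = 0.0              # a real frame is shown as soon as it's done
delay_with_fg_ms = real_frame_ms / 2   # plus whatever the generation itself costs

print(f"extra presentation delay with 2x FG: ~{delay_with_fg_ms:.1f} ms")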
 

Zacfoldor

Member
Whenever I "fix" an lame's TV they always have frame gen on smooth and all their movies have the soap opera effect. It's the sign of a lame.

In games it's the same. I can smell it like "dried" piss in a couch.

Some people can't tell. Once you get IQ-woke there ain't no going back.
 

mcjmetroid

Member
I get input lag and major screen tearing when I use it.
It's shite and I agree with the OP.
I don't see how any gamer doesn't notice this.
 

YeulEmeralda

Linux User
Most of the people complaining that frame gen adds latency didn't even know what latency was before they heard about frame gen. Right now Cyberpunk with 4x frame gen has lower latency than it had at launch without frame gen and I don't remember people complaining that it is unplayable because of input latency.
And personal experience: back in my poor days, I played on a 60Hz fixed-refresh screen with V-sync enabled (tearing is the worst visual artefact ever) and the framerate regularly dropping below 60. Compared to that, Cyberpunk with frame gen at 65-70 FPS (so barely above a 30 fps source) felt good on a 120Hz VRR screen.
Everyone is pretending that they play Street Fighter competitively.

In a single player cinematic game who the fuck cares?
 
TV interpolation vs. frame gen is not analogous to bilinear upscaling vs. DLSS. DLSS can improve how a game feels to play, with higher real performance while looking the same (sometimes better); regular upscalers do nothing for performance. Frame gen, however, from the user's perspective doesn't do anything that TV interpolation doesn't do: it just puts new frames between regular frames at some latency cost, only with much better quality and a smaller latency penalty.

Comparable in certain respects, i.e. an analogy doesn't mean things need to be comparable in all respects.

Keep in mind the initial question is about how it looks, not how it performs. As existensmaximum presumably already knows, his TV's built-in motion modes aren't suitable for gaming. Of course, if he were asking that, then obviously FrameGen is much better latency-wise than using his TV to generate frames. Not to mention that the apt comparison for performance isn't native-render final-res output vs. DLSS final-res output; it would be a low-res render bilinearly scaled to final output versus a low-res render DLSS-scaled to final output. In which case obviously the bilinear resize will outperform DLSS, it'll just look like shit. So you're also wrong on the (irrelevant to begin with) points you were trying to make in the first place.

DLSS = 'fake but convincing resolution', FrameGen = 'fake but convincing framerate'

Bilinear resize = 'blurry unconvincing fake resolution', Soap Opera Mode = 'blurry unconvincing fake frames'

I hope that clarifies the nature of the analogy. Thank you for coming to my TED Talk!
 

Rudius

Member

Comparable in certain respects, i.e. an analogy doesn't mean things need to be comparable in all respects.

Keep in mind the initial question is about how it looks, not how it performs. As existensmaximum presumably already knows, his TV's built-in motion modes aren't suitable for gaming. Of course, if he were asking that, then obviously FrameGen is much better latency-wise than using his TV to generate frames. Not to mention that the apt comparison for performance isn't native-render final-res output vs. DLSS final-res output; it would be a low-res render bilinearly scaled to final output versus a low-res render DLSS-scaled to final output. In which case obviously the bilinear resize will outperform DLSS, it'll just look like shit. So you're also wrong on the (irrelevant to begin with) points you were trying to make in the first place.

DLSS = 'fake but convincing resolution', FrameGen = 'fake but convincing framerate'

Bilinear resize = 'blurry unconvincing fake resolution', Soap Opera Mode = 'blurry unconvincing fake frames'

I hope that clarifies the nature of the analogy. Thank you for coming to my TED Talk!
DLSS's fake resolution gives you real performance, while frame gen's fake frames give you fake performance. And bilinear resizing was never fake resolution; it never added "fake detail", even of the worst possible quality, while TV interpolation already added fake fluidity, even if done poorly.

Perhaps in the future frame gen will provide a type of "fake performance" with the evolution of that warping NVidia showed. Then it would be something TV interpolation never did, even poorly.
 

Rudius

Member
DLSS is a killer technique (it's killing AMD). NVidia is trying to convince people frame gen is comparable to it in some form, but it's not: it's far less usable, never better than "native", purely visual, and it decreases performance. It's not even close.

There is a reason Sony is trying to copy DLSS on the Pro but hasn't even bothered with frame gen.
 

A2una1

Member
You know what I find repulsing?

The amount of people who clearly haven't tried it. It works shockingly well; I don't notice any latency (because it's not the same as frame interpolation).


Just, like... take a deep breath and try the scary new technology before you develop such an emotional reaction to it.

I mean, jeez, I've seen tests showing frame gen on 40-series cards still hitting lower latency than natively rendered console games.

Chill out man. It’s good. Works great.
Except when it doesn't. It works great in Cyberpunk, but in Indiana Jones I needed to deactivate it since it felt too "smeary" for my liking. Granted, I "only" use a 4090 with a 4K OLED, so no MFG. But the whole frame generation debate will always come down to personal impressions on a game-by-game basis. The downside is that no review can tell you how playing a game with that specific card will feel.
 