
Should AI Frames be considered legitimate in FPS Performance Comparisons?

Should AI frame-generated frames be considered?

  • Yes - A frame is a frame man

    Votes: 55 17.0%
  • No - Fake frames should not be considered

    Votes: 206 63.6%
  • Other/Depends

    Votes: 24 7.4%
  • Both should be considered

    Votes: 39 12.0%

  • Total voters
    324

PaintTinJr

Member
Games are not a passive movie; they are an interactive experience in which motion feedback is a combination of the fluidity of motion on screen and the latency of that feedback.

DF, for better or worse, took major umbrage at the PS3 using triple buffering in the 360/PS3 gen, claiming that adding a second frame of latency to provide a locked 30fps was worse than having a frame less latency on the 360 while typically running between 24-30fps with torn frames, from the third instalment of AC onward.

So if adding one frame of latency at 30fps wasn't cool using native frames, why would adding more than five extra frames of latency for a fake +200fps be okay on a master race PC, to distract from the reality of native 30fps with a frame of latency when doing upscaled 4K with RT on an RTX 5090?

The only reason it is being sold as okay is that it is the only way to keep the tensor cores busy between the ~1ms of upscaling done on each of the native 30fps frames.
 
Last edited:

Seomel

Member
How is this even a question now? There have been benchmarks in different categories before: DLSS on/off, FG, etc. This is the same.
 

Brakum

Member
In comparisons? Sure, as long as both are using them; otherwise it's not a fair comparison. Comparing the 5080 with the 5090, both with the same features? Fine. Saying the 5070 = 4090? No.
 

SScorpio

Member
Nonsense. Frame gen frames are not generated by the game engine.

These frames are interpolations that occur between two real frames to create the appearance of a higher frame rate.
It will be interesting to see how this feels, since it can react to input, so character and camera movement could respond between "real" frames. But actions like firing a weapon or jumping would still need to be fully processed.
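A rough sketch of that idea (a hypothetical illustration, not NVIDIA's actual implementation, which uses motion vectors and optical flow rather than a naive blend): the generated frames are built purely from two frames the engine has already rendered, which is why input that arrives in between can't show up until the next real frame.

```python
# Hypothetical sketch of interpolation-style frame generation.
# Illustrative only: the real technique warps with motion vectors /
# optical flow, but the key property is the same - generated frames
# are derived from two already-rendered frames and never consult input.

def blend(prev_frame, next_frame, t):
    # Stand-in for the actual warp between two real frames.
    return {"from": prev_frame, "to": next_frame, "weight": t}

def generated_frames(prev_frame, next_frame, count):
    """Return `count` in-between frames derived purely from two real frames."""
    return [blend(prev_frame, next_frame, i / (count + 1))
            for i in range(1, count + 1)]

# 4x multi frame gen: three generated frames between each pair of real ones.
print(generated_frames("frame_N", "frame_N+1", count=3))
```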
 

Klik

Member
30 and I hate 30 fps. The judder at 60 fps FG from 30 is 🤢
Wtf what??

I have a shitty CPU with an RTX 3xxx series card, and with DLSS I can get around 65-70fps; without DLSS it's around 30-35. It's a huge difference for me, while looking almost the same.
 

nikolino840

Member
We have been doing comparisons for years where the native resolution is always listed along with the upscaling method used, so why would that change now? With frames it would be the same: what is the real performance, and what is it with AI.
With mixed feelings though... I remember threads mocking the Series S's low native res, but now I have seen people defending low native res on the PS5 Pro... I'm not into console wars... we just need a law or a guide about which one is more important 😀
 

Minsc

Gold Member

So Wukong doesn't feel any smoother at 240fps than at 29fps on a 5090 at 4k?

I'm pretty sure the game is literally rendering more frames even with frame gen, since DLSS reduces the render resolution as one of the tricks in its bag, does it not? So even without frame gen you're getting more frames; frame gen just pushes that from something like 60 to 240.
 
Last edited:
So Wukong doesn't feel any smoother at 240fps than at 29fps on a 5090 at 4k?

I'm pretty sure the game is literally rendering more frames even with frame gen, since DLSS reduces the render resolution as one of the tricks in its bag, does it not? So even without frame gen you're getting more frames; frame gen just pushes that from something like 60 to 240.
I don't blame you for falling for Nvidia's marketing. They are really good at fooling people. Go back and read the notes on the chart because Nvidia are comparing unlike things. That's why your argument makes no sense. It's not 29 fps vs "240 fps". That 29 fps is derived from native 4k. The "240" fps is derived from a combination of dlss performance, Nvidia reflex, plus multi frame gen.

The question you should be asking is: does BMW with DLSS Performance and Reflex feel smoother than DLSS Performance, Reflex and multi frame gen? The answer is yes, because DLSS plus Reflex doesn't have the additional latency penalty of frame gen.
 
Last edited:

Minsc

Gold Member
I don't blame you for falling for Nvidia's marketing. They are really good at fooling people. Go back and read the notes on the chart because Nvidia are comparing unlike things. That's why your argument makes no sense. It's not 29 fps vs "240 fps". That 29 fps is derived from native 4k. The "240" fps is derived from a combination of dlss performance, Nvidia reflex, plus multi frame gen.

The question you should be asking is: does BMW with DLSS Performance and Reflex feel smoother than DLSS Performance, Reflex and multi frame gen? The answer is yes, because DLSS plus Reflex doesn't have the additional latency penalty of frame gen.

Not falling for anything, I responded to someone saying the 240fps does not feel smoother than 29. That's inherently incorrect, because there are literally more frames in the 240fps option (even when you remove all the generated ones).

Or at least that's what I'm trying to say - maybe they agree with me there and are only talking about AI frames - but DLSS on vs DLSS off - DLSS on literally is smoother.
 
Last edited:

PaintTinJr

Member
Not falling for anything, I responded to someone saying the 240fps does not feel smoother than 29. That's inherently incorrect, because there are literally more frames in the 240fps option (even when you remove all the generated ones).

Or at least that's what I'm trying to say - maybe they agree with me there and are only talking about AI frames - but DLSS on vs DLSS off - DLSS on literally is smoother.
The latency at that 240fps is exactly one frame duration of 30fps (33ms), meaning it is equal in latency to an optimal 30fps double-buffered renderer without FG/MFG, and by extension equal in motion feedback.
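Back-of-the-envelope arithmetic behind that claim (my own numbers, assuming a ~30fps base render feeding the frame generator):

```python
# Assumed figures, just to show the arithmetic: MFG multiplies the
# displayed rate, but input is still only sampled once per *rendered*
# frame, so motion feedback stays tied to the base rate.
base_fps = 30
displayed_fps = base_fps * 8                 # ~240 fps shown on screen

print(1000 / base_fps)        # ~33.3 ms between real frames (the quoted latency)
print(1000 / displayed_fps)   # ~4.2 ms between displayed frames
# An optimal double-buffered 30 fps renderer also reacts to input roughly
# once every ~33 ms, which is the equivalence being drawn above.
```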
 
Last edited:

Minsc

Gold Member
The latency at that 240fps is exactly one frame duration of 30fps (33ms), meaning it is equal in latency to an optimal 30fps double-buffered renderer without FG/MFG, and by extension equal in motion feedback.

But that's not true, is it? Isn't part of DLSS reducing the render resolution to create more (real) frames? Unless you want to argue that at 1080p and 90fps the latency is the same as at 2160p and 30fps?
 
Last edited:

PaintTinJr

Member
But that's not true, is it? Isn't part of DLSS reducing the render resolution to create more (real) frames? Unless you want to argue that at 1080p and 90fps the latency is the same as at 2160p and 30fps?
Their own slide quotes the latency; more real frames or not, it counts for nought if you can't use that motion fidelity because of the lag.
 
Last edited:

Minsc

Gold Member
Their own slide quotes the latency; more real frames or not, it counts for nought if you can't use that motion fidelity because of the lag.

All I can find on their site is about Reflex 2:

Multiply performance by up to 8X using DLSS 4 with Multi Frame Generation, reduce PC latency by up to 75% with Reflex 2, and experience next-generation RTX Neural Rendering.

Does that not get factored in? I'm still having a hard time determining whether you're saying using DLSS to reduce the render resolution and actually generate real frames doesn't also improve latency?

But it sounds like Reflex 2 works in combination with DLSS 4 to reduce latency - so it shouldn't be the same?
 

Goalus

Member
Should hardware-based geometry transformation, rasterization, texturing and anti-aliasing be considered legitimate when playing a game?

Or should players be forced to turn it off and instead make use of a software-only implementation of the entire rendering pipeline like it was common in the early nineties?
 
Last edited:

cormack12

Gold Member
All I can find on their site is about Reflex 2:

Multiply performance by up to 8X using DLSS 4 with Multi Frame Generation, reduce PC latency by up to 75% with Reflex 2, and experience next-generation RTX Neural Rendering.

Does that not get factored in? I'm still having a hard time determining whether you're saying using DLSS to reduce the render resolution and actually generate real frames doesn't also improve latency?

But it sounds like Reflex 2 works in combination with DLSS 4 to reduce latency - so it shouldn't be the same?

Yeah, as I understand it, the Reflex engine works to reduce the latency introduced by frame generation. It's also not complete frame generation; they talk about moving pixels based on their predictive engine.

I still don't think it should be counted though.
 

Outlier

Member
If it reduces playability or visual consistency, then NO.
I won't waste any more money on these Software Driven GPUs.
I can't even stand using FSR. I can see how it makes games look worse.

Native performance is KING!

 
Last edited:

PaintTinJr

Member
All I can find on their site is about Reflex 2:

Multiply performance by up to 8X using DLSS 4 with Multi Frame Generation, reduce PC latency by up to 75% with Reflex 2, and experience next-generation RTX Neural Rendering.

Does that not get factored in? I'm still having a hard time determining whether you're saying using DLSS to reduce the render resolution and actually generate real frames doesn't also improve latency?

But it sounds like Reflex 2 works in combination with DLSS 4 to reduce latency - so it shouldn't be the same?
It was a black slide in one of the other threads, not sure if it was possibly a screen grab from their official video, but it definitely quoted the latency of MFG at 33ms for that 240fps, when just by doing the calculation the latency should be 8x smaller at ~4ms, meaning you are eating 7 more frames of latency per motion feedback update than you should.

edit:

It is in the DLSS 4 Cyberpunk video, where you see the PC latency figure listed at 36ms with 245fps, 33ms for the one to the left, etc.
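Working through those figures (the ~240fps and 33ms are as described above; the arithmetic is mine):

```python
# If all the displayed frames were real, latency could in principle approach
# one display-frame time; the slide instead quotes roughly a 30fps frame time.
displayed_fps = 240
quoted_latency_ms = 33.0

display_frame_time_ms = 1000 / displayed_fps            # ~4.2 ms per shown frame
extra_frames = quoted_latency_ms / display_frame_time_ms - 1

print(f"one displayed frame lasts ~{display_frame_time_ms:.1f} ms")
print(f"~{extra_frames:.0f} extra displayed frames pass before your input is reflected")
```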
 
Last edited:

MayauMiao

Member
If fake frames can make my game look silky smooth I don't even care. I play games for fun, not to count how many frames I can get.
 
For performance measurements? No

When playing, it does make games look smoother, so it's better than old-school motion blur (30 fps games).

If your frame rate is already 120+ fps, generating these extra frames will not cause much trouble with input lag and may feel smoother (I don't have access to a 240+ Hz monitor to try it).

I'll give an example: I enabled DLSS and frame generation on ... It was pretty smooth (100+ fps), but some bosses require tight timing and I could still feel the lag. So I lowered the details (disabled ray tracing), disabled frame gen and carried on... It's just more enjoyable in the end.
 
Last edited:
All I can find on their site is about Reflex 2:

Multiply performance by up to 8X using DLSS 4 with Multi Frame Generation, reduce PC latency by up to 75% with Reflex 2, and experience next-generation RTX Neural Rendering.

Does that not get factored in? I'm still having a hard time determining whether you're saying using DLSS to reduce the render resolution and actually generate real frames doesn't also improve latency?

But it sounds like Reflex 2 works in combination with DLSS 4 to reduce latency - so it shouldn't be the same?

DLSS (rendering the game at a lower resolution and then upscaling it with an algorithm) usually improves latency.

This is because DLSS has overhead of its own (increases latency) but it’s offset by the increase in framerate (which lowers latency).

You can compare this by looking at latency at 1080p native, vs 4K DLSS Performance (which is 1080p upscaled to 4K). 1080p native without any DLSS will have lower latency.

Frame gen is a completely different thing. Frame gen is about interpolating in between real frames. It is not frames generated by the game’s engine.

If you can generate 29 fps to 240 fps using frame gen alone… it might look “smooth” but you’re going to be asking why it plays like crap because it will feel incredibly clunky.
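A crude model of the difference being described (all numbers made up purely for illustration; `latency_ms` is a hypothetical estimate, not a real measurement):

```python
# Toy latency model with made-up numbers, only to show the shape of the
# argument: DLSS raises the *real* frame rate, so latency drops; frame gen
# raises only the *displayed* rate, and typically adds a little latency
# because a real frame is held back to interpolate against.

def latency_ms(real_fps, held_frames=0.0):
    """Very rough input-to-photon estimate: a couple of real frame times,
    plus any real frames held back for interpolation."""
    frame_time = 1000 / real_fps
    return (2 + held_frames) * frame_time

print(f"native 4K (~29 real fps):        ~{latency_ms(29):.0f} ms")
print(f"frame gen alone (~29 real fps):  ~{latency_ms(29, held_frames=1):.0f} ms")
print(f"DLSS Performance (~60 real fps): ~{latency_ms(60):.0f} ms")
print(f"DLSS + frame gen (~60 real fps): ~{latency_ms(60, held_frames=1):.0f} ms")
```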

There is no free lunch.

I will never play games using it, total trash.
 
Last edited:

Kataploom

Gold Member
Wtf what??

I have a shitty CPU with an RTX 3xxx series card, and with DLSS I can get around 65-70fps; without DLSS it's around 30-35. It's a huge difference for me, while looking almost the same.
Do you not get judder when moving the camera? Haven't used Nvidia FG so can't talk about it, just about my experience with AFMF and LSFG.
 