
Are "generational" leaps becoming a thing of the past? AI upscaling and frame generation are the future

Gamer79

Predicts the worst decade for Sony starting 2022
Growing up, each game generation came with massive leaps. Going from the NES to the SNES was massive. Going from the SNES to the N64 was massive. Going from the N64 to the GameCube was massive. Going from the GameCube to the Xbox 360 was massive. Going from the Xbox 360 to the PS4 was pretty big, but not so massive. Going from the PS4 to the PS5 is a big hop versus a giant leap.

The PC side shows the same trend: going from the GTX 1000 series to the RTX 2000 series was a massive leap. The RTX 2000 series to the 3000 series was a big, big jump. Going from the 3000 series to the 4000 series was a big skip, and now the 4000 series to the 5000 series is kind of a hobble with a crutch to make it look better than it is.

Looks like Moore's Law (look it up if you don't know what that is) is dead. It seems like the days of just adding more horsepower to outdo the previous generation are over, especially when it comes to GPUs. If they keep getting bigger, they are going to require a server tower to house them and a power supply that comes with its own power station to run them. Also, the MSRP on the Big Dawg of GPUs is getting insane. $2000 MSRP?
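Moore's law is usually quoted as transistor density doubling roughly every two years. A quick back-of-envelope sketch of what that rule of thumb would predict over a console generation (the two-year doubling period and seven-year generation length are illustrative assumptions, not measured figures):

```python
# Back-of-envelope Moore's law: density doubles every ~2 years (rule of thumb).
def moores_law_multiplier(years, doubling_period=2.0):
    """Predicted transistor-density multiplier after `years`."""
    return 2 ** (years / doubling_period)

# If the law still held, a 7-year console generation would bring ~11x density:
print(round(moores_law_multiplier(7), 1))  # -> 11.3
```

Recent real-world jumps land nowhere near that predicted multiplier, which is the point being made above.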

Looks like the future is going to be in AI upscaling and frame generation. There is only so much power you can pack into a small space, so if we are going to advance, it has to be another way.
 
The problem is budgets and development time.

You want open worlds featuring perfectly photorealistic rocks and characters with ray-traced individual nostril hairs? You gotta pay for it.

I am wondering what a PS6 could offer to get me to upgrade?

8K? No thanks

120fps? No thanks

Ray/path tracing easily attainable in every game, so that devs can finally abandon time-wasting baked lighting? Yes, OK
 

Big Baller

Al Pachinko, Konami President
arnold schwarzenegger smile GIF
 

HogIsland

Member
Individual games contain 1-2 generational leaps within their code. Playing Cyberpunk on a Steam Deck vs. the highest-end PC is at least 2 leaps. You get the generation you want / can afford.

I don't think it's right to compare GPU series that come out every 2 years on average with console generations that are 7 years. If you only upgrade your PC every 7 years, you will appreciate a big leap every time.
 
Call me loony, but I think something happened when they introduced ray tracing. Up until that point, better hardware went hand in hand with huge leaps. Then Nvidia came around the corner with this, let's call it, effect that demanded an INSANE amount of hardware for no obvious reason, and ever since, hardware demands have gone through the roof without the games even looking that much better... and every time you raise an eyebrow it's like... but... but... but... RAY TRACING!!!!

Biggest fluke in gaming technology! There, I said it... Come at me!
 

Hookshot

Member
Let's wait and see what the Switch 2 can do. If AI is a magic fix for that, then I guess so, and a whole heap of other companies will start making cheaper handhelds that can run the newer games, and generations will go away until some other technological leap happens.
 

Gambit2483

Member
We hit the bar for visually diminishing returns about a generation ago.

Can/will games look better? Yes.

Are there going to be any more giant visual leaps like we experienced with 2D->3D or 3D->HD? No.

And yes, A.I. upscaling, a.k.a. machine learning, seems to be the industry's next visual (and performance) technical innovation/gimmick.
 

Parazels

Member
What do we realistically expect from PS6?
The current gen games at higher resolution, more stable frame rate and high settings, that's it.
 

Robb

Gold Member
I feel that kind of died alongside the introduction of “Pro” consoles that bridge the gap and Nintendo just going off the rails completely.

PS3 to PS4 is, and will probably remain, the last one I’d consider major.
 

Fafalada

Fafracer forever
What do we realistically expect from PS6?
The current gen games at higher resolution, more stable frame rate and high settings, that's it.
We're still stuck on 25 year old interaction model/simulation etc.
While it might not be the type of leap that would jump at you from the first frame of a (gameplay)trailer like PS1->PS2 was, that's where the possibilities for biggest gains exist.

Basically, moving away from 95% static worlds; characters that look like CG puppets on strings every time you leave a cutscene; interactions where most character parts phase through everything (and each other); and physics simulation almost entirely restricted to destruction and particle showers.

To an extent, the push for AI compute can actually help enable a lot of that (instead of perpetually obsessing over improving pixels as the only practical use case). The caveat being that someone (preferably a studio with enough pedigree/fame to be noticed) needs to test the limits of what's possible and garner a positive market response first, before the rest of the industry will follow.
 

Lokaum D+

Member
lol, fuck these "generational leaps", I remember when leaps were made within the same generation.

FF7 > FF8
Uncharted > Uncharted 2 > 3 > The last of us
Dead Space 1 > 2 > 3
Bioshock 1 > Infinite
Farcry 1 > 2 > 3
Gow 1 > 2
Underground 1 > 2

Every E3 was something else, seeing how developers would make games better using the same hardware.
 

peish

Member
Yes, that is why Jensen, the ever-visionary, is moving into AI, fake frames, transformer training, etc.

Look at this: it's either/or, performance gains or lower power consumption, 15-30% if you are lucky.

There is no more headroom like the last big jump from Samsung 8nm to TSMC 4nm with the RTX 4000 series.

 

NeoIkaruGAF

Gold Member
Kinda.
A generational leap meant you’d get games that were completely impossible to make / run on machines from the previous generation.
You could have Sonic the Hedgehog on the Sega Master System, yeah, but it wasn’t Genesis Sonic the Hedgehog.
Today, The Witcher 3 or Doom are the same game on a cutting-edge PC and on a Nintendo Switch. Sure, there can be heavy compromises involved. But the game is the same. The same goes for Cyberpunk 2077 on PC vs eighth-gen consoles.
 

PeteBull

Member
The simple explanation is: tech advances slowed down, and not by a little bit. PS1 to PS2 was a crazy jump, around 100x the performance in 6 years. PS4 to PS5 is around 3x in CPU and around 6x in GPU, and that took 7 full years; of course there's more RAM and an SSD on top, but the actual raw performance jump is relatively tiny. In the same way, PS5 to PS6 will be even smaller, likely only around 4x in GPU performance, so we can be sure the 5090 we can get in 2-3 days is already stronger than a PS6 launching holidays 2028...
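One way to make those figures comparable, since the gaps differ in length, is the implied per-year multiplier (total gain raised to the power 1/years). A quick sketch taking the post's rough numbers at face value:

```python
# Implied annual performance growth factor from the post's rough figures.
def annual_multiplier(total_gain, years):
    """Per-year factor g such that g ** years == total_gain."""
    return total_gain ** (1 / years)

ps1_to_ps2 = annual_multiplier(100, 6)     # ~2.15x per year
ps4_to_ps5_gpu = annual_multiplier(6, 7)   # ~1.29x per year
print(round(ps1_to_ps2, 2), round(ps4_to_ps5_gpu, 2))
```

So the PS1->PS2 era compounded at better than 2x per year, versus under 1.3x per year for PS4->PS5 GPUs, by these admittedly rough numbers.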
 
It is still possible... the problem is that getting a generational leap would be too expensive. Not to mention that we are hitting realistic power consumption limits and high development costs.
 

Knightime_X

Member
The problem is people think today's games are "unoptimized".
The actual truth: things like extreme resolutions (4K+), incredibly large, high-quality textures, anti-aliasing (especially MSAA), and ray/path tracing are comically demanding.
Then there's a plethora of smaller details one might not even notice (extremely high shadow resolutions, draw distance, mesh quality, and so on) that really add up.

One can only optimize so much before having to cope with the fact that the quality and performance that kind of gamer expects from modern games at ultra settings is getting DANGEROUSLY close to render-farm territory.

It's hilarious that certain gamers are expecting Toy Story 3 / Shrek levels of graphics from a single card,
and have the audacity to claim something is "unoptimized" knowing full well their expectations are unrealistic.

AI technology is here to help achieve the otherwise unachievable.

Even the most optimized games on the planet will eventually shit the bed in performance if you increase the resolution high enough.
 

HogIsland

Member
The real drop-off is creativity. You can have all your remasters with upscaled AI DLSS 10 bullshit, but no one is really making anything fresh these days.

And fresh doesn't directly translate to high compute requirements. Given the evolution of game design we have now, there probably could've been a decent Dark Souls-like game on the N64.
 

amigastar

Gold Member
lol, fuck these "generational leaps", I remember when leaps were made within the same generation.

FF7 > FF8
Uncharted > Uncharted 2 > 3 > The last of us
Dead Space 1 > 2 > 3
Bioshock 1 > Infinite
Farcry 1 > 2 > 3
Gow 1 > 2
Underground 1 > 2

Every E3 was something else, seeing how developers would make games better using the same hardware.
GTA IV to V was also a substantial improvement in graphics.
 

Fbh

Member
The jump to 60fps being pretty much standard on consoles felt like a pretty big leap to me. The few games that manage to present a decent graphical bump over the PS4 while also delivering 60fps (like Forbidden West, Demon's Souls Remake, Cyberpunk, and a few more) definitely give me that next-gen vibe.
Gameplay-wise, there hasn't been anything that felt like a generational leap since like the PS3, to be honest.

The death of optimisation.

Why bother when you can just use framegen and AI upscaling?

Frankly these tools should not be used as a crutch.

This is true too.
My expectation for upscaling was "hey, we can render the game at 1440p and then upscale it to a pretty passable rendition of 4K".
Instead it's "hey, we can render the game at 680p, upscale it to some blurry-as-fuck image that we can technically call 1440p in our marketing material, and it'll start smearing all over the place when anything moves".
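The pixel math behind that complaint is easy to check. A quick sketch assuming 16:9 resolutions (the 680p width is rounded, since 680 x 16/9 isn't a whole number):

```python
# Pixel counts for 16:9 resolutions, and how many pixels an upscaler must invent.
def pixels(height, aspect=16 / 9):
    """Total pixel count for a 16:9 frame of the given height."""
    return round(height * aspect) * height

print(round(pixels(2160) / pixels(1440), 2))  # 1440p -> 4K: 2.25x
print(round(pixels(1440) / pixels(680), 2))   # 680p -> 1440p: ~4.48x
```

So the 680p-to-1440p case asks the upscaler to conjure roughly twice as many output pixels per rendered pixel as the 1440p-to-4K case, which goes some way toward explaining the smearing.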
 

Edder1

Gold Member
It's everything to do with budget, time, and skilled devs. The increase in dev costs and the time games take has led publishers to force longer cross-gen periods, or to release games only on current gen but with the same tech as last gen. If AI doesn't speed up dev time, then we could be in for incremental jumps in visuals for a while, until ray tracing/path tracing becomes standard, because it's one tech that can reduce workloads by a considerable amount.
 
I am wondering what a PS6 could offer to get me to upgrade?
We're currently on an architecture built around AMD CPUs and GPUs designed for multipurpose computing.
PS6 could offer a return to bespoke Sony HW designed only to play video games.
GTA IV to V was also a substantial improvement in graphics.
Both games lost the Rockstar aesthetic that for whatever reason lived and died with the PS2 HW.
PS3 to PS4 is, and will probably remain, the last one I’d consider major.
PS5's DualSense is the biggest controller leap since the DualShock 2.
 

Robb

Gold Member
PS5's DualSense is the biggest controller leap since the DualShock 2.
Could be, I guess, but that's not what I take into consideration when talking about generational leaps, personally. The Wii was basically just a GC with motion controls to me. I enjoyed it a ton, but I'd hardly call the leap between the two "generational", for example.

If you showed me Twilight Princess running on both systems, I'd probably not be able to tell which is which.
 
The problem is budgets and development time.

You want open worlds featuring perfectly photorealistic rocks and characters with ray-traced individual nostril hairs? You gotta pay for it.

I am wondering what a PS6 could offer to get me to upgrade?

8K? No thanks

120fps? No thanks

Ray/path tracing easily attainable in every game, so that devs can finally abandon time-wasting baked lighting? Yes, OK
Pretty much took exactly what I was gonna say, word for word. We've reached the point where tangible technological leaps are going to require even more time and money than is already spent on these games, and development times plus the price of games have become a point of contention. Granted, I suspect the major companies 'could' tighten up their budgets a tad, but probably not by much. Personally, I'd rather see consistent resolutions/framerates and better A.I. take a leap forward.
 
We are getting pretty close to the end of raw performance gains. Every few years a new CPU and GPU come out that require a ridiculous amount of additional power and only give us marginal gains. AI upscaling/frame gen will be the future, like it or not.

I doubt we will ever see large generational leaps again. Games are very close to looking realistic, and the human eye can only perceive so much.
 
Instead it's "hey, we can render the game at 680p, upscale it to some blurry-as-fuck image that we can technically call 1440p in our marketing material, and it'll start smearing all over the place when anything moves".
Sony could make an ultra-low-latency display that works flawlessly at a consistent 60fps/680p and call it "Standardized Gaming Resolution" (SGR).
People would use a PS with an SGR Sony display and everything would work plug-and-play, just like Mac HW.
On the flip side, Sony could sell huge 4K TVs with built-in PSSR HW for users who just want a massive image: 100% plug-and-play, no settings to mess with.
This way, people who care about maximum performance/aesthetics get 680p with smaller size options, and people who want 4K get the biggest possible size options.
When DF or some other PC-spec-focused reviewer reviews the PS6, they'll naturally focus on the more impressive specs and superior aesthetics of the 680p configuration with a plug-in DualSense.
Instead of choosing between performance and looks, end users would choose between Sony displays.
People would have a 4K PS setup in the living room with a wireless DualSense and a 680p setup in their bedroom or at their desk with a plug-in DualSense.
Nintendo could configure the Switch 2 to work with both the "Standardized Gaming Resolution" displays and the 4K PSSR displays.
4K PSSR displays could upscale everything from streaming content and DVDs to retro consoles.
 

A.Romero

Member
Yes, that is why Jensen, the ever-visionary, is moving into AI, fake frames, transformer training, etc.

Look at this: it's either/or, performance gains or lower power consumption, 15-30% if you are lucky.

There is no more headroom like the last big jump from Samsung 8nm to TSMC 4nm with the RTX 4000 series.


Jensen and Nvidia are going towards AI because most of their income comes from that tech.

They are a business and will pursue the highest revenue possible. There is no way their gaming division could produce 18 bn in one quarter (like their datacenter division did in Q4 2024). However, you are right regarding the current tech limits.

On topic: It's going to be a while before we see a jump like we used to. Diminishing returns and all of that.
 

simpatico

Member
I mean, isn't the only appreciable jump from where we are photorealism at 60fps for $500? Unreal 5 is like a glimmer of hope for something like photorealism, but we clearly don't have the processing power to do it at the resolutions we want to play at. UE6 and the Nvidia 8000 series at the latest? Assuming they stop spoon-feeding and withholding bins, of course.
 