
Will AI and powerful GPUs lead to more sophisticated gameplay?

Ritsumei2020

Report me for console warring
Ironically, some of the most creative gameplay stuff in recent memory, like the physics and chemistry in BOTW and the building in TOTK, was done on ancient hardware.

What I am asking is: will advances in AI and GPU power make it easier to develop similar stuff in games, or even make it easier to develop better enemy AI, etc.?

Or is it just limited to better resolution, framerate and lighting?
 

Yoda

Member
next few years: probably a big drop in the cost of asset creation -> the bar to be a "pretty" game gets much lower.

gameplay? I'd guess more intelligent AIs for NPCs + speaking naturally to an entity vs. a list of pre-selected options. Imagine Pokemon where you could yell orders that aren't fixed moves/attacks? Would feel like you're in the anime :)
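That Pokemon idea can be sketched in a few lines. This is a toy matcher, not anyone's real system: the move names and keyword sets are invented for illustration, and a real game would presumably use an ML intent classifier or embeddings rather than keyword overlap.

```python
# Hypothetical sketch: mapping a free-form yelled order onto a game's
# fixed move list by keyword overlap. All names/keywords are made up.
MOVES = {
    "thunderbolt": {"shock", "zap", "electric", "thunder", "bolt"},
    "quick_attack": {"fast", "quick", "rush", "tackle", "hit"},
    "protect": {"block", "shield", "defend", "dodge", "protect"},
}

def interpret_order(order: str) -> str:
    """Pick the fixed move whose keyword set best matches the order."""
    words = set(order.lower().split())
    # Score each move by how many of its keywords appear in the order.
    scores = {move: len(words & keys) for move, keys in MOVES.items()}
    best = max(scores, key=scores.get)
    # No keyword matched: fall back to a default move.
    return best if scores[best] > 0 else "quick_attack"

print(interpret_order("zap it with thunder"))  # thunderbolt
```

The point is that the player's input feels free-form while the game still only ever executes one of its fixed, balanced moves.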
 

kevboard

Member
no, because AAA game design is inherently risk averse...

we have hardware that can essentially do every game design concept you can throw at it without issue.
in order to actually get to the limits of what is possible on current gen consoles or a high end PC you'd need to come up with a truly ridiculous concept, at which point you'll probably run into issues of development feasibility.


making ML-based enemy AIs could be an idea for bots in shooters or fighting games, but in most genres you don't actually want intelligent enemies; you want interesting yet satisfying-to-fight enemy behaviour, which will always need a good game designer to fine-tune by hand.
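The "fine-tune by hand" point can be made concrete with a minimal sketch: a designer-authored state machine for a hypothetical shooter enemy. Every threshold below is an invented designer knob, not learned behaviour, which is exactly why it stays readable and tunable.

```python
# Minimal sketch of hand-tuned enemy behaviour (all values invented):
# the goal is a fight with a readable rhythm, not optimal play.
def enemy_state(hp: float, player_dist: float, under_fire: bool) -> str:
    if hp < 0.25:
        return "retreat"     # deliberately cowardly: telegraphs weakness
    if under_fire and player_dist < 10.0:
        return "take_cover"  # tuned pause so the player can reposition
    if player_dist < 25.0:
        return "flank"       # scripted aggression, easy to rebalance
    return "patrol"

print(enemy_state(hp=0.8, player_dist=8.0, under_fire=True))  # take_cover
```

A learned policy might win more fights, but a designer can't easily dial any single number in it to make the encounter more fun.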
 

Loomy

Thinks Microaggressions are Real
Sophisticated? Yes. Creative? Probably not as a direct result of more graphical processing power.

The most creative/interesting gameplay typically starts out in indie games. AAA development has become boringly risk averse and annoyingly predictable, so they'll adopt interesting things from smaller games and build on them. That will take time.

That's why it's important to support good and interesting indie games, and give them a try even if they're not what you would typically play. Your $40 supporting a game made by an individual or a small passionate team will go a lot further than paying $80 to EA for the next Mass Effect that BioWare is making because that's apparently the only thing they still know how to make (we hope).
 

intbal

Member
No.

AI and greater hardware power will continue churning out the same old trash that won't be anywhere near as "sophisticated" as this quarter-century-old game:

[attached screenshot]
 

nkarafo

Member
AI will be used to cover for bad optimization, like it does now for the most part. It will just get worse after this tech becomes the standard for everyone.

Also, you don't need all this for gameplay/NPC AI improvements. There are much older games, running on decades-old hardware and consoles, with more sophisticated AI/gameplay than most modern games. This is 100% a dev skill issue, not hardware.
 

Ritsumei2020

Report me for console warring
AI will be used to cover for bad optimization, like it does now for the most part. It will just get worse after this tech becomes the standard for everyone.

Also, you don't need all this for gameplay/NPC AI improvements. There are much older games, running on decades-old hardware and consoles, with more sophisticated AI/gameplay than most modern games. This is 100% a dev skill issue, not hardware.

How is AI used to cover for bad optimization? Genuine question.

Edit: Do you mean framerate?
 
Last edited:

nkarafo

Member
How is AI used to cover for bad optimization? Genuine question.
AI gives breathing room for better performance by upscaling a lower native resolution (using DLSS) and/or adding fake frames.
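The size of that breathing room is just pixel arithmetic. The scale factors below are the commonly cited approximate DLSS mode ratios; treat them as illustrative, and note the real saving is smaller because the upscaling pass itself isn't free.

```python
# Back-of-envelope: shaded pixels at native 4K vs. DLSS internal
# resolutions (approximate, commonly cited mode scale factors).
NATIVE = 3840 * 2160  # 4K pixel count

modes = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50}
for mode, scale in modes.items():
    internal = int(3840 * scale) * int(2160 * scale)
    print(f"{mode}: shades {internal / NATIVE:.0%} of native pixels")
```

Performance mode shading a quarter of the pixels is the "extra" headroom the post is talking about, and exactly what a studio can quietly spend on skipping optimization instead.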

Since these things are "extras", you would assume they are being used to provide "extra" performance, so if you don't use them (because you don't want the artifacts and blurriness they produce) you just get the "standard" performance and visuals, right?

Well, it started like that, or at least it was marketed that way. However, slowly but surely, developers started using that breathing room not for extra performance, but just to reach the standard performance you would normally expect without any of this tech.

That means some games will run terribly, below the acceptable threshold, if you don't use DLSS/AI upscaling. A lot of UE5 games today absolutely need this tech just to run acceptably.

The Silent Hill 2 remake is the poster child for this. The developers didn't even bother to optimize their engine to cull the visual details covered by the fog, which is about the most basic optimization, used in games for decades. Instead, the game renders the whole town's geometry and textures despite you not being able to see them. Not to worry, though! DLSS-capable hardware will save us, since the game can now run acceptably even without meeting the minimum standards of optimizing a game!
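The culling being described is conceptually tiny. This is a hypothetical sketch of distance culling against an opaque fog far plane; the scene, names and numbers are invented, and real engines do this per-frame with spatial data structures rather than a flat list.

```python
# Sketch of fog-distance culling: anything fully hidden beyond the
# fog's far plane is never submitted to the GPU. All values invented.
import math

FOG_MAX_DIST = 60.0  # beyond this, fog is fully opaque

def visible_objects(camera, objects):
    """Keep only objects within fog range of the camera."""
    kept = []
    for name, pos in objects:
        if math.dist(camera, pos) <= FOG_MAX_DIST:
            kept.append(name)  # everything else is skipped entirely
    return kept

scene = [("lamppost", (10, 0, 5)), ("far_building", (300, 0, 400))]
print(visible_objects((0, 0, 0), scene))  # ['lamppost']
```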

So instead of using this tech for our benefit (extra performance), it's used for the developers' benefit, since they can offer you the standard performance without having to worry about optimization as much as before.

And not only that. Because DLSS is so good, it allows NVIDIA to sell worse hardware for the same price. For instance, they now sell "50"-tier cards as "60"-tier. That's only possible because DLSS can cover for the loss of raw hardware performance, so a gimped "60" card performs the way you'd expect a normal 60-tier card to, but ONLY if you use DLSS. Which makes it pretty much a mandatory standard now.

This whole thing was never supposed to be for your benefit, is what I'm saying.
 
Last edited:
Current AI using machine learning is just better tools with a fancy name. Machine learning is not intelligent at all. It basically uses the sheer power of supercomputer training (brute force) so that a much less powerful machine can stupidly do the job of decent upscaling.
 
Last edited:
Anytime I see "AI" applied in any scenario, let alone video games, I disregard the argument entirely. Current machine learning still requires current game design principles, which are simply getting dumber each generation. I've yet to see something that replicates F.E.A.R. or Killzone 2. Everything is just AI doing x & y randomization to frustrate players, and I don't see it fixing the issues plaguing the game industry, from bug fixing to more physics systems to addressing linear play.
 
Ironically, some of the most creative gameplay stuff in recent memory, like the physics and chemistry in BOTW and the building in TOTK, was done on ancient hardware.

What I am asking is: will advances in AI and GPU power make it easier to develop similar stuff in games, or even make it easier to develop better enemy AI, etc.?

Or is it just limited to better resolution, framerate and lighting?


The irony is believing that a stupid machine can replace human creativity. The more you depend on it, the less you use your brain, and so the dumber you become.

So no, gameplay won't improve one iota. If anything, it will get worse.
 

StereoVsn

Gold Member
Studios will use AI to cut costs, from reducing headcount to churning out AI slop with crap code and level design.

Then they won't optimize said code, and they'll mask it with more AI, hardware this time, using upscaling and frame gen.

Don’t worry though, they will still push for higher game prices to compensate.
 

RoboFu

One of the green rats
The way society is going it will just lead to an overwhelming amount of dating sims.
 

Fbh

Member
No.
Hardware has never been the barrier to more innovative gameplay. It's the devs/publishers who choose to just focus on better graphics and bigger worlds.
As you said, one of the modern open-world games with the most fun and well-integrated physics was made on the ancient Switch hardware. One could even argue decades-old games like Red Faction Guerrilla, FEAR, Half-Life 2, etc. were doing more to push gameplay, physics, AI and interactivity than most modern games.
 

SHAKEZ

Member
Maybe some actually creative devs will use it in innovative ways; most will just make an unoptimized mess and have AI generate frames for you.
 
Last edited:
No, they'll just reduce the number of artists working in the industry and drive up hardware requirements.

Having AI-generated dialogue will also have the (totally unintended, I'm sure) side effect of making even single-player games that rely on it unplayable without an internet connection, since it's highly unlikely that any of that stuff will be generated locally. Once the developer or publisher decides they'll move on to a new model that no longer supports older games, that keeping the servers running just isn't worth it anymore, or that they'd prefer to have you pay again for a newer "improved" version, you'll be left with a game that's literally unplayable.

It'd essentially solve piracy, too, so there's no way they won't go for it.
 
The main limitation for LLMs in games is that the hardware requirements are too high for real-time interaction. Putting the compute in the cloud isn't any more of a solution now than it was back when Xbox originally touted it. Edge solutions need more time to develop, but maybe a small model could be made for some specific, limited use cases in games, besides the obvious machine learning stuff like DLSS.

The sort of generic AI NPC chat Nvidia has showcased is out of the question for AAA games. You can't give players that kind of freeform control over dialogue; it'll break instantly. A more reasonable solution would be a model that inserts more generic background barks for NPCs, instead of just using one or two prerecorded lines on repeat like games do now. Then build on that to add more reactivity, like the ability for the model to respond to what's happening on screen in a more natural way.

These things can already be done; the problem is that it's resource-heavy and not fast enough. There are already mods for games like Skyrim that do this stuff, but you have to wait around 10 seconds to get a response from the model.
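One common way to hide that 10-second latency is to generate barks in the background and fall back to canned lines whenever nothing is ready yet. A minimal sketch, with a stub standing in for the slow model (the event names and lines are invented):

```python
# Sketch: never block the game loop on a slow model. Request a
# generated bark in the background; play a canned line if it's not
# ready. The "model" here is a stub with simulated latency.
import queue
import threading
import time

CANNED = ["Nice weather today.", "Stay out of trouble."]
generated = queue.Queue()

def generate_bark(event: str) -> None:
    """Stand-in for a slow local model; real latency could be seconds."""
    time.sleep(0.1)  # simulated inference time
    generated.put(f"Did you see that {event}? Unbelievable.")

def next_bark() -> str:
    """Return a generated line only if one is already waiting."""
    try:
        return generated.get_nowait()
    except queue.Empty:
        return CANNED[0]  # fall back to a prerecorded bark

threading.Thread(target=generate_bark, args=("dragon attack",)).start()
print(next_bark())   # immediately: a canned line
time.sleep(0.2)
print(next_bark())   # later: the generated, reactive line
```

The player hears something instantly either way; the reactive line just arrives whenever the model catches up.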
 
Last edited:

Kataploom

Gold Member
Been thinking about it... Wouldn't AI take a lot of power from the GPU if used for gameplay? I don't think it's worth it.
 

notseqi

Gold Member
AI doesn't write compelling stories, nor can it replace proper quality testing. Configuring an AI to assist in creating the game you have in mind will probably take about as much time as making the game itself. It might help with sequels once it's up and running, but then we're back to offering compelling games with enough variety to attract a wide enough audience.
 

RCX

Member
New gameplay ideas and mechanics need better creative people, and the people behind them need to be willing to take risks.

Both are currently sorely lacking in the "AAA" section of the industry. If the costs of the tools go down enough then we could see some very cool stuff from small studios or individual creators.
 
Last edited: