
Graphical Fidelity I Expect This Gen

DanielG165

Member
Very curious to see how Indy performs on Xbox. I know that it's using id Tech, and both modern DOOM games are incredibly well optimized, but Indy is an entirely different beast with a much larger scope. And that PC spec sheet is nuts; it has to be pushing path tracing at the highest level in that case. Hopefully we can see a 60 fps mode for Series X, as my 2080S will likely struggle due to lack of VRAM.

I’m also still one of the few here that genuinely believes that Indy looks incredible across the board. Detail, fidelity, lighting, faces, all of it looks fantastic. Looking forward to next week.
 

Turk1993

GAF's #1 source for car graphic comparisons
SlimySnake


 

SlimySnake

Flashless at the Golden Globes
If we're talking next gen consoles and the PC version turned up to max, for sure. But if we're gonna get last gen (current gen) versions, kinda like with CP2077, expect that shit to run and look nasty af, you know, the return of the first Titanfall's resolutions, aka 792p, or even below that :p
Nah. Their latest interview said they won't make that same mistake again and are including consoles early in the development cycle. Besides, Epic has effectively doubled the performance of Lumen and Nanite since launch. And most UE5 games are already 1440p TSR'd to 4K at 30 fps. It's the 60 fps modes that keep dropping because they use dynamic resolution even though they are CPU bound in all those UE5.1 games. On PC, I had no issues running any of these games at 4K DLSS Quality/Balanced at 60 fps with some minor adjustments to settings, because the CPU was not the bottleneck on my build.
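To illustrate the CPU-bound point, here's a toy frame-time model (made-up numbers, assuming GPU cost scales linearly with pixel count; not measured data):

```python
# Toy model: frame time is gated by the slower of CPU and GPU work.
# Dynamic resolution only shrinks the GPU term, so it can't rescue a
# CPU-bound 60 fps mode.

def frame_time_ms(cpu_ms: float, gpu_ms_native: float, res_scale: float) -> float:
    gpu_ms = gpu_ms_native * res_scale ** 2   # pixel count scales with res_scale^2
    return max(cpu_ms, gpu_ms)

print(frame_time_ms(10.0, 25.0, 1.0))   # GPU bound: 25.0 ms
print(frame_time_ms(10.0, 25.0, 0.7))   # DRS helps: 12.25 ms
print(frame_time_ms(20.0, 25.0, 0.7))   # CPU bound: stuck at 20.0 ms no matter the res
```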

Also, Witcher 4 will be foliage heavy, and Nanite applies to foliage now. The performance modes might still struggle in some towns like Novigrad, which are CPU heavy, but I think the 30 fps versions will run at a high resolution and look great on consoles.

Even Cyberpunk runs at 1080p 60 fps on consoles. And 1440p with RT shadows reconstructed to 4K at 30 fps. That's pretty damn decent for $500 machines for a big game like that.

They will do what Black Myth did: Software Lumen on consoles at 4K TSR Quality. PC will get path tracing thanks to Nvidia. But thanks to better UE5 performance in the latest builds, you will get a decent 1080p 60 fps mode, unlike Black Myth, which needs to use frame generation to hit 60 fps on consoles because UE5.1 was single threaded as fuck.
 

PeteBull

Member
Nah. Their latest interview said they won't make that same mistake again and are including consoles early in the development cycle. Besides, Epic has effectively doubled the performance of Lumen and Nanite since launch. And most UE5 games are already 1440p TSR'd to 4K at 30 fps. It's the 60 fps modes that keep dropping because they use dynamic resolution even though they are CPU bound in all those UE5.1 games. On PC, I had no issues running any of these games at 4K DLSS Quality/Balanced at 60 fps with some minor adjustments to settings, because the CPU was not the bottleneck on my build.

Also, Witcher 4 will be foliage heavy, and Nanite applies to foliage now. The performance modes might still struggle in some towns like Novigrad, which are CPU heavy, but I think the 30 fps versions will run at a high resolution and look great on consoles.

Even Cyberpunk runs at 1080p 60 fps on consoles. And 1440p with RT shadows reconstructed to 4K at 30 fps. That's pretty damn decent for $500 machines for a big game like that.

They will do what Black Myth did: Software Lumen on consoles at 4K TSR Quality. PC will get path tracing thanks to Nvidia. But thanks to better UE5 performance in the latest builds, you will get a decent 1080p 60 fps mode, unlike Black Myth, which needs to use frame generation to hit 60 fps on consoles because UE5.1 was single threaded as fuck.
I will stay neutral or even pessimistic for now. Some time has passed, but we all remember what CDPR trailers can look like and how different they actually are from the launch version of the game, especially on console. To not look far: the W3 trailer; notice the "in-game footage" right at the beginning and Xbox One/PS4/PC at the end of it

vs the actual launch trailer:

Ofc by no means did it look bad at launch, even the console versions, but even the maxed PC version didn't look anywhere close to those earlier bullshot trailers, and I say that as a Witcher series fanatic who beat all the quests and found everything in W3 and both of its expansions; took a solid 200h btw :p

So far we didn't even get a W4 CGI/reveal trailer. Let's wait for that, maybe at The Game Awards, maybe in 2025 or even 2026, and then watch a few gameplay trailers to set our expectations properly. We got lied to so many times over decades of gaming; let's stay realistic for now :)

Btw, Slimy, you are one of the very few people here on GAF who can even see the difference between bullshot (CGI) and real gameplay trailers. I remember many GAFfers asking if this was in-game footage or CGI once that one dropped; they couldn't even tell, no joke :)

I vaguely remember an actual dev from the studio that made that CGI answering that it is, of course, CGI, after so many people asked about it, and of course in hindsight this was crazy obvious, or should have been :messenger_sunglasses:
 

PeteBull

Member
I just came across this
How do people think this is a fair comparison?



Also, do we still think that this is as impressive as its reveal? Genuine question.

About the new Crimson Desert gameplay: it's not crazy impressive to me anymore. It doesn't look bad, but my standards keep going up.
I saw those AI vids of touched-up games; almost nothing I see on current gen is crazy impressive anymore, the exceptions being:
1) maxed-out PC footage of Hellblade 2; the console version is simply too blurry because of the low resolution. The game is basically the current-gen The Order: 1886: crazy graphics but not much of a game in it
2) the GTA6 reveal trailer, but I can't tell for sure how much of its quality will be left in the actual game
3) the Matrix demo, but again, it's not an actual game but a vertical slice without many of the systems an actual game would have, and it still runs terribly
 
I expect W4 to absolutely destroy this thing that is barely superior to Wukong on PC.

W4 is probably 5 years away.
When we take into account that the game only entered full production last week, and that depending on the project that phase can last two or three years, it wouldn't really be surprising if the next The Witcher were a cross-gen game. But it's not 5 years away.
 

DanielG165

Member
Witcher 4 will almost certainly be cross gen at this point, imo. It'll be interesting to see the scalability of the game across the board: what features the current consoles will get, and what they won't get, compared to the PS6, the next Xbox, and PC.
 

SlimySnake

Flashless at the Golden Globes
I will stay neutral or even pessimistic for now. Some time has passed, but we all remember what CDPR trailers can look like and how different they actually are from the launch version of the game, especially on console. To not look far: the W3 trailer; notice the "in-game footage" right at the beginning and Xbox One/PS4/PC at the end of it

vs the actual launch trailer:

Ofc by no means did it look bad at launch, even the console versions, but even the maxed PC version didn't look anywhere close to those earlier bullshot trailers, and I say that as a Witcher series fanatic who beat all the quests and found everything in W3 and both of its expansions; took a solid 200h btw :p

So far we didn't even get a W4 CGI/reveal trailer. Let's wait for that, maybe at The Game Awards, maybe in 2025 or even 2026, and then watch a few gameplay trailers to set our expectations properly. We got lied to so many times over decades of gaming; let's stay realistic for now :)

Btw, Slimy, you are one of the very few people here on GAF who can even see the difference between bullshot (CGI) and real gameplay trailers. I remember many GAFfers asking if this was in-game footage or CGI once that one dropped; they couldn't even tell, no joke :)

I vaguely remember an actual dev from the studio that made that CGI answering that it is, of course, CGI, after so many people asked about it, and of course in hindsight this was crazy obvious, or should have been :messenger_sunglasses:

I truly despise and detest the Witcher 3 downgrade, and I had zero faith in Cyberpunk looking as good as the demo, but they proved me wrong. I still remember when they said the E3 2018 behind-the-scenes demo was running at 1080p 30 fps on a 1080 Ti. I was like, this will get downgraded HARD. But they didn't. Instead, they downported the game to the PS4 and X1 with arguably the worst graphics I've ever seen, and I couldn't be happier about it. It was a next gen game that looked and ran great on next gen consoles and PC (after a few patches).

So yeah, they fucked up with the Witcher 3, but at least they admitted to DF that it was due to poor console specs leading them to change their entire renderer. So they were somewhat honest about it before launch. And they redeemed themselves by not downgrading Cyberpunk; if anything, they improved the visuals with ray tracing. The NPC stuff was downgraded even on PC and next gen consoles, yes, but the visuals themselves weren't.



I think everyone should be skeptical given the recent history of downgrades, but I personally have high hopes for UE5 after seeing what smaller studios like Game Science, Bloober, and Ninja Theory have done with UE5.1, which was terribly optimized. Larger studios like CD Projekt, Crystal Dynamics, The Coalition, and Striking Distance Studios will likely achieve Matrix or Marvel 1943 caliber graphics if they are developing on UE5.4 or 5.5.

I also took an L with Rockstar. I doubted them after the leaks, and Rockstar showed me that real devs with ambition won't settle for less. And CD Projekt recently talked about retaining that ambition.
 
Just bought Indiana Jones; the reviews convinced me. Videos of the game are so confusing: some aspects of it look terrible to me and some look really good?
 

Jack Howitzer

Neo Member
https://www.dsogaming.com/pc-perfor...at-circle-benchmarks-pc-performance-analysis/

Well, according to DSO the game still suffers from the good old LOD issues: "The only downside is the awful pop-ins that occur right in front of view. I don’t know why MachineGames did not provide higher LOD options for the Supreme settings. It’s a shame really because in the big outdoor environments, the constant pop-in can become really annoying." Turns out this is the root problem one can't escape from, right? The number one scenery disaster still gets to thrive after all these years, for a linear first-person game, really?
 
I guess this is the best you can achieve with a performance-focused engine like the id one





Is path tracing already out for the game? Could give it some improvement
No amount of lighting improvements can save that texture mud. But yeah, maybe the scenes wouldn't look as flat anymore, as the AO, for example, looks nonexistent in some scenes. In motion, the game seriously has me hard-pressed to find things that would make me go "yes, this clearly isn't a PS4-era game"
 

GymWolf

Member
No amount of lighting improvements can save that texture mud. But yeah, maybe the scenes wouldn't look as flat anymore, as the AO, for example, looks nonexistent in some scenes. In motion, the game seriously has me hard-pressed to find things that would make me go "yes, this clearly isn't a PS4-era game"
Was my RDR2 comparison out of line?

Maybe Indy has better textures, but overall it looks kinda worse.
 

Msamy

Member

Big chance the new ND game or the new Sony Santa Monica game, or both, will be shown there, and DS2 for sure; Kojima won't miss the last TGA before DS2's release.
 
About the new Crimson Desert gameplay: it's not crazy impressive to me anymore. It doesn't look bad, but my standards keep going up.
I saw those AI vids of touched-up games; almost nothing I see on current gen is crazy impressive anymore, the exceptions being:
1) maxed-out PC footage of Hellblade 2; the console version is simply too blurry because of the low resolution. The game is basically the current-gen The Order: 1886: crazy graphics but not much of a game in it
2) the GTA6 reveal trailer, but I can't tell for sure how much of its quality will be left in the actual game
3) the Matrix demo, but again, it's not an actual game but a vertical slice without many of the systems an actual game would have, and it still runs terribly
Console version of HB2 looks good and not blurry at all
 

PeteBull

Member
Console version of HB2 looks good and not blurry at all
I could say it's subjective, bring arguments, etc., but I will simply quote the GAF thread about it, with unbiased raw data:

Resolution and Visuals:
- DRS 1296 to 1440p (w/ black bars 964 to 1070p). Series S will be covered in separate video

PC maxed at 4K vs 4K DLSS Quality (so native 1440p upscaled to 4K via DLSS) makes a big difference even to my eyes, never mind something much blurrier than that, which is what the XSX can provide us with.
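For anyone wanting to check the math on these modes, a quick sketch (the per-axis factors are the standard DLSS presets; FSR's are close, and TSR is configurable):

```python
# Internal render resolution behind common upscaler presets.
SCALE = {"quality": 2 / 3, "balanced": 0.58, "performance": 0.5}

def internal_res(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)

print(internal_res(3840, 2160, "quality"))  # (2560, 1440): "4K DLSS Quality" renders native 1440p
```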


It's OK though; with the level of graphical fidelity Hellblade 2 has given us, compromises on console like a much lower resolution and 30fps are understandable, hell, even preferable to downgrading the game too much in other aspects.
 

SlimySnake

Flashless at the Golden Globes
Just bought Indiana Jones; the reviews convinced me. Videos of the game are so confusing: some aspects of it look terrible to me and some look really good?
Because they added RTGI but forgot to add a Nanite equivalent or use higher-fidelity next gen assets.

RTGI makes some of the levels look great, but lighting is only part of the overall look of the game. Lethal and Gymwolf go at it every few months. What's more important? Lighting or asset quality? Well, it turns out both.

It is ridiculous that the game is running at 1800p 60 fps. That means 5.76 million pixels per frame rendered twice as often, roughly 39% more pixel throughput than a native 4K 30 fps game. We were pissed at Sony's devs targeting native 4K 30 fps, but here we have these geniuses wasting precious GPU resources needlessly on high resolutions like 1800p.
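The back-of-envelope math, assuming raster cost roughly tracks pixels per second:

```python
px_1800p_60 = 3200 * 1800 * 60   # ~345.6M pixels/s at 1800p 60 fps
px_4k_30    = 3840 * 2160 * 30   # ~248.8M pixels/s at native 4K 30 fps
print(px_1800p_60 / px_4k_30)    # ~1.39, i.e. ~39% more pixel throughput
```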

I could even understand 1440p 60 fps like Demon's Souls, because that would've freed up over a third of the GPU to push other visual effects like better geometry and materials.

This is why I am a big fan of UE5. You get not just great next gen lighting but also great next gen materials. No more Jekyll-and-Hyde-looking games. They simply look far more consistent.

TL;DR: the price of 60 fps and high image quality. I am sure the 60 fps brigade and UE5 haters will love the graphics in the game. Or at least pretend to love it while they jerk off to UE5 games in secret.
 

SlimySnake

Flashless at the Golden Globes
In a perfect world every dude would have a 4090 and a 14900K so they could stop trashing UE5 because their shit PC or weak console can't run it decently...
The consoles can easily do 1440p 30 fps in pretty much all UE5 games. Black Myth, Silent Hill 2, and RoboCop all run at 1440p 30 fps reconstructed to 4K using TSR or FSR.

The problems only appear when they have to run at 60 fps, which should have never been offered on consoles. Devs have no one but themselves to blame.
Yes, the idTech engine runs much better; it also looks much worse, no fucking shit.
It's remarkable how many people don't realize this, even on a hardcore gaming forum. Better graphics require more graphics power: basic common sense, especially for people who are used to buying better consoles every 6-7 years. But we are in an age of idiocy where common sense no longer exists and people want 4K 60 fps with next gen graphics.
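The budget math is trivial, but it's the whole argument: doubling the frame rate halves the time every system gets per frame.

```python
for fps in (30, 60):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
# 30 fps -> 33.3 ms per frame
# 60 fps -> 16.7 ms per frame
```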
 

Edder1

Member
Hot damn. Yeah, this is gonna be a showcase title.
Indiana Jones has impressive lighting, but that's about as impressive as it gets. Geometry is last gen, asset quality is last gen, animations and character models are last gen (some animations are very janky and AA), and character skin shading is very much last gen.

Really, the only thing that makes the game look impressive in parts is its lighting, but even that isn't always consistent. Overall, it's another Frankenstein of a game, like what has been the norm this gen.

I think 2025 will finally be the year when we start seeing games take technological advantage of the new hardware. There probably will be a few more Frankenstein titles, but overall I expect games to finally embrace a generation that's moving into its 5th year. 2024 has been dog sh*t in that regard. Onwards and upwards, lol.
 
Because they added RTGI but forgot to add a Nanite equivalent or use higher-fidelity next gen assets.

RTGI makes some of the levels look great, but lighting is only part of the overall look of the game. Lethal and Gymwolf go at it every few months. What's more important? Lighting or asset quality? Well, it turns out both.

It is ridiculous that the game is running at 1800p 60 fps. That means 5.76 million pixels per frame rendered twice as often, roughly 39% more pixel throughput than a native 4K 30 fps game. We were pissed at Sony's devs targeting native 4K 30 fps, but here we have these geniuses wasting precious GPU resources needlessly on high resolutions like 1800p.

I could even understand 1440p 60 fps like Demon's Souls, because that would've freed up over a third of the GPU to push other visual effects like better geometry and materials.

This is why I am a big fan of UE5. You get not just great next gen lighting but also great next gen materials. No more Jekyll-and-Hyde-looking games. They simply look far more consistent.

TL;DR: the price of 60 fps and high image quality. I am sure the 60 fps brigade and UE5 haters will love the graphics in the game. Or at least pretend to love it while they jerk off to UE5 games in secret.
Yeah, I'm playing it now. There are some assets that are actually really great looking, like statues and certain objects, and others that are not. Animations are super last gen; the lighting is great. There are aggressive LODs all over the place. I was looking for a fidelity mode and it does not exist.
 

SlimySnake

Flashless at the Golden Globes
I think 2025 will finally be the year when we start seeing games take technological advantage of the new hardware. There probably will be a few more Frankenstein titles, but overall I expect games to finally embrace a generation that's moving into its 5th year (smh). 2024 has been dog sh*t in that regard. Onwards and upwards, lol.
Yep.

Kingdom Come 2 and Ghost of Tsushima 2 fit the bill.

Let's hope Death Stranding 2 avoids this fate.
 

SlimySnake

Flashless at the Golden Globes
This is why I can't stand DF at times. Look at the timestamped footage here. This is his closing argument, and he uses the ugliest looking level I've seen in a game in 10 years.

He didn't spend any time comparing this game to other games released this year. He barely mentions anything about the asset quality and visual fidelity of the levels outside of RTGI. He got hung up on the cutscene stutter and wasted two sections talking about audio. Who gives a shit about audio this much?

It's a tech review. Review the tech. Use comparisons against other games to judge said tech. Point out how low res some of these levels look even when compared to last gen games. He mentions the fact that the RTGI in the PC version of the game, even in its non-path-traced form, looks way better than the XSX version, but only as a footnote all the way at the end of the video. Why? Aren't you supposed to be discussing the tech?
 

Alex11

Member
RTGI makes some of the levels look great, but lighting is only part of the overall look of the game. Lethal and Gymwolf go at it every few months. What's more important? Lighting or asset quality? Well, it turns out both.
I would somewhat agree that it's both, if we are keeping things simple, but I would argue that lighting takes the lead a bit.

What I mean is, you can throw in all the 8K textures and highly polished assets you want; if the lighting is flat, it will look poor.
But great lighting, especially path tracing, can make even the simplest asset or scene look amazing: you can have shadows, soft shadows, multiple realtime shadows, light bounces, never mind the color choices and grading, which can be greatly influenced by the lighting.

Many games that hold up graphically are either stylized or have amazing lighting or both.
 

SlimySnake

Flashless at the Golden Globes
I would somewhat agree that it's both, if we are keeping things simple, but I would argue that lighting takes the lead a bit.

What I mean is, you can throw in all the 8K textures and highly polished assets you want; if the lighting is flat, it will look poor.
But great lighting, especially path tracing, can make even the simplest asset or scene look amazing: you can have shadows, soft shadows, multiple realtime shadows, light bounces, never mind the color choices and grading, which can be greatly influenced by the lighting.

Many games that hold up graphically are either stylized or have amazing lighting or both.
Yeah, I've seen this in action in Dragon's Dogma 2's PT mod. It literally changes the whole look of some of the doors and bricks, bringing out a lot more detail from the materials. And that game uses RTGI.

This DF video was done on Xbox, and apparently it uses lower-than-Low quality RTGI compared to the PC version, which doesn't even have the path tracing patch yet. So maybe the PC version looks way better than what I'm seeing in DF's video.
 
This is why I can't stand DF at times. Look at the timestamped footage here. This is his closing argument, and he uses the ugliest looking level I've seen in a game in 10 years.

He didn't spend any time comparing this game to other games released this year. He barely mentions anything about the asset quality and visual fidelity of the levels outside of RTGI. He got hung up on the cutscene stutter and wasted two sections talking about audio. Who gives a shit about audio this much?

It's a tech review. Review the tech. Use comparisons against other games to judge said tech. Point out how low res some of these levels look even when compared to last gen games. He mentions the fact that the RTGI in the PC version of the game, even in its non-path-traced form, looks way better than the XSX version, but only as a footnote all the way at the end of the video. Why? Aren't you supposed to be discussing the tech?

Yeah, I don't get this review; it should compare the game to other titles doing similar things. I will say there are very high quality objects in the environment that probably would not be possible on last gen hardware. The RTGI is very low quality on Series X running at 60fps, though; it still makes the game look better, but it's not anything crazy.
 

SlimySnake

Flashless at the Golden Globes


AMAZING!

The dick riding for such a mediocre looking game just because it's 60 fps is so fucking stupid. We deserve 4 years of cross gen trash if we praise stuff like this.

Also, consoles getting lower-than-Low quality lighting is being celebrated in the name of 'optimization'. Never mind the fact that the optimization is simply code for bad graphics. If the game looked like the Matrix demo and ran at 60 fps, then I'd be like, 'Yes! Great optimization!' Anyone can make a last gen looking game run at 60 fps. Hell, they did for the first three years of this generation. CoD is still doing it 4 years later.




DF and gamers are both responsible for the state of graphics over the last four years, praising devs who do the bare minimum and criticizing the ones actually trying to push fidelity. The funny thing is that they both bitch about how graphics have stagnated while they can't tell the difference between last gen games and current gen games.
 

rofif

Can’t Git Gud


AMAZING!

The dick riding for such a mediocre looking game just because it's 60 fps is so fucking stupid. We deserve 4 years of cross gen trash if we praise stuff like this.

Also, consoles getting lower-than-Low quality lighting is being celebrated in the name of 'optimization'. Never mind the fact that the optimization is simply code for bad graphics. If the game looked like the Matrix demo and ran at 60 fps, then I'd be like, 'Yes! Great optimization!' Anyone can make a last gen looking game run at 60 fps. Hell, they did for the first three years of this generation. CoD is still doing it 4 years later.

This GI lacks a pass. They are in a jungle in daylight. I understand it's shadowed, but it's pitch black.
Just no.
 
Indiana Jones's RTGI is crazy low quality. Since it's a cinematic game, I wish you could do 30 fps with a higher quality RTGI. If you watch the DF video, they have an example of the problem: you can see the shadows pop in only a few feet away from an object; they're super blurry otherwise. Maybe the consoles just aren't powerful enough.
 

SlimySnake

Flashless at the Golden Globes
Indiana Jones's RTGI is crazy low quality. Since it's a cinematic game, I wish you could do 30 fps with a higher quality RTGI. If you watch the DF video, they have an example of the problem: you can see the shadows pop in only a few feet away from an object. Maybe the consoles just aren't powerful enough.
The game is averaging 1800p 60 fps. No one asked them to target a resolution that high. Simply dropping the pixel count by 50% would've let them fix the shadows AND the LODs.
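Rough numbers on what that 50% drop buys, assuming GPU cost roughly tracks pixel count (toy math, not profiled data):

```python
native_px = 3200 * 1800          # ~5.76M pixels at 1800p
half_px = native_px // 2         # ~2.88M pixels, roughly a 1273p internal target
print(f"per-axis scale: {(half_px / native_px) ** 0.5:.3f}")      # ~0.707
print(f"GPU raster budget freed: {1 - half_px / native_px:.0%}")  # 50%
```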

Or, you know, be normal, make this third person and 30 fps, and maybe it will look like this

[gameplay GIFs]

instead of this

[screenshot]
 

BlownUpRich

Neo Member
I guess this is the best you can achieve with a performance-focused engine like the id one






Not even joking, does RDR2 on PC look better than this? :lollipop_grinning_sweat:
I've been reading posts and tweets of people convincing themselves by making fun of how, if this were a UE5 game, it would've looked "a bit better but performed twice as badly," and I'm laughing so hard at the irony. I mean, don't better looking graphics require more hardware power? Yeah, you can run this over 120 FPS on your 4070s or your 4060 Tis, but it looks like a PS4 game at best with some of the "current-gen" niceties: better draw distance, higher resolutions, and ray traced GI (the only big thing here), but that's it. I can bring up better looking last gen games than this in most aspects.

An Unreal 5 game would look a generation above this, so it being more expensive on resources is only normal.
 