Indiana Jones and the Great Circle on PS5 Pro: Ray-traced native 4K at 60 fps

It's times like this where I wish I didn't have Game Pass.

Would have bought this for the Pro instead of playing through it on the Series X a few months back.

What a great game, though the ending felt a little lacklustre.
 
I just looked this up and no, they do use both terms synonymously. See settings screen below:

[image: settings screenshot]



The whole point of PT is to have a single holistic approach rather than piecemeal RT additions. It seems that at some point devs (Nvidia included) decided to muddy the waters just because they are approaching PT-like levels of quality. If you are a purist, you can argue that this is still not PT, just more RT.
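
Very roughly, and just to illustrate the distinction being argued here (a toy sketch with made-up stubs, not anything from idTech or any real engine): "piecemeal RT" means firing one extra ray per pixel to answer one isolated question on top of a raster image, while PT means the same ray machinery drives all of the lighting in one loop.

```python
# Toy illustration only: all scene/intersection logic is stubbed out,
# and every name here is hypothetical, not from any shipping engine.
import random

def trace_ray(origin, direction):
    # Stub: a real renderer would intersect the ray with scene geometry.
    hit_point = tuple(o + d for o, d in zip(origin, direction))
    albedo = 0.5                      # pretend every surface is mid-grey
    return hit_point, albedo

def rt_shadow_term(hit_point, light_pos):
    """'Piecemeal RT': one extra ray answers one question (is the light visible?),
    layered on top of an otherwise rasterised image."""
    occluded = random.random() < 0.3  # stub visibility test
    return 0.0 if occluded else 1.0

def path_trace(origin, direction, depth=0, max_depth=4):
    """'Holistic PT': the same ray-tracing machinery drives *all* lighting,
    bouncing the path until it terminates, instead of one isolated effect."""
    if depth >= max_depth:
        return 0.0
    hit_point, albedo = trace_ray(origin, direction)
    emitted = 0.2 if random.random() < 0.05 else 0.0   # chance the path hit a light
    bounce = tuple(random.uniform(-1.0, 1.0) for _ in range(3))
    return emitted + albedo * path_trace(hit_point, bounce, depth + 1)

if __name__ == "__main__":
    cam, fwd, light = (0, 0, 0), (0, 0, 1), (1, 2, 0)
    print("single RT effect (shadow term):", rt_shadow_term(trace_ray(cam, fwd)[0], light))
    print("path-traced radiance estimate:", round(path_trace(cam, fwd), 4))
```

So the purist argument is that adding more of the first kind still isn't the second kind, however good it looks.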


When? Where? The new patch just came out


It was the previous patch mentioning RTGI improvements to be fair.
 
It's times like this where I wish I didn't have Game Pass.

Would have bought this for the Pro instead of playing through it on the Series X a few months back.

What a great game, though the ending felt a little lacklustre.
I finished it on Steam (bought at full price) and it was dropping to 3 fps every time it went 1 MB above the VRAM limit... and it fucking stays there until you go to settings and toggle some setting back and forth to reset the VRAM.
The game can run at 100 fps, then a 1 MB VRAM spill and it runs at 3 fps. It's a disgrace of programming.
On top of that I had constant crashes, and I could not enable RT because I only have 10 GB on my 3080.

So if I had known all that... I would've waited for the Pro version. No way in hell it's native 4K 60 with RT though. I don't believe that one bit.
 


It was the previous patch mentioning RTGI improvements to be fair.

They have improved it further in the new patch (update 4) on all platforms. I don't believe that has been analyzed yet
  • We've improved Ray-traced Global Illumination on all platforms at no performance cost; contact shadows have been improved, helping objects appear more "grounded" in the scene
Contact shadow improvements. Nice, but it might not completely change the world for people unless, on consoles, they improve the streaming system (pop-in hurts it a bit more than imperfect contact shadows).
 
  • We've improved Ray-traced Global Illumination on all platforms at no performance cost; contact shadows have been improved, helping objects appear more "grounded" in the scene
Contact shadow improvements. Nice, but it might not completely change the world for people unless, on consoles, they improve the streaming system (pop-in hurts it a bit more than imperfect contact shadows).
Agreed. I was hoping they would throw in RT shadows for the Pro, but it seems their "full RT" components are way heavier due to their per-pixel nature (and MachineGames' customizations may not be as rigorously optimized as the core idTech features).
 
Some people here thinking the PS5 Pro could run the game with pathtracing and playable FPS are mildly delusional.
I can't even hit 60 FPS without FG at 1440P DLSS Q with my 4090 in some levels, especially at the end on the boat.
 
I finished it on Steam (bought at full price) and it was dropping to 3 fps every time it went 1 MB above the VRAM limit... and it fucking stays there until you go to settings and toggle some setting back and forth to reset the VRAM.
The game can run at 100 fps, then a 1 MB VRAM spill and it runs at 3 fps. It's a disgrace of programming.
On top of that I had constant crashes, and I could not enable RT because I only have 10 GB on my 3080.
RTGI (the same one the consoles, including the Pro, get) comes by default; you don't have to activate it, you just choose the level. I think you're confusing it with the path tracing option.

So if I had known all that... I would've waited for the Pro version. No way in hell it's native 4K 60 with RT though. I don't believe that one bit.
???

XSX is native 1800p, a perfect 60fps with RT though. I don't understand why anyone has difficulty believing in native 4K on the Pro...🙇
 
RTGI (the same one the consoles, including the Pro, get) comes by default; you don't have to activate it, you just choose the level. I think you're confusing it with the path tracing option.


???

XSX is native 1800p, a perfect 60fps with RT though. I don't understand why anyone has difficulty believing in native 4K on the Pro...🙇
ahhh so no ray tracing. So just like I played the game on my 3080 then.
Their "default RT" without PT looks flat and not like RTGI at all. Almost like they have some in-house solution and they're calling it RTGI.
Ray tracing and path tracing are the exact same thing... it's just that in gaming, RT is used to name individual effects and PT is the whole thing.
 
ahhh so no ray tracing.

It is real ray tracing, and it's very good and efficient, even more so when used at 60 fps and a native resolution of ~4K.
So just like I played the game on my 3080 then.
Their "default rt" without pt looks flat and not like rtgi at all. Almost like they have some in-house solution and they are calling it rgti.
It's ray tracing, and that's why they rightly call it RTGI. Every studio has its own custom RT formula.

Ray tracing and path tracing are the exact same thing... it's just that in gaming, RT is used to name individual effects and PT is the whole thing.
So? We've known that detail for six years, it's not Indiana Jones stuff.
 
It is real ray tracing, and it's very good and efficient, even more so when used at 60 fps and a native resolution of ~4K.

It's ray tracing, and that's why they rightly call it RTGI. Every studio has its own custom RT formula.


So? We've known that detail for six years, it's not Indiana Jones stuff.
It's so efficient because it looks like it's not there until you turn on PT...
 
LOL and that's like… 4 times the PS5 Pro power?

In raster, it's this much better than a GPU similar to the Pro in power:

[image: benchmark chart]


The RT difference between the 9070 XT and the Pro's GPU should actually be smaller. The 9070 XT has 64 CUs vs. 60 on the Pro. Of course, there is a massive, near-33% clock increase for the 9070 XT, plus the L3 cache and all that stuff.
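
To put very rough numbers on that (back-of-the-envelope only; the CU counts are from the post above, the clock figures are my assumptions, and this ignores dual-issue, caches, RT cores and bandwidth entirely):

```python
# Rough raw FP32 throughput comparison. Assumed clocks: PS5 Pro GPU ~2.17 GHz,
# RX 9070 XT boost ~2.97 GHz. Real-world RT performance depends on much more.
def fp32_tflops(cus, clock_ghz, lanes_per_cu=64, ops_per_lane=2):
    return cus * lanes_per_cu * ops_per_lane * clock_ghz / 1000

pro = fp32_tflops(60, 2.17)
xt = fp32_tflops(64, 2.97)
print(f"PS5 Pro ~{pro:.1f} TFLOPS")
print(f"9070 XT ~{xt:.1f} TFLOPS ({xt / pro:.2f}x the Pro, raw FP32 only)")
```

Which lands at roughly 1.46x in raw compute, in line with the CU and clock deltas mentioned above.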
 
I played it on PC, where the only issue was some nasty pop-in of foliage, especially in the swamp area. The idTech engine is certainly amazing, but it clearly needs some kind of Nanite-style LOD system added to make it even better and reduce these kinds of distracting (to me, anyway) issues. It really is the only gripe I had with the graphical presentation, and I expect it still won't be fixed on PS5 Pro.
Going by DF's DOOM preview, the new version of id's engine has no pop-in at all. I don't know if they've ported Indiana Jones over to that engine version on PlayStation, but I wouldn't rule it out.
 
It's times like this where I wish I didn't have Game Pass.

Would have bought this for the Pro instead of playing through it on the Series X a few months back.

What a great game, though the ending felt a little lacklustre.
If you got it for free on Game Pass, it shouldn't be such a big deal to pay for it if you like it. I bought the premium version on Steam; worth every penny.

Edit: Ah, I get it, you don't want to do a replay but wish you had that first playthrough on a better version?

I went through it maxed out on PC. No interest in playing the Pro version, but I'm interested to hear what PS-only gamers think about it. First-person isn't typically a popular genre on PS, and a late release always gets a muted reception, but I still think it'll get some positive talk unless people drop it fast. It was a slow burner for me, it grew on me; by the time I was in the swamp it was my GOTY, and the late-game areas were awesome imo. I liked the ending. Like someone said, it was like going through the best unreleased Indiana Jones movie.
 
From what I know (give me a break if I'm wrong):

🧼 Resolution – Who's Got the Cleanest Look ?


  • PS5 Pro: Finally, a console that doesn't squint. Dynamic 4K in Quality Mode, floating around 1800p+ most of the time. Performance Mode? Still looking good at a steady ~1620p with fancy upscaling. Everything's crisp — Indy's leather jacket? Glorious. Dust particles? Chef's kiss.
  • PS5 (Base): The image is… fine. Quality Mode runs at 1440p and does its best cosplay of 4K with some temporal tricks. Performance Mode though? 1080p-ish. Soft edges, occasional shimmer, the works. If you're playing on a 4K TV, you'll know this isn't the real deal.
  • Xbox Series X: Slight edge over base PS5 in sharpness. Quality Mode hovers around 1620p; Performance dips to just above 1180p. Image reconstruction looks better than PS5's, but not quite PS5 Pro levels of sparkle.

🥇Winner: PS5 Pro. Cleanest image, highest floor resolution, least fuzz.




🚀 Frame Rate – Indy at 60fps, or Indiana Slowpoke ?


  • PS5 Pro: Both modes target 60fps, and it mostly stays there. Performance Mode is buttery. Even Quality Mode hits that 60fps target in 90% of scenarios — temples, explosions, Nazi-punching — it holds up.
  • PS5: Performance Mode tries its best but dips under 60fps often, especially when things get spicy. Quality Mode locks at 30fps and hopes you didn't notice. Spoiler: you will.
  • Xbox Series X: Performance Mode is more stable than PS5 base, but still has moments of frame drops. Quality Mode = 30fps cinematic... eh.

🥇Winner: PS5 Pro again. Smooth like Indy's hat flip.




🧵 Textures – Indy's Jacket Deserves Respect


  • PS5 Pro: 4K textures. No pop-in. Everything's sharp, from ancient ruins to forehead wrinkles. This is the version where you'll actually see the "Great Circle" carved into stone instead of guessing.
  • PS5: Textures are solid but with some LOD weirdness. Quick camera turns can trigger "whoops, forgot to load that wall" moments.
  • Xbox Series X: Slightly better than PS5 with texture streaming, but still prone to the occasional blurry mess mid-sprint.

🥇Winner: PS5 Pro. Feels like you've cleaned the dust off history itself.




🔥 Lighting, Shadows & Ray Tracing – The Real Tomb Glow-Up


  • PS5 Pro: Ray-traced shadows and ambient occlusion in Quality Mode ? Yep. It's glorious. Torches flicker realistically, caves feel like caves, and Indy actually casts a shadow that doesn't look like Play-Doh.
  • PS5: No RT. It fakes it with screen-space lighting, and it shows. Shadows are less dynamic, and GI is more "Saturday morning cartoon" than "cinematic thriller".
  • Xbox Series X: Some RT features, but nowhere near PS5 Pro levels. Better than PS5, but still missing that "wow" when light filters into a tomb.

🥇Winner: PS5 Pro. It's not even close. This is where the Pro earns its stripes.




🧠 Physics & Simulation – Because Indy Needs His Hat to Move Too


  • PS5 Pro: Better CPU and bandwidth = better AI, cloth physics, and destructibility. Ropes swing, debris reacts, and enemy guards seem a little less dumb. Not groundbreaking, but noticeable.
  • PS5 / XSX: It's all functional, but with simplified physics in chaotic moments. You'll notice it if you're paying attention… or if you replay the same scene side-by-side.

🥇Winner: PS5 Pro. Better chaos is still better.




🏁 Final Thoughts – Should You Bother on PS5 or Xbox?


Category | 🥇 PS5 Pro | PS5 Base | Xbox Series X
Resolution | 👍 Dynamic 4K | 😬 1440p-ish | 👌 1620p
Frame Rate | 🚀 60fps mostly solid | 😩 Wobbly 60/30 | 🤏 Slightly better
Textures | 🔍 4K sharpness | 🧵 Some pop-in | 📦 Better streaming
Ray Tracing | 💡 Full RT shadows | 💡 None | 💡 Limited
Lighting/Atmosphere | 🎥 Cinematic RT | 🕹️ Fake GI | 😐 Middle ground

🔚 Conclusion: If you've got a PS5 Pro, this is one of those games that makes the upgrade feel justified. Full ray tracing, stable 60fps, sharper visuals, and overall smoother presentation. On base PS5 or Series X ? You can still enjoy the adventure, but it's definitely not the same tier of visual fidelity.
 
I was hoping for a 1440p PSSR solution that allowed for full Path Tracing.


Game is like a generational leap when Path Tracing is enabled.

You can keep on hoping even when the PS6 comes out. Path tracing is a no-go for the 9070 XT in any game. Maybe in Cyberpunk 2077 you can do it with low enough settings and resolution.
 
I finished it on Steam (bought at full price) and it was dropping to 3 fps every time it went 1 MB above the VRAM limit... and it fucking stays there until you go to settings and toggle some setting back and forth to reset the VRAM.
The game can run at 100 fps, then a 1 MB VRAM spill and it runs at 3 fps. It's a disgrace of programming.
On top of that I had constant crashes, and I could not enable RT because I only have 10 GB on my 3080.

So if I had known all that... I would've waited for the Pro version. No way in hell it's native 4K 60 with RT though. I don't believe that one bit.
no it's not

you are just refusing to use the appropriate texture streaming settings for your GPU. the game literally tells you which texture streaming setting to use and which settings to keep at low or medium to ensure you never go over the VRAM limit (for example, you have to set hair quality to medium on 10 GB GPUs, and it will still look great). the game has incredibly efficient, super optimized texture streaming that always respects those limits. only at the medium texture streaming setting might you notice some poor textures. at anything high or above it will always load high quality textures around you, and it should be fine on a 10 GB GPU unless you refuse to lower the other settings it specifically asks you to lower for VRAM reasons. as long as you listen to what the settings tell you, you will have a smooth experience. it is a GREAT example of a highly optimized game

it is your problem that you're trying to get more out of something limited. the whole point of texture caching is to get better 1% lows when you have more VRAM, thanks to less data streaming later in a level or in later parts of the map. you just keep resetting the cache by changing settings, think it is a bug when it happens again, expect the game to fix itself, and then complain it drops to 3 FPS when the game expects to have a bigger VRAM cache. the fix you want is already there: reducing the texture cache. all you are doing is resetting the cache. the game tells you what to do and what the setting is for; you refuse to understand

if the game chose the appropriate texture streaming quality for you and never gave you the option, you wouldn't even know or notice (like Star Wars Outlaws and some other games do, for example)

honestly, get off PC gaming, it is just not for you. you tell people "this is PC gaming!! you should be able to optimize settings for your hardware!!" and then refuse to respect the game's highly optimized, competent texture streamer and try to use settings higher than what your VRAM is capable of, despite the game's in-game settings warning you. just because it runs fine when you first launch the game, or when you change settings to reset the cache, doesn't change the fact that you don't have enough VRAM for those settings. this happens all the time with all games; I can name at least 20 games that run great at 4K with 8 GB of VRAM initially but break down later on. it is just a VRAM limitation, it has nothing to do with the game being a "disgrace". texture caching is there for a reason.

the only reason the option is there is to give people flexibility, but you take that flexibility and use it against the game as a "disgrace". no. people with a 3080 who play at 1080p or 1440p can use a higher texture cache setting; that's the only reason you can change that setting at all.

even if you wanted the game to pick the option for you, the result wouldn't be different. it is literally telling you what to do. same with AC Shadows (it specifically tells you to use the medium texture streaming setting in that game if you have a GPU with less than 12 GB of VRAM). and guess what, it runs fine on my 8 GB GPU... for 20 minutes. then it stutters and drops frames. I use medium and it is all fine, for hours. and I still get nice quality textures because it is still a competent texture streamer

I played that game at 1440p on my 3070, getting 75+ FPS the whole time with texture streaming set to medium and hair quality set to low (the rest of the settings were high to very high). I didn't experience drops to 3 FPS, not even once.

also, disable Steam's hardware acceleration if you want to free up more VRAM for the game to use
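
For what it's worth, here is a toy sketch of the budget idea being described (purely illustrative, with invented numbers and names; this is not idTech's actual streamer): the cache evicts least-recently-used textures before it ever spills past its budget, and lowering the streaming setting just shrinks that budget.

```python
# Toy LRU texture cache with a fixed VRAM budget. Illustrative only.
from collections import OrderedDict

class TextureCache:
    def __init__(self, budget_mb):
        self.budget_mb = budget_mb
        self.resident = OrderedDict()   # texture name -> size in MB, in LRU order
        self.used_mb = 0

    def request(self, name, size_mb):
        if name in self.resident:
            self.resident.move_to_end(name)       # mark as recently used
            return
        # Evict least-recently-used textures until the new one fits the budget.
        while self.used_mb + size_mb > self.budget_mb and self.resident:
            _, evicted_size = self.resident.popitem(last=False)
            self.used_mb -= evicted_size
        if self.used_mb + size_mb <= self.budget_mb:
            self.resident[name] = size_mb
            self.used_mb += size_mb
        # else: keep a lower mip resident instead; spilling past VRAM into
        # system memory is what produces the kind of slowdown described above.

cache = TextureCache(budget_mb=2048)            # pretend this is the "medium" budget
for tex, size in [("indy_jacket", 512), ("temple_wall", 768), ("swamp_foliage", 1024)]:
    cache.request(tex, size)
print(cache.used_mb, list(cache.resident))      # stays within the 2048 MB budget
```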
 
no it's not

you are just refusing to use the appropriate texture streaming settings for your GPU. the game literally tells you which texture streaming setting to use and which settings to keep at low or medium to ensure you never go over the VRAM limit (for example, you have to set hair quality to medium on 10 GB GPUs, and it will still look great). the game has incredibly efficient, super optimized texture streaming that always respects those limits. only at the medium texture streaming setting might you notice some poor textures. at anything high or above it will always load high quality textures around you, and it should be fine on a 10 GB GPU unless you refuse to lower the other settings it specifically asks you to lower for VRAM reasons. as long as you listen to what the settings tell you, you will have a smooth experience. it is a GREAT example of a highly optimized game

it is your problem that you're trying to get more out of something limited. the whole point of texture caching is to get better 1% lows when you have more VRAM, thanks to less data streaming later in a level or in later parts of the map. you just keep resetting the cache by changing settings, think it is a bug when it happens again, expect the game to fix itself, and then complain it drops to 3 FPS when the game expects to have a bigger VRAM cache. the fix you want is already there: reducing the texture cache. all you are doing is resetting the cache. the game tells you what to do and what the setting is for; you refuse to understand

if the game chose the appropriate texture streaming quality for you and never gave you the option, you wouldn't even know or notice (like Star Wars Outlaws and some other games do, for example)

honestly, get off PC gaming, it is just not for you. you tell people "this is PC gaming!! you should be able to optimize settings for your hardware!!" and then refuse to respect the game's highly optimized, competent texture streamer and try to use settings higher than what your VRAM is capable of, despite the game's in-game settings warning you. just because it runs fine when you first launch the game, or when you change settings to reset the cache, doesn't change the fact that you don't have enough VRAM for those settings. this happens all the time with all games; I can name at least 20 games that run great at 4K with 8 GB of VRAM initially but break down later on. it is just a VRAM limitation, it has nothing to do with the game being a "disgrace". texture caching is there for a reason.

the only reason the option is there is to give people flexibility, but you take that flexibility and use it against the game as a "disgrace". no. people with a 3080 who play at 1080p or 1440p can use a higher texture cache setting; that's the only reason you can change that setting at all.

even if you wanted the game to pick the option for you, the result wouldn't be different. it is literally telling you what to do. same with AC Shadows (it specifically tells you to use the medium texture streaming setting in that game if you have a GPU with less than 12 GB of VRAM). and guess what, it runs fine on my 8 GB GPU... for 20 minutes. then it stutters and drops frames. I use medium and it is all fine, for hours. and I still get nice quality textures because it is still a competent texture streamer
I used the medium texture streaming setting... and played at fucking 1440p... and then with DLSS. And it would still go over VRAM occasionally.
The game was coded by ants. I was also playing with lower shadows because that's another VRAM-heavy setting. And no PT, of course.
It's not normal for a game to go from a cozy, smooth 70 fps to 3 because it's 10 MB over the VRAM limit, no matter what settings I am using. The user is never wrong.

And really? You are telling me to get off PC because a game is coded like shit? I told you, all you have to do is go into the options and switch some setting off and on and it will free the VRAM.
Fuck your high-horse PC gaming mentality, man. It's fucked that you are blaming a user. Especially a lifelong one. I've been playing on PC since 1997, you asshole, and I will continue to complain about it whenever I see fit.
I hate it when any of the PC guys on this forum start attacking me and my PC credentials because a game is trash. This is the biggest trigger for me. Just c'mon, have some fucking self-respect.
It's fine that there is a "cache" setting, but it doesn't work right and doesn't tell me how much it caches. The game KNOWS how much VRAM I have. There are zero reasons for it to go above that and not have any other measure to avoid it.

It's not "my problem".
Stop fucking blaming me for the poor state of PC gaming and some amateur programmers. I know my way around PC settings and everything around PCs better than any of you so-called PC guys here. I've seen it all.
I was there when 3D accelerators were created. I still have my fucking Monster 3D.
I came all the way from S3 and 3dfx at 640x480 to where we are today. There is nothing I don't know about PC gaming. And I know for sure that if a game is allocating more VRAM than it should, it shouldn't fucking drop dead to 3 fps in an instant.
And the fact that even the pause menu runs at 3 fps is a joke.
THIS is the reason I like to play on consoles: because no game drops to 3 fps and then stays there until I do something.
And the VRAM thing in Indiana Jones? It happens on every GPU, not just the 10 GB 3080. It's just the same on 16 GB cards too if you use PT.
It's their fault and their fault alone. I am just a dude, playing a game on the medium streaming cache like the game's VRAM usage bar recommended, and it still goes to hell.
It shouldn't be expected of me to read Steam forums and the PC Gaming Wiki to learn that the VRAM usage bar is cheating on me. I have RTSS running right there and I see what's happening.
 
it never did for me on my 3070, seems like a you problem

have a good day
sure... you probably played at 1080p with DLSS and then on lower settings. We would need to compare whole systems and settings, and I finished the game at release and forgot about it. It was not too good anyway. Very forgettable.
And the 3070 has 12GB of VRAM.

So yeah, of course it's my fault and you are a genius who manages to have NO PROBLEMS at all, and somehow my experience is anecdotal while your experience is proof that the game is fine.
Ignoring my whole post because "I have no issues".
Like... eat the bag of dicks you rode this morning and choke on it.

Always the same deal with you guys.
You are the smart ones - I am the idiot.
You have a good experience - FACT, PC GOOD.
I have a bad experience - anecdotal, no evidence, I am stupid. None of my 30 years of experience matters...
Not that any of that matters, because it's fucking video games.
It only looks like rocket science to you because of how proud you are of figuring out basic things. None of this is complicated. None of this is difficult..... well, except for streamers. Somehow those idiots still don't even know what DLSS is.
 
it never did for me on my 3070, seems like a you problem

have a good day
also "have a good day" ?! really?
You really think you won some argument here kid?
"nice. I got him! that will show him!"

You are this petty?
I described you my whole argument. Experience with the game .... THAT I BOUGHT AND FINISHED... and you ignored it because in your head, pc gaming is some rocket science and I don't understand it?
edit: whatever
GOOD LUCK AND HAVE A NICE DAY
 
sure... you probably played at 1080p with DLSS and then on lower settings. We would need to compare whole systems and settings, and I finished the game at release and forgot about it. It was not too good anyway. Very forgettable.
And the 3070 has 12GB of VRAM.
no, I played at 1440p DLSS Quality with medium texture streaming and hair quality set to low, global illumination and shadows set to medium, and the rest of the settings high with some set to very high
and my GPU has 8 GB VRAM

I played for hours and never got any FPS drops. I can, if you want, prove it with a one-hour benchmark video (at the settings I talked about above).
 
no, I played at 1440p DLSS Quality with medium texture streaming and hair quality set to low, shadows set to medium, and the rest of the settings high with some set to very high
and my GPU has 8 GB VRAM

I played for hours and never got any FPS drops. I can, if you want, prove it with a one-hour benchmark video (at the settings I talked about above).
I believe you.
The problem is that you don't believe me and say "it's a you problem",
meanwhile this is well documented, even in the DF video.

edit: It was also day one, so it was more bugged. And I played it before I upgraded to the 5700X3D, but that doesn't change the VRAM situation
 
From what I know (give me a break if I'm wrong):

🧼 Resolution – Who's Got the Cleanest Look ?


  • PS5 Pro: Finally, a console that doesn't squint. Dynamic 4K in Quality Mode, floating around 1800p+ most of the time. Performance Mode? Still looking good at a steady ~1620p with fancy upscaling. Everything's crisp — Indy's leather jacket? Glorious. Dust particles? Chef's kiss.
  • PS5 (Base): The image is… fine. Quality Mode runs at 1440p and does its best cosplay of 4K with some temporal tricks. Performance Mode though? 1080p-ish. Soft edges, occasional shimmer, the works. If you're playing on a 4K TV, you'll know this isn't the real deal.
  • Xbox Series X: Slight edge over base PS5 in sharpness. Quality Mode hovers around 1620p; Performance dips to just above 1180p. Image reconstruction looks better than PS5's, but not quite PS5 Pro levels of sparkle.

🥇Winner: PS5 Pro. Cleanest image, highest floor resolution, least fuzz.




🚀 Frame Rate – Indy at 60fps, or Indiana Slowpoke ?


  • PS5 Pro: Both modes target 60fps, and it mostly stays there. Performance Mode is buttery. Even Quality Mode hits that 60fps target in 90% of scenarios — temples, explosions, Nazi-punching — it holds up.
  • PS5: Performance Mode tries its best but dips under 60fps often, especially when things get spicy. Quality Mode locks at 30fps and hopes you didn't notice. Spoiler: you will.
  • Xbox Series X: Performance Mode is more stable than PS5 base, but still has moments of frame drops. Quality Mode = 30fps cinematic... eh.

🥇Winner: PS5 Pro again. Smooth like Indy's hat flip.




🧵 Textures – Indy's Jacket Deserves Respect


  • PS5 Pro: 4K textures. No pop-in. Everything's sharp, from ancient ruins to forehead wrinkles. This is the version where you'll actually see the "Great Circle" carved into stone instead of guessing.
  • PS5: Textures are solid but with some LOD weirdness. Quick camera turns can trigger "whoops, forgot to load that wall" moments.
  • Xbox Series X: Slightly better than PS5 with texture streaming, but still prone to the occasional blurry mess mid-sprint.

🥇Winner: PS5 Pro. Feels like you've cleaned the dust off history itself.




🔥 Lighting, Shadows & Ray Tracing – The Real Tomb Glow-Up


  • PS5 Pro: Ray-traced shadows and ambient occlusion in Quality Mode ? Yep. It's glorious. Torches flicker realistically, caves feel like caves, and Indy actually casts a shadow that doesn't look like Play-Doh.
  • PS5: No RT. It fakes it with screen-space lighting, and it shows. Shadows are less dynamic, and GI is more "Saturday morning cartoon" than "cinematic thriller".
  • Xbox Series X: Some RT features, but nowhere near PS5 Pro levels. Better than PS5, but still missing that "wow" when light filters into a tomb.

🥇Winner: PS5 Pro. It's not even close. This is where the Pro earns its stripes.




🧠 Physics & Simulation – Because Indy Needs His Hat to Move Too


  • PS5 Pro: Better CPU and bandwidth = better AI, cloth physics, and destructibility. Ropes swing, debris reacts, and enemy guards seem a little less dumb. Not groundbreaking, but noticeable.
  • PS5 / XSX: It's all functional, but with simplified physics in chaotic moments. You'll notice it if you're paying attention… or if you replay the same scene side-by-side.

🥇Winner: PS5 Pro. Better chaos is still better.




🏁 Final Thoughts – Should You Bother on PS5 or Xbox?


Category | 🥇 PS5 Pro | PS5 Base | Xbox Series X
Resolution | 👍 Dynamic 4K | 😬 1440p-ish | 👌 1620p
Frame Rate | 🚀 60fps mostly solid | 😩 Wobbly 60/30 | 🤏 Slightly better
Textures | 🔍 4K sharpness | 🧵 Some pop-in | 📦 Better streaming
Ray Tracing | 💡 Full RT shadows | 💡 None | 💡 Limited
Lighting/Atmosphere | 🎥 Cinematic RT | 🕹️ Fake GI | 😐 Middle ground

🔚 Conclusion: If you've got a PS5 Pro, this is one of those games that makes the upgrade feel justified. Full ray tracing, stable 60fps, sharper visuals, and overall smoother presentation. On base PS5 or Series X ? You can still enjoy the adventure, but it's definitely not the same tier of visual fidelity.
What are the sources for this? Back your claim up ASAP, please. The base PS5 info in particular doesn't make sense at all, unless there is something fundamentally wrong with that version.
 
From what I know (give me a break if I'm wrong):

🧼 Resolution – Who's Got the Cleanest Look ?


  • PS5 Pro: Finally, a console that doesn't squint. Dynamic 4K in Quality Mode, floating around 1800p+ most of the time. Performance Mode? Still looking good at a steady ~1620p with fancy upscaling. Everything's crisp — Indy's leather jacket? Glorious. Dust particles? Chef's kiss.
  • PS5 (Base): The image is… fine. Quality Mode runs at 1440p and does its best cosplay of 4K with some temporal tricks. Performance Mode though? 1080p-ish. Soft edges, occasional shimmer, the works. If you're playing on a 4K TV, you'll know this isn't the real deal.
  • Xbox Series X: Slight edge over base PS5 in sharpness. Quality Mode hovers around 1620p; Performance dips to just above 1180p. Image reconstruction looks better than PS5's, but not quite PS5 Pro levels of sparkle.

🥇Winner: PS5 Pro. Cleanest image, highest floor resolution, least fuzz.




🚀 Frame Rate – Indy at 60fps, or Indiana Slowpoke ?


  • PS5 Pro: Both modes target 60fps, and it mostly stays there. Performance Mode is buttery. Even Quality Mode hits that 60fps target in 90% of scenarios — temples, explosions, Nazi-punching — it holds up.
  • PS5: Performance Mode tries its best but dips under 60fps often, especially when things get spicy. Quality Mode locks at 30fps and hopes you didn't notice. Spoiler: you will.
  • Xbox Series X: Performance Mode is more stable than PS5 base, but still has moments of frame drops. Quality Mode = 30fps cinematic... eh.

🥇Winner: PS5 Pro again. Smooth like Indy's hat flip.




🧵 Textures – Indy's Jacket Deserves Respect


  • PS5 Pro: 4K textures. No pop-in. Everything's sharp, from ancient ruins to forehead wrinkles. This is the version where you'll actually see the "Great Circle" carved into stone instead of guessing.
  • PS5: Textures are solid but with some LOD weirdness. Quick camera turns can trigger "whoops, forgot to load that wall" moments.
  • Xbox Series X: Slightly better than PS5 with texture streaming, but still prone to the occasional blurry mess mid-sprint.

🥇Winner: PS5 Pro. Feels like you've cleaned the dust off history itself.




🔥 Lighting, Shadows & Ray Tracing – The Real Tomb Glow-Up


  • PS5 Pro: Ray-traced shadows and ambient occlusion in Quality Mode ? Yep. It's glorious. Torches flicker realistically, caves feel like caves, and Indy actually casts a shadow that doesn't look like Play-Doh.
  • PS5: No RT. It fakes it with screen-space lighting, and it shows. Shadows are less dynamic, and GI is more "Saturday morning cartoon" than "cinematic thriller".
  • Xbox Series X: Some RT features, but nowhere near PS5 Pro levels. Better than PS5, but still missing that "wow" when light filters into a tomb.

🥇Winner: PS5 Pro. It's not even close. This is where the Pro earns its stripes.




🧠 Physics & Simulation – Because Indy Needs His Hat to Move Too


  • PS5 Pro: Better CPU and bandwidth = better AI, cloth physics, and destructibility. Ropes swing, debris reacts, and enemy guards seem a little less dumb. Not groundbreaking, but noticeable.
  • PS5 / XSX: It's all functional, but with simplified physics in chaotic moments. You'll notice it if you're paying attention… or if you replay the same scene side-by-side.

🥇Winner: PS5 Pro. Better chaos is still better.




🏁 Final Thoughts – Should You Bother on PS5 or Xbox?


Category | 🥇 PS5 Pro | PS5 Base | Xbox Series X
Resolution | 👍 Dynamic 4K | 😬 1440p-ish | 👌 1620p
Frame Rate | 🚀 60fps mostly solid | 😩 Wobbly 60/30 | 🤏 Slightly better
Textures | 🔍 4K sharpness | 🧵 Some pop-in | 📦 Better streaming
Ray Tracing | 💡 Full RT shadows | 💡 None | 💡 Limited
Lighting/Atmosphere | 🎥 Cinematic RT | 🕹️ Fake GI | 😐 Middle ground

🔚 Conclusion: If you've got a PS5 Pro, this is one of those games that makes the upgrade feel justified. Full ray tracing, stable 60fps, sharper visuals, and overall smoother presentation. On base PS5 or Series X ? You can still enjoy the adventure, but it's definitely not the same tier of visual fidelity.

so... uhm... all of this is made up; verifiably so, given that we have hard numbers for the Xbox version.

there are no graphics modes on Xbox, and there is no dynamic resolution on Xbox; it's just a locked 1800p at 60fps.
there is no way to play the game without ray tracing at all, as the game has no rasterised fallback for GI, which is why even the Series S has RTGI.
The PS5 Pro has the exact same amount of memory as the Series X and PS5, so why would there be texture pop-in differences or texture quality differences between them? there's no logic to that.
and wtf are you talking about with that physics bullshit?...

there is so much wrong with that comment that it should be obvious it's all made up within the first paragraph.

so why the actual fuck did you put so much effort into a post that is entirely bullshit and made up? literally not a single data point in that table is even remotely close to being correct. the most hilariously absurd part being the physics section... like wtf?
 
And the VRAM thing in Indiana Jones? It happens on every GPU, not just the 10 GB 3080. It's just the same on 16 GB cards too if you use PT.
It never happened on my 4090, but yeah, the 4080 Super choked just like you said when using path tracing, 16GB there. I too have expressed anger about that many times.

But after tweaking the settings the extreme drops went away for me. It still looked great and played great. No need for long rants about it, imo.

And I played it before I upgraded to the 5700X3D, but that doesn't change the VRAM situation

The VRAM situation is what it is, but I experienced some framerate instability on another PC with a 4090 and no X3D CPU. I thought the G-Sync tech was malfunctioning; I got micro-stutter when rotating the camera.
Upgraded the CPU recently and checked Indy today, and now it's absolutely butter smooth. So I assume the instability was the 1% lows people talk about.

So, I don't know your general experience with the game, but just a heads up: it could be worth starting it up and having a spin just to check how your X3D upgrade affects things. There seem to have been some updates since last year; I noticed there was another ray tracing setting added too, ray reconstruction for all lights. Cranked it on, still butter smooth. I need to check what it's about, torches and muzzle flashes perhaps?
 
Forbes, Push Square, The Verge.

so I assume Forbes and Push Square are your left and right ass cheeks, and The Verge is what you call your chocolate starfish...
because that's where you got your info from, if you ask me.

not a word of what you said makes even a speck of sense. and even though I expect less than the bare minimum from any of the outlets mentioned, I doubt even they are too stupid to get the correct data, at least for the Xbox version, given that that data is easily available online.
 
Some people here thinking the PS5 Pro could run the game with pathtracing and playable FPS are mildly delusional.
I can't even hit 60 FPS without FG at 1440P DLSS Q with my 4090 in some levels, especially at the end on the boat.
They have no technical knowledge of how this stuff works. Once they hear "Pro", they think they have a 5090.
 