
Starfield PC - Digital Foundry Tech Review - Best Settings, Xbox Series X Comparisons + More

sinnergy

Member
But then the engine boggles the mind





The persistence and physics of objects are next gen

Is it worth the cost for how it looks? I don’t think so personally. But what a weird fucking flex. Imagine the engine engineers spending all those coding resources on something only hoarders will ever notice how neat it is. 99% of people will never see it.

But Starfield looks fine for what it is? Great, even, most of the time.
 
Last edited:

Bojji

Member
You do realize that neither AMD, NVIDIA nor Intel optimizes games. Their sponsorships are about implementing things like FSR, DLSS, XeSS, CACAO, RTX, etc.
There are plenty of games that are very well optimized that were sponsored by AMD. For example, Dead Island 2 is probably the best-optimized UE4 game in several years.
And there are also several games sponsored by Nvidia that are poorly optimized, like BF2042, Redfall, Gollum, Warhammer 40K: Darktide, CP 2077, etc...

Dead Island being UE4 saved it from bad performance on Nvidia.

Usually AMD-sponsored games are optimized for AMD hardware, mostly thanks to AMD hardware being in the consoles. Developers don't care much how their games run on hardware from other vendors, while without any sponsorship games usually land where the relative performance charts put them, with the 3080 roughly equal to the 6800 XT.

Nvidia has crap releases too (some games on your list are just really bad in general), but Cyberpunk, for example, has run quite strong on RDNA without RT from day one. CDPR also implements every tech available from all vendors.
 

SABRE220

Member
But then the engine boggles the mind





The persistence and physics of objects are next gen

Is it worth the cost for how it looks? I don’t think so personally. But what a weird fucking flex. Imagine the engine engineers spending all those coding resources on something only hoarders will ever notice how neat it is. 99% of people will never see it.

Again, this can be replicated in Skyrim and Oblivion, so how is this next gen?
 

Danknugz

Member
"- AMD GPUs run MUCH better than Intel/NVIDIA GPUs, outperforming their counterparts by up to 40%. The 6800 XT for instance consistently outperforms the 3080 by over 25%"

This shit is just ridiculous. Fuck AMD and fuck Bethesda for partnering with this cancer.

Nvidia is greedy, but what AMD is doing is downright criminal
You sure about that? I think AMD just proved to everyone that their cards are vastly superior at a much cheaper price. I for one am convinced.
 

Hugare

Member
You sure about that? I think AMD just proved to everyone that their cards are vastly superior at a much cheaper price. I for one am convinced.
You don't think it's curious that this is the only game where AMD cards outperform their counterparts by 40%? They're often losing, even.

If that's enough to convince you, well... OK.
 

winjer

Member
Dead Island being UE4 saved it from bad performance on Nvidia.

Usually AMD-sponsored games are optimized for AMD hardware, mostly thanks to AMD hardware being in the consoles. Developers don't care much how their games run on hardware from other vendors, while without any sponsorship games usually land where the relative performance charts put them, with the 3080 roughly equal to the 6800 XT.

Most games run very similarly on the 3080 and 6800 XT, whether sponsored by either brand.
Starfield is an exception. But this case seems to be similar to Forza 5, a game that for almost a year ran much better on AMD's GPUs.
But that wasn't because of any sponsorship. It was because Nvidia didn't whitelist the game for ReBAR in their drivers.
The moment Nvidia did that, there was parity between Nvidia and AMD.
And from what we see from people who have enabled ReBAR on Nvidia GPUs in Starfield, it can gain ~20% performance, which again means parity between AMD and Nvidia.
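For reference, one hedged way to check whether ReBAR is actually active on an Nvidia card is to compare the reported BAR1 size against total VRAM: with Resizable BAR on, BAR1 spans the whole frame buffer instead of the usual ~256 MiB window. A minimal sketch, assuming the `nvidia-smi -q -d MEMORY` report lists the frame buffer total before the BAR1 total (the layout can differ between driver versions):

```python
import re
import subprocess

# Sketch: parse `nvidia-smi -q -d MEMORY` and compare BAR1 size to total VRAM.
# With Resizable BAR active, BAR1 covers the whole frame buffer instead of ~256 MiB.
report = subprocess.run(
    ["nvidia-smi", "-q", "-d", "MEMORY"],
    capture_output=True, text=True, check=True,
).stdout

totals = [int(m) for m in re.findall(r"Total\s+:\s+(\d+)\s+MiB", report)]
if len(totals) >= 2:
    fb_total, bar1_total = totals[0], totals[1]  # assumes the FB section precedes the BAR1 section
    rebar_on = bar1_total >= 0.9 * fb_total
    print(f"VRAM: {fb_total} MiB, BAR1: {bar1_total} MiB -> ReBAR likely {'ON' if rebar_on else 'OFF'}")
else:
    print("Could not parse nvidia-smi output; check the BAR1 Memory Usage section manually.")
```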

Nvidia has crap releases too (some games on your list are just really bad in general), but Cyberpunk, for example, has run quite strong on RDNA without RT from day one. CDPR also implements every tech available from all vendors.

Not really. CP2077 and CDPR give preferential treatment to Nvidia. Any time Nvidia releases new tech, for example a new version of DLSS or RTX, CDPR implements it.
On the other hand, the game is still using FSR 2.1, despite version 2.2 having been released a year ago. And it's still using XeSS 1.1.
 

Buggy Loop

Member
Again, this can be replicated in Skyrim and Oblivion, so how is this next gen?

Does it? With no mods? I can't recall seeing anything as impressive, but then again, the older games are so old that I've probably been hit by Alzheimer's in the meantime.
 

GymWolf

Member
Forcing anisotropic filtering to 16x only causes shadow issues if the shaders are already compiled; you just need to delete them and there will be no shadow issue.
I've forced it globally in the drivers forever and never saw that issue in Starfield.

Also no mention of forcing ReBAR on Nvidia cards, which could narrow the performance gap against equivalent AMD cards; the Game Ready Nvidia driver was not that ready in the end.
How do I delete the shaders?
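A minimal sketch of one way to clear them on Windows, assuming the usual NVIDIA DirectX shader-cache locations (paths vary by driver version; the game simply rebuilds its shaders on the next launch):

```python
import os
import shutil
from pathlib import Path

# Assumed default cache locations on Windows; adjust if your driver stores them elsewhere.
local_appdata = Path(os.environ["LOCALAPPDATA"])
cache_dirs = [
    local_appdata / "NVIDIA" / "DXCache",  # NVIDIA DirectX shader cache
    local_appdata / "D3DSCache",           # Windows Direct3D shader cache
]

for cache in cache_dirs:
    if cache.is_dir():
        shutil.rmtree(cache, ignore_errors=True)  # shaders are rebuilt on the next game launch
        print(f"Cleared {cache}")
    else:
        print(f"Not found: {cache}")
```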
 

Lokaum D+

Member
Again, this can be replicated in Skyrim and Oblivion, so how is this next gen?
I'll show you how this is next gen:
200.gif
 
Last edited:

GymWolf

Member
But then the engine boggles the mind





The persistence and physics of objects are next gen

Is it worth the cost for how it looks? I don’t think so personally. But what a weird fucking flex. Imagine the engine engineers spending all those coding resources on something only hoarders will ever notice how neat it is. 99% of people will never see it.

Do I need to repost the gif of Skyrim with the 10,000 cheese rolls, or the Crysis one with the barrels?

Those examples are made with an editor, or by people wasting dozens of hours picking up 10,000 potatoes; during normal gameplay you are never gonna see any of that.

I'm not sure why some call that next gen when it is at best a neat trick that was possible almost 15 years ago...

Like, at a certain point you have to let the engine go if these are the features that cockblock SO many other features of modern engines.

And from my understanding, there are other methods to achieve object persistence in other engines without completely fucking up other things.
 
Last edited:

Denton

Member
I am playing and just visited some Mars base (in the main quest, like the second or third mission), and I am morbidly fascinated that this is how the game looks there. Someone made this, played this, and decided "yep, that's good enough to ship in 2023", and that someone happens to be the largest multi-trillion-dollar corporation on Earth.

starfieldscreenshot203ein9.png


starfieldscreenshot204kfcr.png



At least the outside and some other bases look better, but still
starfieldscreenshot20k6iyw.png
 

rofif

Can’t Git Gud
Lmao.


Also what are those optimized settings? Mostly low/medium? Starfield for peasants?

johnny depp ew GIF
That's my problem with some PC games.
There are often settings just FOR THE SAKE OF IT.
No visual change whatsoever but some FPS change, or sometimes not even any FPS change.
It's just wasting our time. Why have every setting artificially divided into 5 levels when only low and medium have visible effects?
 

Bojji

Member
Game looks insanely flat on console.

On PC you can at least change the awful colors and have multiple tools to do so. I use Nvidia Freestyle as it's the least intrusive and doesn't change the original content much:

Vanilla:

51LPaTf.jpg


With contrast and exposure tweaks:

nvDyPvT.jpg


Good ole Vaseline Rendering Solution.

The second worst thing ever invented in video game graphics history; the first is chromatic aberration.
 

Panajev2001a

GAF's Pleasant Genius
But then the engine boggles the mind





The persistence and physics of objects are next gen

Is it worth the cost for how it looks? I don’t think so personally. But what a weird fucking flex. Imagine the engine engineers spending all those coding resources on something only hoarders will ever notice how neat it is. 99% of people will never see it.

Persistence (inside each area) yes, but thousands of simple rigid bodies interacting is not the nextest-gen of next-gen IMHO…
 

Zuzu

Member
I can’t wait to see the meltdowns when they inevitably add ray tracing to this engine in future games. The (terrible) performance will be legendary.
 

Zuzu

Member
Didn't expect PC load times to be that much faster than Series X; you'd think they would use the Velocity Architecture / DirectStorage.

Yeah, the loading can be really slow on the Series X at times. I've also had the game freeze completely twice so far on the Series X, plus a full crash to the home screen, so I think it still needs some work. Not sure if they can fix the loading times though.
 
Last edited:

shamoomoo

Member
You would think an exclusive game from an owned studio would take advantage of the platform's tech. Makes for better games, according to Todd.
But it wasn't initially exclusive. Come to think of it, if the PS5 version still existed, it would probably have long load times too.
 
Last edited:
FSR2 is a joke.

The problem is not that FSR is bad; the issues shown are only really noticeable in zoomed-in footage or when the game is slowed down to highlight them, and most people likely won't notice. It's adequate at 4K on Quality in my experience. The problem is that a better option exists that emphasises all the weaknesses of FSR, and I suspect this is why AMD do not generally want DLSS in their sponsored games: it makes FSR look "bad".

People complain about DLSS being proprietary and requiring RTX hardware to run, but this is a big part of what makes it better than AMD's FSR. I can pretty much guarantee that the same will be true of FSR3 frame generation when it releases; it'll be good enough, but it won't be as good as DLSS3. And, of course, NVIDIA have had, what, an 18-month head start on upscaling tech as well, which means that AMD are always going to be playing catch-up.
 

SlimySnake

Flashless at the Golden Globes
Again, this can be replicated in Skyrim and Oblivion, so how is this next gen?
Not at this level of fidelity though, and that's the key. The visual upgrade is massive. The physics interactions have also been vastly improved.

Do I need to repost the gif of Skyrim with the 10,000 cheese rolls, or the Crysis one with the barrels?

No need. Here it is:
j2MtZY.gif

ef0.gif

utV0UCx.gif

D4jjmP2.gif

Xg2TjvJ.gif


If you can't see the generational leap in fidelity in both the visuals and the physics, then I can't help you.
Those examples are made with an editor, or by people wasting dozens of hours picking up 10,000 potatoes; during normal gameplay you are never gonna see any of that.
This I can get behind. Bethesda needs game designers who actually utilize these physics features that are making their games feel so unoptimized.
 

SlimySnake

Flashless at the Golden Globes
FSR2 is a joke.
I played the first 12 hours on FSR and honestly only noticed the shimmering a few times. The game looked phenomenal. Most people don't zoom in like that, and with motion blur on you won't notice the ghosting of NPCs and enemies in motion anyway.

I played with checkerboarding on my PS4 Pro all last gen, up until 2019 when I started playing with DLSS. Checkerboarding is actually worse than FSR, and yet it resolved to a much better image on my 4K TVs than 1080p games like Batman AK and Driveclub did.

In normal gameplay it is really not that bad, especially in Starfield, where you spend most of the time indoors fighting in ships and inside outposts.

That said, DLSS is better and should've been included since day 1. But if someone came to me and asked if they should spend $800 on a 4070 Ti or $500 on a 7800 XT JUST for this game, I would not hesitate to suggest the 7800 XT. Same performance for $300 less? Not even close. FSR2 is a great option for AMD cards and consoles.
 

SlimySnake

Flashless at the Golden Globes
Though nothing came out of this, is anybody else reminded of the Havok physics demo on the PS4?


I think Cerny's words in this very video explain why nothing came out of this. He said it runs primarily on the GPU, and no dev in their right mind would make their game look like Fallout 4 or Skyrim just so they can get physics like that.

This is precisely why almost every game last gen ditched physics and destruction: they would have had to do it on the GPU, and no one wanted to lose the visuals war to other studios.

Same reason everyone ditched real-time GI last gen, including Epic. The Fable devs actually had it working on the base Xbox One and even demo'd it before they were shut down by Phil Spencer. But it came at a cost on the GPU, and devs chose to bake lighting to save on performance and use it elsewhere to make prettier-looking worlds. Static and boring, but pretty.
 

shamoomoo

Member
I think Cerny's words in this very video explain why nothing came out of this. He said it runs primarily on the GPU, and no dev in their right mind would make their game look like Fallout 4 or Skyrim just so they can get physics like that.

This is precisely why almost every game last gen ditched physics and destruction: they would have had to do it on the GPU, and no one wanted to lose the visuals war to other studios.

Same reason everyone ditched real-time GI last gen, including Epic. The Fable devs actually had it working on the base Xbox One and even demo'd it before they were shut down by Phil Spencer. But it came at a cost on the GPU, and devs chose to bake lighting to save on performance and use it elsewhere to make prettier-looking worlds. Static and boring, but pretty.
If a game was going for a Zelda-like look, it could have physics similar to that Havok demo.
 

LiquidMetal14

hide your water-based mammals
I haven't finished the video, but has anyone noticed how the lighting on character models is basically non-existent outside of the prebaked story scenes? I keep looking for proper real-time shadows and am disappointed this doesn't have that. It breaks immersion when you're walking with main characters, and it's off for literally every other interaction with the flashlight.
 
Last edited:

Lokaum D+

Member
Can someone explain to me why the Game Pass version runs worse than Steam?

I can't manage to maintain 30 FPS even at low settings on the Game Pass version, when I'm locked at 30 FPS on medium on Steam.
 

Elysium44

Banned
SF's core design is still old; technically the game is old.

I'm aware; my point is that if even a much-hyped first-party game published by the richest company in the world doesn't utilise one of their own console's big selling points, three years after release, then it isn't a good look.
 

Killer8

Member
As usual, a lot of complaints about the engine ("just use Unreal bro"). I don't particularly see anything wrong with it. Yes, there are rough areas here and there, but the scope of the game is absolutely gigantic, so it's to be expected that some areas will be less developed than others. Overall I find it to be a good-looking game more often than not. The PBR materials in particular look great, and the modelling of everything down to the smallest of items is well done. Just compare your average interior in Fallout 4 to Starfield and it's a gigantic leap forward.

The physics gifs posted above are impressive, but what people also forget about the engine is that it keeps track of the position of everything. You can have 1,000 rolls of toilet paper in a store room, but more importantly they will persist indefinitely. Travel to the other end of the universe and back and they will still be there. It's the reason why save file sizes and loading times bloat over the course of the adventure. The state of everything is preserved, and it's what makes Bethesda's games completely unique in the industry.
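As a purely illustrative sketch of that idea (hypothetical names, not how the Creation Engine actually serializes its saves): the engine only has to record, for every object the player has disturbed, which cell it sits in and its current transform, and append that ever-growing delta to the save, which is exactly why saves bloat the more you hoard.

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical sketch of per-object persistence; names and format are made up.
@dataclass
class ObjectState:
    object_id: int   # unique id of the item (e.g. one toilet paper roll)
    cell: str        # which location/cell the object currently sits in
    position: tuple  # (x, y, z)
    rotation: tuple  # (pitch, yaw, roll)

moved_objects: dict[int, ObjectState] = {}

def record_move(state: ObjectState) -> None:
    """Every object the player disturbs gets (re)recorded and kept forever."""
    moved_objects[state.object_id] = state

def write_save(path: str) -> None:
    """Save size grows with the number of disturbed objects and never shrinks."""
    with open(path, "w") as f:
        json.dump([asdict(s) for s in moved_objects.values()], f)

# 1,000 toilet paper rolls dropped in a store room -> 1,000 extra entries in every save.
for i in range(1000):
    record_move(ObjectState(i, "StoreRoom", (0.0, 0.0, float(i)), (0.0, 0.0, 0.0)))
write_save("quicksave.json")
```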

Phil Spencer really is one of the worst creative managers in the video game business. Dude needs to exit video games and business relations management. Someone at Xbox Game Studios needs to step in and either force Bethesda to use a new in-house engine (id Tech 7+) or go with Unreal Engine or something else new. 21 years after the Creation Engine was created for Morrowind, its time in the sun is over.

This is peak Spencer Derangement Syndrome. That an executive should have any say over the engine used in the game is frankly ludicrous, not least because Xbox only acquired Bethesda in 2020 and Starfield was already 5 years in development by that point. What did you expect Bethesda to do? Scrap everything and move to id Tech 7 because Spencer told them to? Just lmao.

Besides, id Software worked on the engine side of Starfield anyway.
 

acm2000

Member
So there's no slow loading? Why are people saying there is then 🤔
The slow load is just the first time you boot the game from new; after that, Quick Resume takes over, and as far as the game knows it's still on the first boot.

The only time you will ever see that cold boot again is after a full hard crash, which is not common.
 
Last edited:

adamsapple

Or is it just one of Phil's balls in my throat?
So there's no slow loading? Why are people saying there is then 🤔

The first load from the main menu into the game is very long.

Quick Resume is a system-level feature where even if you shut your console down, it resumes exactly where you left off, like a persistent saved state of the game taken before the console was turned off.

You're basically bypassing the initial logo, boot menus and the first long load time.
 
Last edited: