SEGAvangelist
Member
> Lol that awesome 10% gain from 720p to 792p woooww
It worked for Titanfall!
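(For context: the "10%" is per axis; in total pixel count the jump is about 21%. A quick back-of-the-envelope check, assuming 16:9 frames at 1280x720 and 1408x792:)

```python
# Pixel math behind the "10%" jump from 720p to 792p (16:9 assumed for both).
base = 1280 * 720    # 921,600 pixels
bump = 1408 * 792    # 1,115,136 pixels

print(f"per-axis gain:    {792 / 720 - 1:.0%}")    # 10%
print(f"total pixel gain: {bump / base - 1:.0%}")  # 21%
```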
> 720p and can't even hold 60fps.
I think that’s on the devs then.
Man how quickly the dream of this gen being at least 1080p 60fps died.
And this isn't even using Lumen; once we get UE5 games using the full feature set, we'll probably see console games running at 480p 25fps.
Honestly I've yet to see a single next gen game with visuals that seem worth the drop in resolution and framerate. We seem to have reached the point where getting a game to look 10% better needs 200% stronger hardware and it's not worth it.
> They must be using the latest version of Unreal's TSR. TSR > FSR. I wonder if TSR beats XeSS.
Of course TSR is much better than FSR. FSR is a PR hoax.
> 720p is a fucking joke. Maybe decent on a 27" monitor but not on a 55" TV or larger.
It's upscaled to 1440p well enough to compare to PC running at native 1440p, so if no one told you it was native 720p, you would've never known.
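(The screen-size point is easy to quantify: pixel density falls with the diagonal. A minimal sketch, assuming the raw 720p grid were mapped 1:1 onto each display; in practice you see the reconstructed 1440p image, but the detail budget scales the same way:)

```python
import math

# Pixel density of the same 1280x720 grid on a 27" monitor vs a 55" TV.
def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    return math.hypot(width_px, height_px) / diagonal_in

for size in (27, 55):
    print(f'{size}": {ppi(1280, 720, size):.1f} PPI')
# 27": ~54.4 PPI; 55": ~26.7 PPI. The same frame is half as dense on the TV.
```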
It's a creative choice. Not a hardware issue
> Help me @MidGenRefresh gimme my pro
No, how about they actually optimize it properly?!
> It's not. It looks similar in stills. Did you guys just learn of image reconstruction yesterday? In motion is where the problem is.
Have you actually played any of these games recently? FF16 for instance also supposedly drops to 720p, and not once did I feel the image was blurred or had artifacts. Fear mongering is so stupid, go play the games instead of being angry at pixels.
> You forgot to say XSX too, that was probably intentional though.
I don't have an Xbox so I don't keep up with how games run on that.
> Have you actually played any of these games recently? FF16 for instance also supposedly drops to 720p, and not once did I feel the image was blurred or had artifacts. Fear mongering is so stupid, go play the games instead of being angry at pixels.
And I heard people say the exact opposite. This is a technical discussion. I don't care about your feelings and no one is angry at pixels.
> No, how about they actually optimize it properly?!
Well, we see how that's working out the further we get into this generation, so give me more power in my consoles.
> Unreal Engine 3 had performance issues, Unreal Engine 4 had performance issues, Unreal Engine 5 has performance issues. I'm sure they will get it right with Unreal Engine 6.
Seems to work beautifully with Fortnite.
> Seems to work beautifully with Fortnite.
If the only people who can get the engine running well are the ones who created it, maybe there is a problem with the engine. I don't see any of these issues with Decima or Source.
> If the only people who can get the engine running well are the ones who created it, maybe there is a problem with the engine. I don't see any of these issues with Decima or Source.
I don't disagree. But I also don't think devs are likely to take the time to optimize better. Epic has a financial reason to make its tentpole title run and look great on UE5. I'm sure they spent a stupid amount of time and money on the engine. I don't think most devs will take that time; they'll just be OK with letting the console tech and the game engine do the heavy lifting.
Another 720p game in performance mode on PS5 just like FF16. I'm sensing a trend here in 2023...
> That sound you hear is a LOT of people reacquainting themselves with 30fps.
I never stopped. I think I’ve always used quality modes.
Me included.
> I don't disagree. But I also don't think devs are likely to take the time to optimize better. Epic has a financial reason to make its tentpole title run and look great on UE5. I'm sure they spent a stupid amount of time and money on the engine. I don't think most devs will take that time; they'll just be OK with letting the console tech and the game engine do the heavy lifting.
Definitely the reason. Optimization is the last thing devs do and these games are coming in too hot. Plague Tale proved all the devs need is more time to deliver a good performance mode. I blame Sony and Microsoft for not putting better quality standards on game graphics modes.
> Apparently this game can only run slightly above 60fps native at 1440p on a 4090.
Has to be asset quality; some sections really do look pre-rendered, but the visuals as a whole are all over the place, especially the character models.
Ooff.
> What's the base resolution for DLSS Performance @ 1440p?
1280x720
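(That number follows from DLSS 2.x's fixed per-axis scale factors, which to my knowledge are Quality 2/3, Balanced 0.58, Performance 0.5, Ultra Performance 1/3. A quick sketch:)

```python
# Internal render resolution for the standard DLSS 2.x scaling modes
# (per-axis factors: Quality 2/3, Balanced 0.58, Performance 0.5, Ultra Perf 1/3).
FACTORS = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 1 / 3,
}

def internal_res(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    f = FACTORS[mode]
    return round(out_w * f), round(out_h * f)

for mode in FACTORS:
    w, h = internal_res(2560, 1440, mode)
    print(f"{mode:>17}: {w}x{h}")
# Performance at 1440p output -> 1280x720, matching the answer above.
```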
> I was just watching the video and was very surprised when I heard 792p on balanced. I never would have guessed. Impressive upscaling.
In still shots, sure, and likely a reason why you can't disable motion blur.
> In still shots, sure, and likely a reason why you can't disable motion blur.
The video is a still shot?
Maybe the game is good, but going strictly on adamapple's recap, game's technical specs and performance sound major shit-o-la.
> In still shots, sure, and likely a reason why you can't disable motion blur.
Weird, I just played the game for a few hours and it has a motion blur toggle. Did DF test the game with a previous patch?
> Weird, I just played the game for a few hours and it has a motion blur toggle. Did DF test the game with a previous patch?
Did they already fix it? Would be nice if they did.
> Is this some sort of group selective ignorance thing or just the new kinda trolling?
> Reconstruction is by now a consistent and proven tech, and we all should know how that stuff works. It's not 720/792/12xx whatever. It's a reconstructed 1440p with any of those internal resolutions.
> Anyone coming into these threads and saying what you guys are saying, the way you guys are saying it, is just trolling. Because the end product doesn't in any shape or form look like a 720p game...
> In the fucking video they used a 4090, with high settings and the game running at native 1440p on PC, to show this fact, for crying out loud.
The issue is that consoles aren't running DLSS. And FSR 2.x just doesn't do well reconstructing from such a low res, especially in motion.
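(Both posts are consistent with how temporal upscalers like TSR, FSR 2 and DLSS behave. A toy model, assuming one jittered sample per internal pixel per frame and perfect history reuse: a static shot really can resolve to the full 1440p grid after a few frames, while motion forces the history to be thrown away, dropping detail back toward the internal resolution.)

```python
import math

# Toy model of temporal reconstruction: with jittered sampling and perfect
# history reuse (a static scene), N frames of internal-res samples can cover
# the full output grid. Illustrative only: in motion, history gets rejected
# and quality falls back toward the internal resolution.
def frames_to_cover(in_res, out_res):
    return math.ceil((out_res[0] * out_res[1]) / (in_res[0] * in_res[1]))

print(frames_to_cover((1280, 720), (2560, 1440)))  # 4 frames: 720p -> 1440p
print(frames_to_cover((1408, 792), (2560, 1440)))  # 4 frames: 792p -> 1440p
```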
> Playing the game, it's a really pretty game. I'm playing on ultra and haven't experienced the console version, but the different environments are all vastly different and beautiful in their own right, and the image is extremely crisp. I was surprised when people were calling it ugly. It looks about as good as GOW 5 IMO, which gives credit to GOW 5 considering it's from 2019, but this is still a good looking game.
Careful. Around here everyone just shits on Remnant without having played a second of it, having only watched a 480p video on their Samsung Galaxy 5 to determine it's a butt ugly game.
I see why they are getting the Pro out ASAP: as more games come out running at these resolutions, plenty of the more enthusiastic gamers are gonna read this shit and go straight to PC.
The PS5 and Series X were 100 percent just Pro consoles of Pro consoles.
> The issue is that consoles aren't running DLSS. And FSR 2.x just doesn't do well reconstructing from such a low res, especially in motion.
This game isn't even using FSR. And even for DLSS, and especially FSR 2.x, it's much more preferable to reconstruct to 4K from 1440p, or at least 1080p. Also, note they said it looks identical/on par to PC running 1440p native, while better than PC doing FSR at 1440p.
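(Putting numbers on why the source resolution matters so much for reconstruction: 720p to 1440p and 1080p to 4K both render only 25% of the output pixels per frame, but the 4K case starts from 2.25x as many absolute samples, and 1440p to 4K renders almost half. An illustrative sketch:)

```python
# Per-axis upscale factor and rendered fraction for the reconstruction pairs
# mentioned above (internal -> output). Illustrative 16:9 resolutions.
pairs = {
    "720p  -> 1440p": ((1280, 720), (2560, 1440)),
    "1080p -> 4K":    ((1920, 1080), (3840, 2160)),
    "1440p -> 4K":    ((2560, 1440), (3840, 2160)),
}

for name, ((iw, ih), (ow, oh)) in pairs.items():
    per_axis = ow / iw                # linear scale per axis
    rendered = (iw * ih) / (ow * oh)  # share of output pixels rendered per frame
    print(f"{name}: {per_axis:.2f}x per axis, {rendered:.0%} rendered")
```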
> Have you actually played any of these games recently? FF16 for instance also supposedly drops to 720p, and not once did I feel the image was blurred or had artifacts. Fear mongering is so stupid, go play the games instead of being angry at pixels.
This.
> Did they already fix it? Would be nice if they did.
The game lacks a motion blur option only on Series S, due to the weak GPU. All other versions have always had it, AFAIK.