
How long can PC gaming last with current prices on graphics cards?

Xdrive05

Member
We just have to hope these old cards can limp along another year or so. My nearly 8 year old GTX 980 4GB is still hanging in there. Solutions like Nvidia Image Scaling can definitely help us through the drought.

Since you can't control the silicon market, then make the most of what you CAN control!

Buy a long-ass HDMI 2.1 cable and run it to your living room TV. Even if you can only run games at 1080p, the "comfy couch" experience is a nice way to mix things up and make it feel new again.

The Witcher 3 on my 65" Sony X900H at 1080p/120Hz is like a whole other experience compared to my desk setup.

Seriously, try it!
 

Buggy Loop

Member
I tried GeForce Now Priority with a friend’s code before I managed to score an Ampere card.

I personally think that, if online competitive games aren’t your thing and you mainly play single-player games, it’s legit a nice alternative until the GPU situation calms down.

Hell, the Nvidia Shield, which can stream GeForce Now at 4K + HDR, is legit an underdog console alternative, although with a bit more latency than the PC app (which doesn’t do 4K).

If I hadn’t scored an Ampere card, I would be using this 100%.
 
Last edited:

MikeM

Member
I’ve essentially thrown in the towel. I priced out a 12600K/6700 XT build and it was gonna run $2,800 ($1,200 + tax just for the GPU) from Newegg Canada. Yes, it would be better than my PS5 and Series X, but not ~$1,600 worth of quality better than the consoles.
Just gonna sit it out and wait.
 

//DEVIL//

Member
[photos of the build]

i5 12600K
Dell 3090 (got it for $2,200 CAD / $1,750 US, so I won’t complain - surprisingly awesome in terms of temps and noise)
3TB NVMe
Small ITX form factor, almost the size of an Xbox Series X (honestly this is probably the best case ever for an ITX build, since it’s mesh on all sides and the panels on all sides are easy to remove)
32GB DDR5-5600


Paired with a 34-inch Dell ultrawide 3840x1440 monitor at 120Hz. I am good till at least 2023, when I can upgrade to the 4000 series if it’s much better (or I'll stick with this one if it’s not).

I do not have to worry about video cards or PC parts for the rest of the year. I am done with upgrades for now, and my PC is golden for whatever Unreal Engine 5 can bring; at 2K I am 100% getting at least 60 frames at ultra settings.
 
Last edited:

Nocturno999

Member
I love how disingenuous you sound while saying that, based on stats, PC performance is on par with last-gen consoles (which is complete bullshit).

Under that logic we could say that current consoles are on par with the Wii U-and-a-half era, since more homes have a Switch than a PS5/Series X.
 

gatti-man

Member
Yeah, because the majority of GPUs are in the hands of players and not miners/scalpers...
Yes they are. Well, the majority of cards are in the hands of gamers and miners; I have no way of validating how many miners vs. gamers have cards. All kinds of at-home hobbies have exploded lately. Look up the cost of pinball machines, for example.
 

01011001

Banned
till the next generation of consoles, since the current one is only on par with, what, a 2060 Super.

Yeah, even with a 10-series card you'll be fine for a while, a 1070 Ti or 1080 being on par with the PS5/Series X versions of many games. And those cards released in 2017.

I mean, here is a GTX 1070 (not Ti) running Guardians of the Galaxy at higher settings than the PS5 performance mode (higher LOD and shadow settings):
 
Last edited:
I wouldn't read too much into the cross-gen phase. This brings back "750ti/960 being good enough" memories.
The 960 4GB was good enough to beat the base PS4 every time. The 2GB model, no.

Same with the 1050 Ti.

As someone with an RTX 3060: it has the VRAM (12GB) and the specs to keep pace with the base PS5 for the whole gen. It’s a better GPU overall than the PS5 due to RT and tensor cores, but the PS5 has higher clocks and is a closed-box system.

As long as it gets proper driver support, it’s good.
 
The 960 4GB was good enough to beat the base PS4 every time. The 2GB model, no.

Same with the 1050 Ti.

As someone with an RTX 3060: it has the VRAM (12GB) and the specs to keep pace with the base PS5 for the whole gen. It’s a better GPU overall than the PS5 due to RT and tensor cores, but the PS5 has higher clocks and is a closed-box system.

As long as it gets proper driver support, it’s good.

Meryl Streep Doubt GIF


RDR2 looks much better on PS4 than at PC low settings (even the X1 looks better, since PC low is an absolute muddy mess), and PC low is exactly where you'd be on a 960. Most modern titles are the same: games may run at low settings, but visual fidelity will be much worse.

For reference:



 
Last edited:

01011001

Banned
Meryl Streep Doubt GIF


RDR2 looks much better on PS4 than at PC low settings (even the X1 looks better, since PC low is an absolute muddy mess), and PC low is exactly where you'd be on a 960. Most modern titles are the same: games may run at low settings, but visual fidelity will be much worse.

It completely depends on the game. Here is Doom Eternal running at way higher settings than the PS4 on a GTX 960 4GB:



This is mostly ultra, textures on high, and dynamic 1080p with similar lows to the PS4's dynamic res.
If you matched PS4 settings, the resolution would most likely hold way higher.
 
Last edited:
Meryl Streep Doubt GIF


RDR2 looks much better on PS4 than at PC low settings (even the X1 looks better, since PC low is an absolute muddy mess), and PC low is exactly where you'd be on a 960. Most modern titles are the same: games may run at low settings, but visual fidelity will be much worse.

For reference:




That guy is using the wrong settings. Look at the VRAM usage; he is well below the 4GB buffer.

Actually, you can run the game at medium settings with high textures on the 4GB 960. Here’s a video of medium 1080p settings; the guy is still using medium textures because it is a 2GB model. You can use high or even ultra textures on the 4GB version.

Skip to the 8:30 mark -



The base PS4 is a low bar: 1080p 30fps with dips.

A Maxwell 960 4GB or Pascal 1050 Ti 4GB will beat a PS4 easily.
 
Last edited:
That guy is using the wrong settings. Look at the VRAM usage; he is well below the 4GB buffer.

Actually, you can run the game at medium settings with high textures on the 4GB 960. Here’s a video of medium 1080p settings; the guy is still using medium textures because it is a 2GB model. You can use high or even ultra textures on the 4GB version.

Skip to the 8:30 mark -



The base PS4 is a low bar: 1080p 30fps with dips.

A Maxwell 960 4GB or Pascal 1050 Ti 4GB will beat a PS4 easily.


I don't know, the game looks equally terrible in the video you put up. LOL
 
Last edited:
I don't know, the game looks equally terrible in the video you put up. LOL
What is worse than the PS4 to your eyes? Perhaps it’s the textures, and the PS4 uses the high setting for textures. The base PS4 does not use ultra textures. As I said, you can go higher than medium textures on a 4GB card at 1080p, maybe even ultra. The video at the timestamp I gave uses medium.

As long as you have the VRAM, higher textures cause no performance hit, so the PS4 will lose there as well.
 
What is worse than the PS4 to your eyes? Perhaps it’s the textures, and the PS4 uses the high setting for textures. The base PS4 does not use ultra textures. As I said, you can go higher than medium textures on a 4GB card at 1080p, maybe even ultra. The video at the timestamp I gave uses medium.

As long as you have the VRAM, higher textures cause no performance hit, so the PS4 will lose there as well.

I think the consoles get a weird mix of settings, with some things lower than low but others beyond high. They do a good job on the PS4 of keeping the textures sharp close to the player, where you are focusing the most.
 
I think the consoles get a weird mix of settings, with some things lower than low but others beyond high. They do a good job on the PS4 of keeping the textures sharp close to the player, where you are focusing the most.
AFAIK the PS4 Pro uses ultra textures but in a buggy manner; only the X1X has ultra textures with full detail. The base PS4 may use high textures, and PC medium and low are optimizations for low-power PCs. The PS4 may be on medium textures, but I’m just guessing; I’d need side-by-side comparisons.

Either way, on a GTX 960 4GB you can get ultra textures and 16x AF to really push past the PS4.

Hypothetically, with a good CPU and the 960, I would probably use a mix of low and medium settings plus ultra textures, which would definitely look better than the PS4 while getting a bit more performance.

I had a 1050 Ti and a good CPU and ran The Witcher 3 at low/medium settings but with ultra textures at 60fps, which looked noticeably better than the PS4, which was only at 30fps to boot.
 

Dream-Knife

Banned
[photos of the build]

i5 12600K
Dell 3090 (got it for $2,200 CAD / $1,750 US, so I won’t complain - surprisingly awesome in terms of temps and noise)
3TB NVMe
Small ITX form factor, almost the size of an Xbox Series X (honestly this is probably the best case ever for an ITX build, since it’s mesh on all sides and the panels on all sides are easy to remove)
32GB DDR5-5600


Paired with a 34-inch Dell ultrawide 3840x1440 monitor at 120Hz. I am good till at least 2023, when I can upgrade to the 4000 series if it’s much better (or I'll stick with this one if it’s not).

I do not have to worry about video cards or PC parts for the rest of the year. I am done with upgrades for now, and my PC is golden for whatever Unreal Engine 5 can bring; at 2K I am 100% getting at least 60 frames at ultra settings.
That's a neat case, what is it?

Why did you go DDR5, especially if you just upgrade every year?
 

FStubbs

Member
Yeah, even with a 10-series card you'll be fine for a while, a 1070 Ti or 1080 being on par with the PS5/Series X versions of many games. And those cards released in 2017.

I mean, here is a GTX 1070 (not Ti) running Guardians of the Galaxy at higher settings than the PS5 performance mode (higher LOD and shadow settings):

I saw a recent game that had a recommended card of a 1080. Can't remember what it was though.
 

Dream-Knife

Banned
Meryl Streep Doubt GIF


RDR2 looks much better on PS4 than at PC low settings (even the X1 looks better, since PC low is an absolute muddy mess), and PC low is exactly where you'd be on a 960. Most modern titles are the same: games may run at low settings, but visual fidelity will be much worse.

For reference:




Ultra textures at 1080p only use 3GB of VRAM.


The 960 doesn't quite have the horsepower to hold 30fps, but neither does the base PS4.

I saw a recent game that had a recommended card of a 1080. Can't remember what it was though.
Far Cry 6 at 60fps. Frankly, most major PC games released in the last year have had poor optimization, particularly the AMD-sponsored titles for some reason...
 
Last edited:

Kenpachii

Member
Meryl Streep Doubt GIF


RDR2 looks much better on PS4 than at PC low settings (even the X1 looks better, since PC low is an absolute muddy mess), and PC low is exactly where you'd be on a 960. Most modern titles are the same: games may run at low settings, but visual fidelity will be much worse.

For reference:






1) Nobody cares about RDR2; it's badly optimized on PC. A 3080 can't even hit 60fps at ultra at 1080p, there are countless issues with that game, and the online department is a shit show entirely.
2) The PS4 runs it at lower settings than a 960 can push out. If that 960 can't push the same or higher performance, the game is badly optimized, simple as that.
3) Look at the gazillion other titles anybody could link where the 960 outperforms the PS4 in every way.

This hyperfocusing on one title is beyond idiotic.

The 960 is a 2.4 TFLOP Nvidia GPU; it will outperform the PS4 easily, and with that 4GB of memory it will also load higher quality textures.
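
Rough napkin math on that TFLOP claim, assuming stock/reference clocks (ballpark numbers, not exact):

PS4 GPU: 1152 shaders x 2 ops/clock x 0.8 GHz ≈ 1.84 TFLOPS
GTX 960: 1024 shaders x 2 ops/clock x ~1.18 GHz boost ≈ 2.4 TFLOPS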
 
Last edited:

ryzen1

Member
At the moment I favor playing on the PS5 instead of my PC, due to my PC's lack of performance.
I'd really like to spend €600 on a new RTX 3070, but I'm not ready to pay scalper prices.

As long as that's the case, I'll continue to play new games on the PS5 for the time being
 

GreatnessRD

Member
[photos of the build]

i5 12600K
Dell 3090 (got it for $2,200 CAD / $1,750 US, so I won’t complain - surprisingly awesome in terms of temps and noise)
3TB NVMe
Small ITX form factor, almost the size of an Xbox Series X (honestly this is probably the best case ever for an ITX build, since it’s mesh on all sides and the panels on all sides are easy to remove)
32GB DDR5-5600


Paired with a 34-inch Dell ultrawide 3840x1440 monitor at 120Hz. I am good till at least 2023, when I can upgrade to the 4000 series if it’s much better (or I'll stick with this one if it’s not).

I do not have to worry about video cards or PC parts for the rest of the year. I am done with upgrades for now, and my PC is golden for whatever Unreal Engine 5 can bring; at 2K I am 100% getting at least 60 frames at ultra settings.
That's really a nice case. (And spec sheet you have there) I'm torn between this case and the Lian-Li Q58 as I want to build a small form factor PC for my living room soon. Or HTPC for the initiated. Those SL140's you got in the case really make the front panel 'pop'. *Firm handshake*
 
1) Nobody cares about RDR2; it's badly optimized on PC. A 3080 can't even hit 60fps at ultra at 1080p, there are countless issues with that game, and the online department is a shit show entirely.
2) The PS4 runs it at lower settings than a 960 can push out. If that 960 can't push the same or higher performance, the game is badly optimized, simple as that.
3) Look at the gazillion other titles anybody could link where the 960 outperforms the PS4 in every way.

This hyperfocusing on one title is beyond idiotic.

The 960 is a 2.4 TFLOP Nvidia GPU; it will outperform the PS4 easily, and with that 4GB of memory it will also load higher quality textures.
I wouldn’t go as far as to say the 2GB version is better than the PS4, as it’s just severely limited in VRAM (it’s better in certain older titles but will have worse textures and stutter in newer stuff), but the 4GB version definitely is. Ditto the 1050 Ti 4GB. PS4 > 2GB model.

Even with how unoptimized RDR2 is, I know the 4GB 960 will still produce better results when tweaked.

Maxwell is definitely past its expiration date at this point, but it was great for a long time last gen. Pascal is hanging in there a bit better due to driver support, but it’s almost done as well.
 
Last edited:
Yeah, even with a 10-series card you'll be fine for a while, a 1070 Ti or 1080 being on par with the PS5/Series X versions of many games. And those cards released in 2017.
Bolded underlined: yup, can confirm.

On my build, the 1070Ti can run any game from last gen and prior at 1440p with ease. Often high (in many cases ultra) settings.

I was nervous before starting Red Dead 2. Initially I ran it at 1440p (in the medium "balanced" mode) and I was getting 40-45 ish fps, which wasn't bad. I upped the visual settings to high, and I was getting 30 fps pretty much all the time. Don't know if it's the game or if I don't have much experience with tweaking settings, but the 30fps felt choppy as fuck. So I then brought the resolution down to 1080p, and cranked up most of the visual settings to high (and many of the less taxing ones to ultra).

So I found out that a 1070Ti can do Red Dead 2 at 1080p in very high settings and give you... Between 60 and 70 fps at all times. It's fucking glorious. The day my 1070Ti kicks the bucket, I'm gonna cry 😂
 

//DEVIL//

Member
That's really a nice case. (And spec sheet you have there) I'm torn between this case and the Lian-Li Q58 as I want to build a small form factor PC for my living room soon. Or HTPC for the initiated. Those SL140's you got in the case really make the front panel 'pop'. *Firm handshake*
Good eye to notice they are 140s and not 120s lol.

Yeah, I replaced the MSI MAG 280 fans with these. They're classier. I hate it when the whole fan is RGB; it makes the build feel like a toy.

The case you mentioned is super nice too, but I kinda wanted something that has the shape of the Xbox. I didn’t want something like a small/mini regular case.

Besides, beggars can’t be choosers lol, because I got this case used for $100, which included a PCIe 4 riser. Otherwise I would have at least gone with a white case to match my Alienware monitor lol.
 
Last edited:

Dream-Knife

Banned
1) Nobody cares about RDR2; it's badly optimized on PC. A 3080 can't even hit 60fps at ultra at 1080p, there are countless issues with that game, and the online department is a shit show entirely.
2) The PS4 runs it at lower settings than a 960 can push out. If that 960 can't push the same or higher performance, the game is badly optimized, simple as that.
3) Look at the gazillion other titles anybody could link where the 960 outperforms the PS4 in every way.

This hyperfocusing on one title is beyond idiotic.

The 960 is a 2.4 TFLOP Nvidia GPU; it will outperform the PS4 easily, and with that 4GB of memory it will also load higher quality textures.
What's the split on PS4 RAM? Does it use up that 6-7GB mostly on the CPU? I would have assumed a PS4 could hang with a 4GB card, aside from actual performance.

Never played RDR2, but what is your utilization like?
 

Kenpachii

Member
I wouldn’t go as far as to say the 2GB version is better than the PS4, as it’s just severely limited in VRAM (it’s better in certain older titles but will have worse textures and stutter in newer stuff), but the 4GB version definitely is. Ditto the 1050 Ti 4GB. PS4 > 2GB model.

Even with how unoptimized RDR2 is, I know the 4GB 960 will still produce better results when tweaked.

Maxwell is definitely past its expiration date at this point, but it was great for a long time last gen. Pascal is hanging in there a bit better due to driver support, but it’s almost done as well.

2GB was the bare minimum in 2014 already; I know this because my 580 1.5GB didn't run AC Unity / Watch Dogs for shit. So anybody buying a card in 2014 with 2GB of VRAM, expecting it to last a long time right at the start of a generation, was asking for issues. This is why 4GB was the focus, and 4GB cards already existed all the way back in the 600 series, even if they weren't common. It was also why the 770 was the best solution to go for, and the 780 was kinda shit value with its 3GB memory pool, which resulted in people buying 970s or 290s.

RDR2 uses higher base settings than the consoles; the consoles straight up use lower-than-low settings. You can lower all settings to potato level in the config files; what you see in the settings menu isn't the end-all. Since the game runs on a low-level API there isn't much overhead either, so comparing GPU statistics against each other is a good indication of what a card can do, unless a game's optimization is terrible.
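
For reference, the graphics settings live in a plain XML file (system.xml under the game's Settings folder, if I remember right), so you can hand-edit entries like these to whatever level you want - treat the exact keys and values as a sketch from memory, not gospel:

<shadowSoftShadows>kSettingLevel_Low</shadowSoftShadows>
<volumetricsRaymarchQuality>kSettingLevel_Low</volumetricsRaymarchQuality>
<particleLightingQuality>kSettingLevel_Low</particleLightingQuality>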

Maxwell is fully supported; while performance-wise it's starting to show its age in newer titles, it's still very much relevant, since it sits on the 1050 (960), 1060 (980), 1070 (980 Ti) tier and games still optimize for it perfectly fine. It plays 99% of all PC games without issues.

But yeah, I agree Maxwell is on its last legs.

What's the split on PS4 RAM? Does it use up that 6-7GB mostly on the CPU? I would have assumed a PS4 could hang with a 4GB card, aside from actual performance.

Never played RDR2, but what is your utilization like?

I think it has 8GB of memory with 3-4GB dedicated as VRAM for games; I believe Spider-Man budgeted 3.5GB. Early in the generation we saw 2GB allocations because high-end PC hardware didn't have much more than 2GB to allocate. VRAM usage went up pretty severely on PC; it was already a problem when Skyrim came around.

No clue anymore about utilization; I don't have the game installed and don't have the space to reinstall it currently, so I can't really check. The only benchmark I have on my drive is a 3440x1440 run, and that mightily struggled with scaled-down settings + DLSS on (aka no MSAA) to hit a stable 60fps.

The game is a GPU resource hog.
 
Last edited:
2GB was the bare minimum in 2014 already; I know this because my 580 1.5GB didn't run AC Unity / Watch Dogs for shit. So anybody buying a card in 2014 with 2GB of VRAM, expecting it to last a long time right at the start of a generation, was asking for issues. This is why 4GB was the focus, and 4GB cards already existed all the way back in the 600 series, even if they weren't common. It was also why the 770 was the best solution to go for, and the 780 was kinda shit value with its 3GB memory pool, which resulted in people buying 970s or 290s.

RDR2 uses higher base settings than the consoles; the consoles straight up use lower-than-low settings. You can lower all settings to potato level in the config files; what you see in the settings menu isn't the end-all. Since the game runs on a low-level API there isn't much overhead either, so comparing GPU statistics against each other is a good indication of what a card can do, unless a game's optimization is terrible.

Maxwell is fully supported; while performance-wise it's starting to show its age in newer titles, it's still very much relevant, since it sits on the 1050 (960), 1060 (980), 1070 (980 Ti) tier and games still optimize for it perfectly fine. It plays 99% of all PC games without issues.

But yeah, I agree Maxwell is on its last legs.



I think it has 8GB of memory with 3-4GB dedicated as VRAM for games; I believe Spider-Man budgeted 3.5GB. Early in the generation we saw 2GB allocations because high-end PC hardware didn't have much more than 2GB to allocate. VRAM usage went up pretty severely on PC; it was already a problem when Skyrim came around.

No clue anymore about utilization; I don't have the game installed and don't have the space to reinstall it currently, so I can't really check. The only benchmark I have on my drive is a 3440x1440 run, and that mightily struggled with scaled-down settings + DLSS on (aka no MSAA) to hit a stable 60fps.

The game is a GPU resource hog.
There’s no way PS4 textures are lower than low, but I can believe there are other settings below low for RDR2.

The 970 is a card that just refuses to be irrelevant. I mean, wow, that thing was a king-tier purchase when it came out; you got your money’s worth on that one. 3.5GB of VRAM be damned, it can still run a lot of games.
 

Kenpachii

Member
There’s no way PS4 textures are lower than low, but I can believe there are other settings below low for RDR2.

The 970 is a card that just refuses to be irrelevant. I mean, wow, that thing was a king-tier purchase when it came out; you got your money’s worth on that one. 3.5GB of VRAM be damned, it can still run a lot of games.

Lower than low is not about the textures, it's about other settings; even the Xbox One X runs settings lower than low on PC. You can check the DF coverage.

Yeah, the 970 was a god-tier product really. I replaced it for emulation two years ago; my buddy replaced his three months ago when his PC died and bought a 3070 for way too much money, otherwise he would probably still be sitting on it. That card was legendary.
 

rofif

Can’t Git Gud
The average selling price of a PS5 on eBay is down to $650-700, FYI. This narrative doesn't really hold water anymore.
Here in Poland, the PS5 has been freely available since Christmas at MediaMarkt and similar stores.
BUT ONLY in fucking bundles. Still not bad: a €650 bundle with a physical PS5, two games, and a year of PS+.
 

GreatnessRD

Member
Good eye to notice they are 140s and not 120s lol.

Yeah, I replaced the MSI MAG 280 fans with these. They're classier. I hate it when the whole fan is RGB; it makes the build feel like a toy.

The case you mentioned is super nice too, but I kinda wanted something that has the shape of the Xbox. I didn’t want something like a small/mini regular case.

Besides, beggars can’t be choosers lol, because I got this case used for $100, which included a PCIe 4 riser. Otherwise I would have at least gone with a white case to match my Alienware monitor lol.
Yeah, I got 4x 140s and 3x 120s in my current case (Be Quiet! Silent Base 802, lol). Such a clean fan. And it's crazy, because I didn't use to like the ring fans. I was with the circle, colorful toy fans, but them SLs called to me, lol.

And you got a heck of a deal for $100 with a Gen 4 riser cable. I ain't mad atcha. Enjoy.
 

DragonNCM

Member
Games will scale better; they can't afford to make games only for new GPUs.
This whole generation of console games will be reflected in PC games as well. Developers aren't crazy enough to lose money for some shiny new GPU features.
For the next five years we will have the same graphics.
 

Griffon

Member
Not a single popular game will require the new GPU features. As long as high-end prices are unreasonable for normal people, devs will adapt and scale their games accordingly.
 
Last edited:

Skifi28

Member
Fingers crossed my 1060 can last a few more years until things get a little better; otherwise there's not much PC gaming for me with my spare 8800 GT.
 

Petopia

Banned
[photos of the build]

i5 12600K
Dell 3090 (got it for $2,200 CAD / $1,750 US, so I won’t complain - surprisingly awesome in terms of temps and noise)
3TB NVMe
Small ITX form factor, almost the size of an Xbox Series X (honestly this is probably the best case ever for an ITX build, since it’s mesh on all sides and the panels on all sides are easy to remove)
32GB DDR5-5600


Paired with a 34-inch Dell ultrawide 3840x1440 monitor at 120Hz. I am good till at least 2023, when I can upgrade to the 4000 series if it’s much better (or I'll stick with this one if it’s not).

I do not have to worry about video cards or PC parts for the rest of the year. I am done with upgrades for now, and my PC is golden for whatever Unreal Engine 5 can bring; at 2K I am 100% getting at least 60 frames at ultra settings.
Rocking them Lian Li Uni 140mm fans, huh.
 

Petopia

Banned
If y'all think it's expensive, might as well pay the piper, since it's gonna be a while till prices settle down again.
 
1) Nobody cares about RDR2; it's badly optimized on PC. A 3080 can't even hit 60fps at ultra at 1080p, there are countless issues with that game, and the online department is a shit show entirely.
2) The PS4 runs it at lower settings than a 960 can push out. If that 960 can't push the same or higher performance, the game is badly optimized, simple as that.
3) Look at the gazillion other titles anybody could link where the 960 outperforms the PS4 in every way.

This hyperfocusing on one title is beyond idiotic.

The 960 is a 2.4 TFLOP Nvidia GPU; it will outperform the PS4 easily, and with that 4GB of memory it will also load higher quality textures.

I think you overlook the amount of optimization that goes into the console builds; this tends to let the consoles punch above their weight. There is a lot more than just textures that differs between PS4 and PC low (and just bumping textures up on PC doesn't fix that). The foliage and ground cover look particularly bad on PC low, and objects also look flatter in most cases.
 

Vick

Member
Personally, I'm kind of fucked. Unless things change relatively soon, to play most games at their best I would be forced to buy a Series X in addition to a PS5, as upgrading is simply out of the question for the time being.
Because even if I were willing to pay for a 3080, it would be more of a short-term investment, as it would certainly not be sufficient to play real next-gen games (and not the cross-gen stuff we're surrounded with) at ultra/4K/60fps, which is the only reason I would spend all that money to begin with.

I was looking at the GoW situation and we're sort of there already at ultra 4K 60fps... and ultra shadows are the only real, super tangible upgrade over the already-60fps, checkerboard-4K GoW you can play on PS5.

What is worse than the PS4 to your eyes? Perhaps it’s the textures, and the PS4 uses the high setting for textures. The base PS4 does not use ultra textures.
The base PS4 does use ultra textures, and they look much better than in that video.

1) Nobody cares about RDR2; it's badly optimized on PC. A 3080 can't even hit 60fps at ultra at 1080p, there are countless issues with that game, and the online department is a shit show entirely.
It's still by far the best, most polished, most impressive, cohesive, and consistent open-world game. Nothing comes close to RDR2 v1.00.

The only real "lower than low" settings, other than water physics, are the reflections, and it's perfectly clear why when you look at how absolutely awful they look maxed out on PC.

On consoles, SSR's only purpose was to enrich the perfectly placed cubemaps (tons of them, everywhere). Their use is extremely limited and tastefully added, as you never really notice them. In short, reflections on consoles work as they should: they never draw attention to themselves, and never appear/disappear all of a sudden, because all they do is add (soft) detail to the cubemaps or give some life to window glass.

On PC, the higher reflection settings are a fucking mess. They look super blocky and ugly and glitchy because they were never meant to be seen that way, and they simply destroy any kind of immersion whenever they are on screen.

2) The PS4 runs it at lower settings than a 960 can push out. If that 960 can't push the same or higher performance, the game is badly optimized, simple as that.
Or, more likely, the game was optimized to the max for the closed systems it released on, and therefore can't simply be translated into those terms.

You said nobody cares about RDR2 since it's badly optimized on PC, so please stop spreading asinine things like "the game would look better on a 960 than it does on a PS4."

On consoles, v1.00 uses ultra textures, ultra geometry, ultra draw distances, better ambient occlusion than any setting on PC, additional light sources, better facial hair, and so on.

I'm really not sure what the hell is up with the AO in v1.00, but it's by far the best AO I've ever seen that's not RT.
Every version after that just implements a regular kind of SSAO, totally missing the pre-rendered look the AO has in version 1.00, and this includes the PC settings.
It's a whole different beast, no way around it, and while there unfortunately aren't many comparisons using v1.00 out there, an example can be seen on Arthur's face here.

[GIF: AO comparison on Arthur's face]


And resolution aside, the most notable differences between Pro and base are these:

Pro:
<mirrorQuality>kSettingLevel_High</mirrorQuality>
<shadowSoftShadows>kSettingLevel_Medium</shadowSoftShadows>
<particleLightingQuality>kSettingLevel_High</particleLightingQuality>
<volumetricsRaymarchQuality>kSettingLevel_High</volumetricsRaymarchQuality>

Base:
<mirrorQuality>kSettingLevel_Medium</mirrorQuality>
<shadowSoftShadows>kSettingLevel_High</shadowSoftShadows>
<particleLightingQuality>kSettingLevel_Medium</particleLightingQuality>
<volumetricsRaymarchQuality>kSettingLevel_Medium</volumetricsRaymarchQuality>

Of course the PC version is still recommended for 60fps (although not perfect, as some animations are still 30fps), water physics, long shadows, foliage resolution, and distant objects' texture resolution... but if there were a way to play 1.00 on Series X at 60fps, it would without a doubt be the best version of the game.
 
Vick Digital Foundry's John specifically said the X1X version and the PS4 Pro version had higher-res textures than the base consoles, which means the base PS4 does not use ultra textures.

It is weird how other aspects of the visuals got downgraded after the 1.0 disc release…
 