
Doesn't the RTX "5080" look suspiciously like the late RTX 4080 12GB, the one that ended up becoming the 4070 Ti?

xVodevil

Member
More VRAM is definitely preferable, and that's already been proven in my own experience by Diablo IV and Forza Motorsport, where the games ate it all up on a 4070 Ti at 1440p with DLSS, causing all sorts of problems on a card that would otherwise do just fine if not for the low VRAM.
I'm still playing the same games regularly at 4K with DLSS Performance and they've since been fixed and run fine, but I don't trust devs at all anymore. I prefer to enjoy my games at launch with as few issues like this as possible, not months later.
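If anyone wants to check whether a game is actually hitting the VRAM ceiling rather than just reserving it, here's a quick-and-dirty way to log usage while playing. A minimal sketch, assuming an NVIDIA card with nvidia-smi on the PATH; the one-second poll interval and the sample output are just placeholders:

import subprocess
import time

# Poll dedicated VRAM usage once per second via nvidia-smi (NVIDIA cards only).
while True:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total", "--format=csv,noheader"],
        capture_output=True, text=True,
    ).stdout.strip()
    print(out)  # e.g. "11485 MiB, 12282 MiB" -- close to the cap on a 12 GB card (example values)
    time.sleep(1)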
 

SolidQ

Member
When the RTX 5070 is 12GB

 
Sure, but the current consoles don't have other RAM to fall back on, and they can't use the full amount of VRAM either due to OS constraints.

But here's the thing: sure, the PS5 Pro has 16 GB of GDDR6, but only about 12.5 GB is usable, running at 560 GB/s on AMD's tech, which isn't the greatest. It uses GDDR6 for both the OS and gaming on a PCIe 4 bus. In the future I could see the PS6 having 20 GB of GDDR7 on PCIe 5 with only 16 GB usable, which would put the 5080 in line with the PS6 on memory while still outperforming it. But by the time the PS6 arrives we'll have the 60 series, which will also be at 20 GB of GDDR7X or GDDR8 on PCIe 5, or PCIe 6 if it's established by then, though I don't expect that until the 70 series.

The 5080, on the other hand, is running 16 GB of GDDR7 at 896 GB/s in desktops that may also have 16–128 GB of DDR5, or DDR6 when it arrives, plus super fast SSDs that can use DirectStorage on a PCIe 5 bus if need be.

Not only this, but consoles are meant to be energy efficient and not draw a ton of power, so the PS5 Pro sits at around 200W, whereas my 5080 can use 400W on its own while my CPU uses around the same. It's not really apples to apples.

I still stand by the fact that the 4090 compared to the 3090 is night and day with the same amount of RAM, but people need to look at bandwidth and speed. These cards churn data in and out fast enough that less RAM may be needed to accomplish the same job.
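For anyone wondering where figures like that 896 GB/s come from, peak bandwidth is just bus width times per-pin data rate. A minimal sketch; the 5080's 256-bit / 28 Gbps GDDR7 configuration is the rumored spec, not a confirmed one:

# Peak theoretical bandwidth in GB/s = (bus width in bytes) * per-pin data rate in Gbps.
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gb_s(256, 28))  # 896.0  -- rumored RTX 5080 (GDDR7)
print(bandwidth_gb_s(384, 21))  # 1008.0 -- RTX 4090 (GDDR6X), for comparison
print(bandwidth_gb_s(320, 19))  # 760.0  -- RTX 3080 10GB (GDDR6X)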

Yep, in my case for example I've got a 3080 with 10GB of VRAM. I remember seeing a lot of comments saying it would struggle hard before long, but I've yet to find a case where VRAM was the bottleneck in any game for me. I'm playing at 3440x1440 though, not 4K.

Not saying you can't get the 3080 to struggle in certain games depending on the settings, but when aiming for higher framerates (60+) and using DLSS, my bottleneck has always been the raw power, rasterization or RT capabilities of the card rather than the VRAM.

If I had to choose between adding 20% more VRAM to the card or 10% more power for more FPS in my games, I'd go with the power any day.

That said, without knowing what tech the 5000 series will be using, 12GB for a 5070 sounds a bit scary, not gonna lie. But if I were looking at the 5080, 16GB looks decent enough, and I'd be more worried (or curious) about what kind of technology Nvidia is going to bring to the new cards instead.

Considering how many games are using UE5 or intend to, and the recent advancements there with Lumen and MegaLights, that's not really a bad option for people who can't step up to an Nvidia card with good performance and 16GB+ of VRAM.
But Lumen is still RT. There's Software Lumen, which is a low-end form of RT, and Hardware Lumen, which takes more resources, but the difference is huge. In Silent Hill 2, for example, I found enabling Hardware Lumen more worthwhile than raising the upscaling quality or setting everything else to ultra.

That might be another reason why UE5 on consoles has been so bad so far.
 

kevboard

Member
But Lumen is still RT. There's Software Lumen, which is a low-end form of RT, and Hardware Lumen, which takes more resources, but the difference is huge. In Silent Hill 2, for example, I found enabling Hardware Lumen more worthwhile than raising the upscaling quality or setting everything else to ultra.

That might be another reason why UE5 on consoles has been so bad so far.

I don't even understand the reason for Software Lumen other than tanking performance. It's so shit... it works okay in Fortnite, I guess, because it's a less realistic setting, but in anything that goes for realism it just looks awful.

Thankfully, on my 3060 Ti I can get PS5-esque performance at PS5-esque internal resolution with Hardware Lumen, so on RTX cards it's not much of an issue...

...unless the devs don't even give you the option to use Hardware Lumen, of course 😑 like in Robocop, for example.
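For games that ship without the toggle, the usual workaround is forcing it through the game's user Engine.ini. This is only a sketch assuming the title still respects the stock UE5 console variables, which isn't guaranteed for every game:

[SystemSettings]
; force the hardware ray tracing path for Lumen (only helps if the game shipped with RT support compiled in)
r.RayTracing=1
r.Lumen.HardwareRayTracing=1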
 

Bry0

Member
Yes there is

- GB205 has 6400 cores max (full chip)
- Original 4070 has 5888 cores (AD104)
- 4070 SUPER has 7168 cores (AD104)
- 4070 Ti has 7680 cores (full AD104)
Margins, baby, all hail the shareholders. Expensive node, so we gotta shrink the die area and use the bare minimum VRAM. Don't worry, Jensen will be like "wow, GDDR7 bandwidth, amazing!"

But if you want big chips and VRAM, we have a $2500 product just for you!
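Just to put those counts in perspective, here's the quick ratio math, using the numbers quoted above (the GB205 figure is still a rumor):

# Shader core counts quoted above; ratios vs. the full AD104 in the 4070 Ti.
cores = {
    "RTX 4070 (AD104)": 5888,
    "RTX 4070 SUPER (AD104)": 7168,
    "RTX 4070 Ti (full AD104)": 7680,
    "GB205 full chip (rumored)": 6400,
}
baseline = cores["RTX 4070 Ti (full AD104)"]
for name, count in cores.items():
    print(f"{name}: {count} cores ({count / baseline:.0%} of a 4070 Ti)")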
 

Bry0

Member
More VRAM is definitely preferable, and that's already been proven in my own experience by Diablo IV and Forza Motorsport, where the games ate it all up on a 4070 Ti at 1440p with DLSS, causing all sorts of problems on a card that would otherwise do just fine if not for the low VRAM.
I'm still playing the same games regularly at 4K with DLSS Performance and they've since been fixed and run fine, but I don't trust devs at all anymore. I prefer to enjoy my games at launch with as few issues like this as possible, not months later.
Yeah, I was having VRAM issues on my 4080 at launch with Diablo 4, and some muddy texture issues in RE4R at launch at 4K. Needless to say I was pretty upset by that… I think it's largely fixed now, but yeah, I'm with you there.
 