
Who got a 5090/5080? Post in this thread

Yeah, the current humongous die size doesn't help; it's such a small upgrade compared to the 4xxx series, and on top of that there's more power consumption and more heat. I haven't been this unexcited about a GPU launch in years. But they've nailed it on the software side at least, thanks to the new upscaling models.

The 5080 definitely reminds me of the "unlaunched" 12GB 4080 from a couple of years ago.
The RTX5080 certainly reminds me of the 'infamous' 12GB 4080 (aka the 4070ti 12GB), but this time around Nvidia has not announced a 20% better card above the RTX5080, so people will not pressure them to change the name. However, Nvidia may be planning to release a true RTX5080 (with more shader cores and VRAM) in 8-12 months' time, and they will call it either a 5080ti or a 5080 super.
 

Hoddi

Member
The comparison with the RTX3070 is a very bad one. That card was released at the same time as the PS5 and had less VRAM than the console. It was a recipe for catastrophe 😀. The RTX3070 was VRAM limited from day one.

The RTX5080 should have more than 16GB considering its price (VRAM is relatively cheap), because when PS6 ports start releasing on PC even 16GB may not be enough to match the PS6 settings, but right now 16GB of VRAM is still plenty. Most games use between 9-12GB of VRAM even at 4K. There are very few PT games that can allocate more than 16GB of VRAM at 4K native (PT increases VRAM usage a lot), but not even the RTX5090 with 32GB has the power to run Cyberpunk or Indiana Jones with PT at 4K native at 60fps. You need to use DLSS in these extremely demanding PT games regardless of what GPU you have, and then even 16GB of VRAM is enough.

With the recent update to Alan Wake 2, mega geometry decreased VRAM allocation by about 1.5GB. I see around 11-12GB of VRAM allocation (and that's not even usage) at 4K DLSS Balanced with PT, but let's pretend that 16GB of VRAM is a real problem.
There are 5160x2160 monitors coming out this year where this card is already out of memory in games like Rift Apart.

DLAA
DLSS
DLSS + framegen

The last shot clearly shows the card out of memory. The wattage collapses while PCIe bus load goes through the roof.
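For anyone wanting to check that signature themselves: power draw and PCIe traffic can be sampled with `nvidia-smi` or NVML, and a rough rule of thumb looks like the sketch below. The thresholds are illustrative guesses on my part, not anything NVIDIA documents:

```python
def looks_vram_limited(power_draw_w: float, board_power_w: float,
                       pcie_rx_mb_s: float) -> bool:
    """Heuristic: when a game spills out of VRAM, the GPU sits idle waiting
    on PCIe transfers, so board power collapses while PCIe RX traffic spikes.
    Thresholds are illustrative guesses, not calibrated values."""
    power_collapsed = power_draw_w < 0.6 * board_power_w   # well under TGP
    bus_saturated = pcie_rx_mb_s > 8000                    # sustained heavy RX
    return power_collapsed and bus_saturated

# Made-up sample readings: a healthy run vs. an out-of-memory run.
healthy = looks_vram_limited(power_draw_w=320, board_power_w=360, pcie_rx_mb_s=900)
thrashing = looks_vram_limited(power_draw_w=180, board_power_w=360, pcie_rx_mb_s=12000)
```

In practice you'd feed this from something like `nvidia-smi --query-gpu=power.draw --format=csv -l 1` plus a PCIe counter from your monitoring tool of choice.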
 
There are 5160x2160 monitors coming out this year where this card is already out of memory in games like Rift Apart.

DLAA
DLSS
DLSS + framegen

The last shot clearly shows the card out of memory. The wattage collapses while PCIe bus load goes through the roof.
There are even 8K monitors, and games can use up to 30GB of VRAM at this resolution, so if you are that concerned about VRAM usage, the RTX5090 is the only card for you.



16GB of VRAM is enough for popular resolutions, but at 5K and above, more VRAM would definitely help.

Personally, I like to look at things from a practical point of view. VRAM is only a real problem for me when it REALLY affects my experience. For example, if textures look extremely blurry, or the game stutters even below console settings, then I can say that VRAM is really affecting my experience.

I played R&C Rift Apart at 4K DLSSQ + FG + RT and I saw around 14GB of VRAM allocation. I think with the updated FG dll, VRAM requirements should be even lower now. Maybe at 5K the game would finally hit VRAM limits at maxed-out settings, but then I could adjust the settings to stay within the VRAM budget. Often there's no noticeable difference between the very high and high texture streaming pool, so I think these tweaks wouldn't even affect my experience. Sometimes I even turn off certain settings if they make the game look worse; for example, in Ratchet and Clank, RT shadows look worse to me compared to raster shadows.



The RTX5080 should offer more VRAM, because people who buy this $1000-1200 card will probably want to use it for a couple of years, and it's always better to have more VRAM than to worry about it. 16GB of VRAM can be a REAL problem when PS6 ports start coming out. We will be playing PS5 ports for at least the next 3 years, and I think 16GB might still be enough to play the first wave of PS6 ports with pretty good texture settings. IMO people who bought 16GB GPUs still have about 5 years of very good gaming experience ahead on such cards, and for someone not willing to compromise (using DLSS or lower settings), the RTX5090 is the only choice.
 

moogman

Member
The RTX5080 certainly reminds me of the 'infamous' 12GB 4080 (aka the 4070ti 12GB), but this time around Nvidia has not announced a 20% better card above the RTX5080, so people will not pressure them to change the name. However, Nvidia may be planning to release a true RTX5080 (with more shader cores and VRAM) in 8-12 months' time, and they will call it either a 5080ti or a 5080 super.
That's a certainty considering how weirdly underclocked the card is. The only reason for doing that is to leave space for something around or just over 4090 performance in the line-up.
 

analog_future

Resident Crybaby
The RTX5080 certainly reminds me of the 'infamous' 12GB 4080 (aka the 4070ti 12GB), but this time around Nvidia has not announced a 20% better card above the RTX5080, so people will not pressure them to change the name. However, Nvidia may be planning to release a true RTX5080 (with more shader cores and VRAM) in 8-12 months' time, and they will call it either a 5080ti or a 5080 super.

This is 100% it. 5080Ti w/ same performance and VRAM as the 4090.
 
Just FYI, if anyone is looking for a prebuilt with a 5080 or 5090, Best Buy still has them coming in and out of stock as of today. Two friends of mine got PCs just today, one of them a 5090 prebuilt.
 

Hoddi

Member
There are even 8K monitors, and games can use up to 30GB of VRAM at this resolution, so if you are that concerned about VRAM usage, the RTX5090 is the only card for you.
16GB of VRAM is enough for popular resolutions, but at 5K and above, more VRAM would definitely help.
5160x2160 is not the same as 5K or 8K. It's a 4K monitor with an ultrawide aspect ratio (aka 'ultrawide 2160p'), and I don't think it's crazy to expect a bleeding-edge $1000-$1400 GPU to have enough memory for that.

Brushing these issues away is exactly why nvidia always gets away with these antics. This isn't the first or second or third time this has happened.
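For scale, here's the raw pixel arithmetic behind the resolutions being argued about (a back-of-the-envelope sketch; actual VRAM use depends on render targets and assets, not just the framebuffer):

```python
# Raw pixel counts for the resolutions in this thread.
resolutions = {
    "4K (3840x2160)":        (3840, 2160),
    "ultrawide (5160x2160)": (5160, 2160),
    "5K (5120x2880)":        (5120, 2880),
    "8K (7680x4320)":        (7680, 4320),
}

base = 3840 * 2160  # standard 4K as the reference point
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px / 1e6:.1f} MP, {px / base:.2f}x the pixels of 4K")
```

Ultrawide 2160p is only about a third more pixels than standard 4K, well short of 5K (~1.78x) and nowhere near 8K (4x).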
 
5160x2160 is not the same as 5K or 8K. It's a 4K monitor with an ultrawide aspect ratio (aka 'ultrawide 2160p'), and I don't think it's crazy to expect a bleeding-edge $1000-$1400 GPU to have enough memory for that.

Brushing these issues away is exactly why nvidia always gets away with these antics. This isn't the first or second or third time this has happened.
Dude, I've been gaming on PC since 1999, and apart from TITAN cards, every other high-end GPU was VRAM limited in some games, especially at extremely high resolutions. Nvidia built the RTX5090 for people like you who have such extremely high expectations. If you want to play games at extremely high resolutions and are not willing to compromise (adjust some settings), you will have to pay a premium, simple as that.
 

StereoVsn

Gold Member
A 5080 would have been worth buying if they had used a 384-bit bus and 24GB of VRAM. The way it is now, definitely not worth it. I'd wait some months for Nvidia to fill the enormous gap between the 5080 and 5090.
The issue is that this is the full GB203 chip. So we may get more VRAM with the upcoming 3GB Samsung chips, but unless Nvidia is willing to produce a cut-down GB202 we won't get much more performance.
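The bus-width arithmetic behind that: each GDDR module sits on a 32-bit slice of the memory bus, so total capacity is fixed by bus width times module density. A quick sketch (the 3GB figure assumes the denser GDDR7 modules Samsung has announced):

```python
def vram_capacity_gb(bus_width_bits: int, module_gb: int) -> int:
    """Each GDDR module occupies a 32-bit memory controller slice,
    so the module count is dictated by the bus width."""
    modules = bus_width_bits // 32
    return modules * module_gb

# GB203's 256-bit bus with today's 2GB modules vs. 3GB modules:
print(vram_capacity_gb(256, 2))  # 16 (current RTX 5080)
print(vram_capacity_gb(256, 3))  # 24 (possible with 3GB GDDR7)
print(vram_capacity_gb(512, 2))  # 32 (RTX 5090's 512-bit GB202)
```

So a 24GB card on the same full GB203 die needs nothing but the denser modules, which is why a 5080 Super/Ti refresh is plausible without a new chip.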
 

MAX PAYMENT

Member
I thought I'd look into getting a used 4090, given how unavailable the 5090 is and that the performance difference isn't particularly large... then I saw used 4090s selling for $2,600+.

Wtf is going on?
 

Ulysses 31

Member
I thought I'd look into getting a used 4090, given how unavailable the 5090 is and that the performance difference isn't particularly large... then I saw used 4090s selling for $2,600+.

Wtf is going on?
Was it the Founders Edition? Those seem to have gone up more in value than the AIB ones. :messenger_winking_tongue:
 
There are 5160x2160 monitors coming out this year where this card is already out of memory in games like Rift Apart.

DLAA
DLSS
DLSS + framegen

The last shot clearly shows the card out of memory. The wattage collapses while PCIe bus load goes through the roof.

Wait, really? Which monitors? I'd love to upgrade my G9 OLED to a newer version with that resolution.

Assuming I can get my hands on a 5090 that is.
 

peish

Member
Wait, really? Which monitors? I'd love to upgrade my G9 OLED to a newer version with that resolution.

Assuming I can get my hands on a 5090 that is.

Got this on preorder for April; can't risk another RTX 5000 situation.

 

MarkyG

Member
My 4090 is more than catering to my needs until the next offering. Zero interest in the 5000 series.
 
My 5090 was supposed to be delivered this week; now I've been told they don't know when it'll be back in stock.

Motherfuckers you took my money 4 days ago. Fuck I'm so annoyed.
 

JohnnyFootball

GerAlt-Right. Ciriously.
Very neat, but this is my absolute favourite.


It looks nice, but I'd be EXTREMELY uncomfortable blowing all that hot GPU air onto the motherboard. The 5090 is one hot devil. The Fractal Ridge basically gives the air a free pass through, with no effect on other components. I'd be OK doing that with a 5080, but no way on a 5090.
 

JohnnyFootball

GerAlt-Right. Ciriously.
My 4090 is more than catering to my needs until the next offering. Zero interest in the 5000 series.
If a sub-$2300 model falls into my lap, then sure, I'll upgrade, since selling my 4090 will pay off most of it if demand remains high.

That’s a big if, but my 4090 is great.
 

Hohenheim

Member
My 5090 was supposed to be delivered this week; now I've been told they don't know when it'll be back in stock.

Motherfuckers you took my money 4 days ago. Fuck I'm so annoyed.
That sucks! Which card was it? I'm kinda scared that this will happen to me too.
It says it will arrive February 13th, but I'm not holding my breath.
The good thing is that I've seen a few folks getting the same card over the last few days (MSI 5090 Gaming Trio), so at least it exists out there.
 
That sucks! Which card was it? I'm kinda scared that this will happen to me too.
It says it will arrive February 13th, but I'm not holding my breath.
The good thing is that I've seen a few folks getting the same card over the last few days (MSI 5090 Gaming Trio), so at least it exists out there.

Gigabyte Windforce. The only one left I could get after pissing around trying to get the FE.
 
Oh man, WTF shall I buy instead?

I have everything on hand, the only part I need is a GPU

Same here. I have a 4080 Super I can return by the end of the month, but still.

I'll try to get a used 4090 and wait it out. Or a 5080 once the $900 models are available. But that may take a while.

As others have said, I 100% believe a 5080 Ti with more VRAM and maybe some factory OC/OV is in the works.

The current 5080 is laughable. It's just a 4080 Super with some fancy (but still useless) features.
 

Moses85

Member
Same here. I have a 4080 Super I can return by the end of the month, but still.

I'll try to get a used 4090 and wait it out. Or a 5080 once the $900 models are available. But that may take a while.

As others have said, I 100% believe a 5080 Ti with more VRAM and maybe some factory OC/OV is in the works.

The current 5080 is laughable. It's just a 4080 Super with some fancy (but still useless) features.
I tried the same, but pre-owned 4090s go for up to €2,000.
 

eNT1TY

Member
Well, someone lucked out at B&H Photo with the Asus TUF 5090 I decided not to pick up after all; I couldn't justify the price in the end. I FOMO'd it while out of town, for in-store pickup after returning from vacation. If anything, I'll stick to my original plan and get a 5070 Ti when they release. I need to stop doing shit like this. I did it with the PS5 Pro as well.
 

analog_future

Resident Crybaby
Well, someone lucked out at B&H Photo with the Asus TUF 5090 I decided not to pick up after all; I couldn't justify the price in the end. I FOMO'd it while out of town, for in-store pickup after returning from vacation. If anything, I'll stick to my original plan and get a 5070 Ti when they release. I need to stop doing shit like this. I did it with the PS5 Pro as well.

How'd you even get an order in at B&H? I was under the impression that they hadn't even made anything available for sale there yet.
 

simpatico

Member
Does anyone know where I can get a technical explanation for the newish trend of games being very VRAM heavy? It seemed to come along at the same time as the storage thing. From a layman's perspective I'd wonder if they're tied together? I can think of a lot of games with absolutely glorious, high-detail texture work that don't come close to cracking 16GB at 4K.

This is the most expensive graphical leap we've ever had. Red Dead 2, Arkham Knight and Uncharted 4, compared to STALKER 2, Silent Hill 2R and UE5 in general, make zero sense. Yeah, it looks better, but the cost in VRAM and flops is so fucking high for the bump we got. I'd love it if someone found a way to chart this visually.
 

G-DannY

Member
Does anyone know where I can get a technical explanation for the newish trend of games being very VRAM heavy? It seemed to come along at the same time as the storage thing. From a layman's perspective I'd wonder if they're tied together? I can think of a lot of games with absolutely glorious, high-detail texture work that don't come close to cracking 16GB at 4K.

This is the most expensive graphical leap we've ever had. Red Dead 2, Arkham Knight and Uncharted 4, compared to STALKER 2, Silent Hill 2R and UE5 in general, make zero sense. Yeah, it looks better, but the cost in VRAM and flops is so fucking high for the bump we got. I'd love it if someone found a way to chart this visually.
Poor and lazy optimization, supported by upscaling and framegen shenanigans.
 