lol fuck no
The RTX5080 certainly reminds me of the 'infamous' 12GB 4080 (aka 4070ti 12GB), but this time around Nvidia have not announced a 20% better card compared to the RTX5080, so people will not pressure them to change the names. However, Nvidia may be planning to release a true RTX5080 (with more shader cores and VRAM) in 8-12 months' time and they will call it either 5080ti or 5080 super.
Yeah, the current humongous die size doesn't help, it's such a small upgrade compared to the 4xxx series. And on top of that, more power consumption and more heat. Least excited I've been for a GPU launch in many years. But they've nailed it on the software side at least, thanks to the new upscaling models.
The 5080 definitely reminds me of the "unlaunched" 12GB 4080 from a couple of years ago.
There are 5160x2160 monitors coming out this year where this card is already out of memory in games like Rift Apart.
The comparison with the RTX3070 is very bad. This card was released at the same time as the PS5 and had less VRAM than that console. It was a recipe for catastrophe. The RTX3070 was VRAM limited from day zero.
The RTX5080 should have more than 16GB considering its price (VRAM is relatively cheap), because when PS6 ports start releasing on PC even 16GB may not be enough to match the PS6 settings, but right now 16GB of VRAM is still plenty. Most games use between 9-12GB of VRAM even at 4K. There are very few PT games that can allocate more than 16GB of VRAM at 4K native (PT increases VRAM usage a lot), but not even the RTX5090 with 32GB of VRAM has the power to run Cyberpunk or Indiana Jones with PT at 4K native at 60fps. You need to use DLSS in these extremely demanding PT games regardless of what GPU you have, and then even 16GB of VRAM is enough.
With the recent update to Alan Wake 2, Mega Geometry decreased VRAM allocation by about 1.5GB. I see around 11-12GB of VRAM allocation (and that's not even usage) at 4K DLSS Balanced with PT, but let's pretend that 16GB of VRAM is a real problem.
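For anyone who wants to sanity-check the allocation-vs-usage point themselves, here's a rough sketch using the nvidia-ml-py bindings (imported as pynvml); the caveats are mine: this reports driver-side allocation, not what a game actually needs, and the per-process figure can come back empty on Windows/WDDM.

```python
# Rough VRAM-allocation peek via NVML (pip install nvidia-ml-py).
# Shows what the driver reports as allocated, which is not the same
# as what a game truly needs frame to frame.
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

mem = pynvml.nvmlDeviceGetMemoryInfo(gpu)
print(f"Device total: {mem.total / 2**30:.1f} GiB, allocated: {mem.used / 2**30:.1f} GiB")

# Per-process allocation (may be unavailable on Windows/WDDM, hence the None check).
for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(gpu):
    used = proc.usedGpuMemory
    label = f"{used / 2**30:.1f} GiB" if used is not None else "n/a"
    print(f"pid {proc.pid}: {label}")

pynvml.nvmlShutdown()
```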
Not yet
There are even 8K monitors, and games can use up to 30GB of VRAM at this resolution, so if you are that concerned about VRAM usage, the RTX5090 is the only card for you.
There are 5160x2160 monitors coming out this year where this card is already out of memory in games like Rift Apart.
DLAA
DLSS
DLSS + framegen
The last shot clearly shows the card out of memory. The wattage collapses while PCIe bus load goes through the roof.
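For anyone who wants to watch for that signature themselves, here's a rough monitoring sketch with pynvml (my own illustration, not from the video): the assumption is that a VRAM overflow shows up as board power sagging while PCIe traffic spikes, because resources start spilling to system RAM over the bus.

```python
# Crude VRAM-pressure watcher (pip install nvidia-ml-py).
# Prints board power, PCIe traffic and VRAM fill once per second; the
# out-of-memory signature described above would be power dropping while
# PCIe RX/TX climbs sharply.
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

try:
    while True:
        power_w = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000  # mW -> W
        rx_mb = pynvml.nvmlDeviceGetPcieThroughput(gpu, pynvml.NVML_PCIE_UTIL_RX_BYTES) / 1024  # KB/s -> MB/s
        tx_mb = pynvml.nvmlDeviceGetPcieThroughput(gpu, pynvml.NVML_PCIE_UTIL_TX_BYTES) / 1024
        mem = pynvml.nvmlDeviceGetMemoryInfo(gpu)
        print(f"{power_w:6.1f} W | PCIe RX {rx_mb:7.1f} MB/s TX {tx_mb:7.1f} MB/s | "
              f"VRAM {mem.used / 2**30:5.1f}/{mem.total / 2**30:.1f} GiB")
        time.sleep(1)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```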
That's a certainty considering how weirdly underclocked the card is. The only reason for doing that is to leave space for something around or just over 4090 performance in the line-up.
The RTX5080 certainly reminds me of the 'infamous' 12GB 4080 (aka 4070ti 12GB), but this time around Nvidia have not announced a 20% better card compared to the RTX5080, so people will not pressure them to change the names. However, Nvidia may be planning to release a true RTX5080 (with more shader cores and VRAM) in 8-12 months' time and they will call it either 5080ti or 5080 super.
5160x2160 is neither the same as 5k or 8k. It's a 4k monitor with an ultrawide aspect ratio (aka 'ultrawide 2160p') and I don't think it's crazy to expect a bleeding edge $1000-$1400 GPU to have enough memory for that.
There are even 8K monitors, and games can use up to 30GB of VRAM at this resolution, so if you are that concerned about VRAM usage, the RTX5090 is the only card for you.
16GB of VRAM is enough for popular resolutions, but at 5K and above, more VRAM would definitely help.
Dude, I've been gaming on PC since 1999 and apart from TITAN cards, every other high-end GPU was VRAM limited in some games, especially at extremely high resolutions. Nvidia built the RTX5090 for people like you, who have such extremely high expectations. If you want to play games at extremely high resolutions and are not willing to compromise (adjust some settings), you will have to pay a premium, simple as that.
5160x2160 is neither the same as 5k or 8k. It's a 4k monitor with an ultrawide aspect ratio (aka 'ultrawide 2160p') and I don't think it's crazy to expect a bleeding edge $1000-$1400 GPU to have enough memory for that.
Brushing these issues away is exactly why Nvidia always gets away with these antics. This isn't the first or second or third time this has happened.
The issue is that this is a full GB203 chip. So we may get more VRAM with the upcoming 3GB Samsung modules, but unless Nvidia is willing to produce a cut-down GB202 we won't get much more performance.
A 5080 would have been worth the buy if they had used a 384-bit bus and 24GB of VRAM. The way it is now, definitely not worth it. I'd wait some months for Nvidia to fill the enormous gap between the 5080 and 5090.
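Back-of-the-envelope capacity math for that, assuming the usual layout of one GDDR module per 32-bit slice of the memory bus (my sketch, not a leak):

```python
# VRAM capacity = (bus width / 32-bit channels) * capacity per module.
def vram_gb(bus_width_bits: int, module_gb: int) -> int:
    return (bus_width_bits // 32) * module_gb

print(vram_gb(256, 2))  # 5080 today: 256-bit bus, 2GB modules -> 16 GB
print(vram_gb(256, 3))  # same bus with 3GB (24Gbit) modules   -> 24 GB
print(vram_gb(384, 2))  # hypothetical 384-bit card, 2GB modules -> 24 GB
```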
Was it the Founders Edition? Those seem to have gone up more in value than AIB ones.
I thought I'd look into getting a used 4090 due to the availability of the 5090 and that the performance difference isn't particularly large... then I saw used 4090s selling for $2,600+.
Wtf is going on?
I thought I'd look into getting a used 4090 due to the availability of the 5090 and that the performance difference isn't particularly large... then I saw used 4090s selling for $2,600+.
Wtf is going on?
There are 5160x2160 monitors coming out this year where this card is already out of memory in games like Rift Apart.
DLAA
DLSS
DLSS + framegen
The last shot clearly shows the card out of memory. The wattage collapses while PCIe bus load goes through the roof.
I kinda like the looks of the Palit Master 5090s.
I am only interested in the FE and was not able to get one.
3rd Party cards are imo
![]()
Wait, really? Which monitors? I'd love to upgrade my G9 OLED to a newer version with that resolution.
Assuming I can get my hands on a 5090 that is.
I want the Founders Edition for the sole reason that it would work perfectly in a Fractal Ridge. If I got one, I'd make this
I am only interested in the FE and was not able to get one.
3rd Party cards are imo
![]()
Cheapest I've seen so far is £3400. It's meant to be £1939 here.
Not yet
I want the Founders Edition for the sole reason that it would work perfectly in a Fractal Ridge. If I got one, I'd make this
Very neat, but this is my absolute favourite.
What are you upgrading from?
My 5090 which I was told was supposed to be delivered this week, I've now been told they don't know when it'll be back in stock.
Motherfuckers you took my money 4 days ago. Fuck I'm so annoyed.
2080
What are you upgrading from?
If a less than $2300 MSRP model falls into my lap, then sure I'll upgrade since my 4090 will pay off most of it, if demand remains high.
My 4090 is more than catering to my needs until the next offering. Zero interest in the 5000 series.
You best make nice with it, cuz you two aren't getting divorced just yet!
2080
That sucks! Which card was it? I'm kinda scared that this will happen to me too.
My 5090 which I was told was supposed to be delivered this week, I've now been told they don't know when it'll be back in stock.
Motherfuckers you took my money 4 days ago. Fuck I'm so annoyed.
That sucks! Which card was it? I'm kinda scared that this will happen to me too.
It says it will arrive on the 13th of February, but I'm not holding my breath.
Good thing is that I've seen a few folks getting the same card over the last few days (MSI 5090 Gaming Trio), so at least it exists out there.
Same here. I have a 4080 super I can return by end of month, but still.
Oh man, WTF shall I buy instead.
I have everything on hand, the only part I need is a GPU
![]()
Tried the same, but the pre owned 4090 is up to 2000€
Same here. I have a 4080 super I can return by end of month, but still.
I'll try to get a 4090 used and wait it out. Or a 5080 once the 900 models are available. But that may take a while.
As others have said, I 100% believe a 5080ti with more RAM and maybe some factory OC/OV is in the works.
The current 5080 is laughable. It's just a 4080 super with some fancy (but still useless) features.
I’m giving my 4080 away…. to someone in my family.
Neogaf is family.
I'm giving my 4080 away…. to someone in my family.
I missed some at 1300-1500, now they're all about 1600-2000.
Tried the same, but the pre owned 4090 is up to 2000€
Hey, it's me, your long lost brother!
I'm giving my 4080 away…. to someone in my family.
No. It was like a Zotac or something.
Was it the Founders Edition? Those seem to have gone up more in value than AIB ones.
![]()
I get that. But this just seems totally unreasonable. And where'd all the 40 series stock go?
![]()
Supply and demand - Wikipedia
en.wikipedia.org
Well, someone lucked out at B&H Photo on my Asus TUF 5090 that I decided not to pick up after all; couldn't justify the price in the end. I FOMO'd it while out of town, for in-store pickup after returning from vacation. If anything I'll stick to my original plan and get a 5070ti when they release. I need to stop doing shit like this. Did it with the PS5 Pro as well.
poor and lazy optimization, supported by upscaling and framegen shenanigans
Anyone know where I can get a technical explanation for the newish trend of games being very vRAM heavy? It seemed to come along at the same time as the storage thing. Layman perspective would wonder if they're tied together? I think of a lot of games with absolutely glorious, high detail texture work that do not come close to cracking 16GB at 4k.
This is the most expensive graphical leap we've ever had. Red Dead 2, Arkham Knight and Uncharted 4 compared to STALKER 2, Silent Hill 2R and UE5 in general makes zero sense. Yeah it looks better, but the cost in vRAM and flops is so fucking high for the bump we got. I'd love it if some guy found a way to chart this visually.