
Doesn't the RTX "5080" look suspiciously like our late RTX 4080 12GB, the one that ended up being the 4070 Ti?

FingerBang

Member
This is ridiculous. Ada was a scam generation where they basically raised the price of each tier by naming it the tier above. It would have been even worse had they gotten away with naming what is now called the 4070 Ti a 4080.

If this leak is true, Nvidia will do the same thing again, making players lose another tier by selling the xx70 Ti as a 5080.

Compared to Ampere you are basically paying overinflated xx80 prices for a (non-Ti) xx70. I don't even know what is going to happen on the low end, but I imagine you'll again just be gaining a single tier of performance per generation. A 5060 will perform like a 4070, a 5070 like a 4080, and so on. It sounds good, but it's half of what we used to get.

The 5090 sounds like a beast but I really don't know why I would upgrade from a 4090 for gaming.
 

Astray

Member
At some point they will get too expensive for the average buyer, and that will hold up game development because people can't afford GPUs with cutting-edge features. A low amount of VRAM will be the first real issue to arise.
Low VRAM will not become an issue until the new consoles come out. As long as the current consoles are the target, 16GB of VRAM will cover anything heavy, and nothing more will be needed.
 

Bojji

Member
I was gonna wait for the 5070 but said fuck it and bought a 4070 Ti Super.

This GPU has all the things I need, plus selling my 3080 Ti covers 3/5 of the money already.

The equivalent GPU from the 5xxx series is ~8 months away.
 

Nickolaidas

Member
My opinion as a 4080 owner: do not buy the 16GB version if you are using it for 4K; try to wait for a 24GB version unless you plan to upgrade again next gen. You basically won't have any headroom, and for a product that will likely be $1,000-plus, that feels like Nvidia jerking consumers around again.
Oh yeah. I'm definitely going for 24GB this time around.
 

FireFly

Member
This is ridiculous. Ada was a scam generation where they basically raised the price of each tier by naming it the tier above. It would have been even worse had they gotten away with naming what is now called the 4070 Ti a 4080.

If this leak is true, Nvidia will do the same thing again, making players lose another tier by selling the xx70 Ti as a 5080.

Compared to Ampere you are basically paying overinflated xx80 prices for a (non-Ti) xx70. I don't even know what is going to happen on the low end, but I imagine you'll again just be gaining a single tier of performance per generation. A 5060 will perform like a 4070, a 5070 like a 4080, and so on. It sounds good, but it's half of what we used to get.

The 5090 sounds like a beast but I really don't know why I would upgrade from a 4090 for gaming.
The price per transistor isn't coming down significantly with new process nodes, so the only real levers Nvidia has to improve performance at the same margins are raising clock speeds and optimizing the architecture. That will give you one extra performance tier per generation.
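A rough back-of-the-envelope of that point (all the gain figures here are hypothetical, chosen only to illustrate the mechanism, not actual Blackwell numbers):

```python
# If cost per transistor is flat across nodes, a same-price die gets
# roughly the same transistor budget, so performance gains must come
# from clock speed and IPC (architecture). Hypothetical numbers only.

transistor_gain = 1.00  # same-price die -> same transistor budget
clock_gain = 1.15       # e.g. a 15% clock bump from the new node
ipc_gain = 1.10         # e.g. a 10% architectural (IPC) improvement

perf_gain = transistor_gain * clock_gain * ipc_gain
print(f"Same-price performance gain: +{(perf_gain - 1) * 100:.0f}%")
# -> about +26%, i.e. roughly one performance tier per generation
```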
 

64bitmodels

Reverse groomer.
Compared to Ampere you are basically paying overinflated xx80 prices for a (non-Ti) xx70. I don't even know what is going to happen on the low end, but I imagine you'll again just be gaining a single tier of performance per generation. A 5060 will perform like a 4070, a 5070 like a 4080, and so on. It sounds good, but it's half of what we used to get.
Personally I'd be fine with this if it weren't for the fact that the 5060 will be equipped with a 128-bit bus and 8GB of RAM, ensuring that it'll last two years and then have to run low settings in every game after 2027, because its small 8GB of VRAM can't handle the megatextures and RT every new game demands.
 

Bojji

Member
Personally I'd be fine with this if it weren't for the fact that the 5060 will be equipped with a 128-bit bus and 8GB of RAM, ensuring that it'll last two years and then have to run low settings in every game after 2027, because its small 8GB of VRAM can't handle the megatextures and RT every new game demands.

Maybe by the time it releases, 3GB GDDR7 chips will be available and it will be 12GB on a 128-bit bus.
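The arithmetic behind that hope, as a quick sketch: GDDR chips each sit on a 32-bit channel, so capacity is simply (bus width / 32) × chip density.

```python
# VRAM capacity from bus width and chip density: each GDDR6/GDDR7
# chip occupies a 32-bit slice of the memory bus.
def vram_capacity_gb(bus_width_bits: int, chip_density_gb: int) -> int:
    chips = bus_width_bits // 32  # one chip per 32-bit channel
    return chips * chip_density_gb

print(vram_capacity_gb(128, 2))  # today's 2GB chips on 128-bit -> 8GB
print(vram_capacity_gb(128, 3))  # 3GB GDDR7 chips on 128-bit  -> 12GB
print(vram_capacity_gb(256, 2))  # rumored 5080: 256-bit, 2GB  -> 16GB
```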
 

zeroluck

Member
It makes perfect sense, they are not gonna give you a >400mm² die on N4P for ~$900 lol. The 5080 is the best you're gonna get, probably the same die size as the 4080. If it is really 10% faster than the 4090, this is Maxwell 2.0.
 

Bojji

Member
So how much VRAM do you need for 4K DLSS Quality at 60fps? Because that's my sweet spot.

Depends on the game. UE5 games can run within 8GB of VRAM, while some games (Ratchet & Clank, Cyberpunk with PT, and AW2 with PT) can go to ~14-15GB.

I think 16GB is safe until PS6 arrives.
 

Kenpachii

Member
Compared to Ampere, the 5080 isn't an x80-class card.
Compared to the old x90 cards before the 3090/4090, the 5080 is an x80-class card, as it's half the spec of a 5090.
 
Low VRAM will not become an issue until the new consoles come out. As long as the current consoles are the target, 16GB of VRAM will cover anything heavy, and nothing more will be needed.
I didn't say today or even tomorrow, but if we continue on this curve we'll get there eventually.
 

kevboard

Member
I didn't say today or even tomorrow, but if we continue on this curve we'll get there eventually.

I don't think so.
we will see VRAM requirements plateau soon. we already have ridiculously high-resolution textures, and we already target 4K with 16GB of VRAM, and do so easily.

I think 24GB will eventually be optimal, but that will take a long time, and I don't think it will go up much from there either.

there are still modern games that run OK on 4GB of VRAM. That's less memory than the last-gen consoles had.
I bet you can play the Silent Hill 2 remake on 4GB, for example, and that game targets current gen only. You'll be more limited by the actual GPU performance than by the VRAM amount there.
 

hyperbertha

Member
The distance between the haves and have-nots in the PC space is increasing dramatically. The great filter to keep the master race pure and force the illegal console immigrants to eat cake/5080s/half the performance.
This is why we must vote Lisa Su into power. Make AMD cards great again.
 

SolidQ

Member
That's always been known info.
 

KungFucius

King Snowflake
I think this is the case: they learned to rebrand the shit card as the good card, rather than rebrand it into the card it should be (70-class).

Hugely scummy by NVIDIA. You would think that with all this AI money they would just keep pumping out solid generations of good cards at acceptable prices, just to keep consumer sentiment positive; instead they let the business grads squeeze consumers for everything while the cost of living is so high.
They just want to push those who can afford the xx90 card in that direction. It sucks, since the xx80 used to be good enough, but for those of us stupid enough to buy a 3090 when the 3080 was less than half the cost and 85% of the performance, at least we are getting a better slice of value from the xx90 cards now. Of course, I expect them to jack the price to $2K for the 5090 FE, and ASUS will charge $2,500 for a 100 MHz overclock on their base model, only sell the OC ones, and the Strix will actually cost $5,090.

Regardless, a failed septic system and a dead HVAC unit in the last 8 months are keeping me on my 4090 for the foreseeable future.
 
16GB on a 2025 flagship gaming card is insulting.
I guess. It does have significantly more bandwidth, being GDDR7, as well as being on PCIe 5, so surely there is some sort of trade-off there. Even the 4090 at 24GB performs ~75% better than the 3090 at 24GB, so there has to be a correlation between bandwidth, speed, and the amount of memory required. Technology advances all the time; I'm sure one of the world's biggest companies knows that.

I know this isn't the best site to use, but:
 

rm082e

Member
This is ok, right?

Totally depends on who you ask. Given we're halfway through the current console generation, I think most PC players looking to buy a new GPU would expect to have as much RAM as the current consoles. Anything less is "budget" GPU territory in the minds of most gamers. Since the consoles have 16GB, it doesn't seem unreasonable to expect the mid-tier card (which has traditionally been the Nvidia 70 series) to have that amount.

To me, the 5080 feels like it should have 20GB, and the 5070 16GB. Now, can I use data and stats to justify that feeling? Nope. But it's still how I feel. Seeing the recent rumor of the 5080 at literally half the specs of the 5090 was very deflating for me as a 3080 10GB owner who is thinking about upgrading. If it had been 70% of the 5090's spec, I would be more excited.

But we have to see how it all works out once the products are out and can be tested. Just having more RAM doesn't make a card more performant, so we have to see how these products do in benchmarks and find out whether the RAM limits become bottlenecks. I'll keep an open mind and continue to set a little money aside, knowing I'll get something eventually, even if it's a used 4090 on eBay.
 
Totally depends on who you ask. Given we're halfway through the current console generation, I think most PC players looking to buy a new GPU would expect to have as much RAM as the current consoles. Anything less is "budget" GPU territory in the minds of most gamers. Since the consoles have 16GB, it doesn't seem unreasonable to expect the mid-tier card (which has traditionally been the Nvidia 70 series) to have that amount.

To me, the 5080 feels like it should have 20GB, and the 5070 16GB. Now, can I use data and stats to justify that feeling? Nope. But it's still how I feel. Seeing the recent rumor of the 5080 at literally half the specs of the 5090 was very deflating for me as a 3080 10GB owner who is thinking about upgrading. If it had been 70% of the 5090's spec, I would be more excited.

But we have to see how it all works out once the products are out and can be tested. Just having more RAM doesn't make a card more performant, so we have to see how these products do in benchmarks and find out whether the RAM limits become bottlenecks. I'll keep an open mind and continue to set a little money aside, knowing I'll get something eventually, even if it's a used 4090 on eBay.
Sure, but the current consoles don't have other RAM to fall back on and can't use their full amount of VRAM due to OS constraints.

But here is the thing: sure, the PS5 Pro has 16GB, but only 12.5GB of that GDDR6 is usable, running at 560GB/s on AMD's tech, which isn't the greatest. It uses GDDR6 for both the OS and gaming on a PCIe 4 bus. In the future, I could see the PS6 having 20GB of GDDR7 on PCIe 5 with only 16GB usable, which puts the 5080 in line with the PS6 on memory while still outperforming it. But by the time the PS6 arrives, we will have the 60 series, which will also be at 20GB of GDDR7X or GDDR8 on PCIe 5, or PCIe 6 if it's established, though I don't expect that until the 70 series.

The 5080, on the other hand, runs 16GB of GDDR7 at 896GB/s in desktops that may also have 16-128GB of DDR5 or DDR6 (when it arrives), with super-fast SSDs that can utilize DirectStorage on a PCIe 5 bus if need be.

Not only this, but consoles are meant to be energy efficient and not draw a ton of power, so the PS5 Pro is sitting at 200W, whereas my 5080 can utilize 400W on its own while my CPU uses around the same. It's not really apples to apples.

I still stand by the fact that the 4090 compared to the 3090 is night and day with the same amount of RAM, but people need to look at bandwidth and speed. These video cards churn data in and out at a fast rate, which may mean less RAM is needed to accomplish the job.
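For reference, all these bandwidth figures fall out of one formula: bus width (bits) × per-pin data rate (Gbps) ÷ 8. A quick sketch (the 28 Gbps GDDR7 rate is what the rumored 896 GB/s figure implies for a 256-bit bus, not a confirmed spec):

```python
# Memory bandwidth in GB/s = bus width (bits) * per-pin rate (Gbps) / 8
def bandwidth_gb_s(bus_width_bits: int, pin_rate_gbps: float) -> float:
    return bus_width_bits * pin_rate_gbps / 8

print(bandwidth_gb_s(256, 28))  # rumored 5080: 256-bit GDDR7 -> 896.0
print(bandwidth_gb_s(384, 21))  # 4090: 384-bit GDDR6X -> 1008.0
```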
 

rm082e

Member
Sure, but the current consoles don't have other RAM to fall back on and can't use their full amount of VRAM due to OS constraints.

But here is the thing: sure, the PS5 Pro has 16GB, but only 12.5GB of that GDDR6 is usable, running at 560GB/s on AMD's tech, which isn't the greatest. It uses GDDR6 for both the OS and gaming on a PCIe 4 bus. In the future, I could see the PS6 having 20GB of GDDR7 on PCIe 5 with only 16GB usable, which puts the 5080 in line with the PS6 on memory while still outperforming it. But by the time the PS6 arrives, we will have the 60 series, which will also be at 20GB of GDDR7X or GDDR8 on PCIe 5, or PCIe 6 if it's established, though I don't expect that until the 70 series.

The 5080, on the other hand, runs 16GB of GDDR7 at 896GB/s in desktops that may also have 16-128GB of DDR5 or DDR6 (when it arrives), with super-fast SSDs that can utilize DirectStorage on a PCIe 5 bus if need be.

Not only this, but consoles are meant to be energy efficient and not draw a ton of power, so the PS5 Pro is sitting at 200W, whereas my 5080 can utilize 400W on its own while my CPU uses around the same. It's not really apples to apples.

I still stand by the fact that the 4090 compared to the 3090 is night and day with the same amount of RAM, but people need to look at bandwidth and speed. These video cards churn data in and out at a fast rate, which may mean less RAM is needed to accomplish the job.

I agree with everything you said here.

Also, PCMR! BIGGER NUMBERZ ARE BETTER NUMBERZ!!!1! :messenger_open_mouth:
 
Seems like NVIDIA is trying to pull the same tactic, but they've wised up this time; they won't have another real 5080.

Below are the rumored specs for the 5080 and 5090.

[image: rumored RTX 5080 and 5090 spec sheet]


These are the specs for the RTX 4070 Ti, aka the 4080 12GB:

[image: RTX 4070 Ti spec sheet]


Compared to the 4090 and 5090, respectively (a quick sketch of these ratios follows the list):

CUDA Cores: RTX 4070 Ti has 7680 (47%). The RTX 5080 has 10752 (49%)
Bandwidth: RTX 4070 Ti has 504 GB/s (50%). The RTX 5080 has 896 GB/s (50%)
Memory Bus: RTX 4070 Ti has 192-bit (50%). RTX 5080 has 256-bit (50%)
VRAM Capacity: RTX 4070 Ti has 12GB (50%). RTX 5080 has 16GB (50%)
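Here's that comparison worked out; the 4090 figures are shipped specs, while the 5090/5080 figures are the unconfirmed rumored numbers above:

```python
# Each card's spec as a fraction of its generation's flagship.
# Order: CUDA cores, bandwidth (GB/s), bus width (bits), VRAM (GB).
specs = {
    "4070 Ti vs 4090": [(7680, 16384), (504, 1008), (192, 384), (12, 24)],
    "5080 vs 5090":    [(10752, 21760), (896, 1792), (256, 512), (16, 32)],
}

for pair, ratios in specs.items():
    print(pair, "->", ", ".join(f"{a / b:.0%}" for a, b in ratios))
# 4070 Ti vs 4090 -> 47%, 50%, 50%, 50%
# 5080 vs 5090    -> 49%, 50%, 50%, 50%
```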

Rumor is that this 5080 will perform 10% better than the 4090. If that's the case, the 5090 could be well over 50% faster than the 4090, making the gap between it and the 5080 even larger than the one between the 4090 and 4080. This gives NVIDIA enough space to slot in a 5080 Super, 5080 Ti, and 5080 Ti Super between them and price them accordingly.

Of course, we are also missing the clocks. For all we know, the 5080 could be clocked much higher, resulting in a narrower performance gap than the specs would lead you to believe. Still, I'm wary of this one; those specs seem low for an 80-class card compared to the flagship.
50 series will have better performance simply because DLSS 4 is exclusive to it.
 

Puscifer

Member
It's called being Nvidia; they can squeeze gamers as much as they like. What are you going to do, go to AMD? 😂
If I want PC gaming to stay a high-end alternative, yes. Nvidia is too much, I'm sorry. How crazy is it that I'm giving up a single-digit percentage of my yearly salary for a single PC component? Ten years ago you could build a high-end PC for around $1,000-1,100. I get that things change and prices go up; I could even understand $1,400-1,500 for the same class of PC, but for a single component? No, that's insanity. As it stands, this is my last build anyway, and the PS5 is seemingly my last console; I'm having more fun playing older titles anyway. But I can fully say that yes, console owners looking at the current bar of entry, where a thousand dollars gets you a low-end 4060/7600 PC, don't find it appealing, and they're right not to go in that direction.
 

hinch7

Member
It's called being Nvidia; they can squeeze gamers as much as they like. What are you going to do, go to AMD? 😂
Considering how many games are using UE5 or intend to, and the recent advancements there in Lumen and MegaLights, that's not really a bad option for people who can't step up to an Nvidia card with good performance and 16GB+ of VRAM.
 

3liteDragon

Member
Yes there is

- GB205 chip has 6400 cores MAX (Full Chip)
- Original 4070 has 5888 cores (AD104)
- 4070 SUPER has 7168 (AD104)
- 4070 Ti has 7680 (Full AD104)
Must be due to architectural improvements; it all comes down to the benchmarks.
 