
Report: RTX 5090 to include 32GB of VRAM

SolidQ

Member
they just care about other features like raw performance
Same as me; I only care about pure price/performance and don't care about RT or upscalers yet.

Here's an interesting poll with 60k+ voters:
[poll screenshot]
 

StereoVsn

Gold Member
That's copium; DLSS is huge. I'm considering getting a 5080 next gen, even though I generally really dislike Nvidia as a corporation, simply because AMD hasn't shown it can compete. If it had 24GB of VRAM I'd be there day one.
It looks like the 5080 will unfortunately be 16GB on a 256-bit interface, which kind of sucks.
 
I just want 16GB of VRAM on the 5060 so I could use it on my 1080p monitor for GTA VI, modded Fallout 5, and modded Elder Scrolls VI.

Currently still on the 1000 series

i7-5820K
1TB HDD
GTX 1070 (8GB VRAM)
16GB of system RAM
1080p monitor

Don't wanna upgrade since I'm good with the games I currently play, like World of Warcraft, Dead by Daylight, GTA V Online, and modded single-player Bethesda games.
 
From what I've played of PC gaming, 4GB of RAM did a lot; I think 8 or 16GB of RAM is safe.
8GB of VRAM was good for a while, but not anymore, even at 1080p. Get 16GB of VRAM just to be safe, even if you're on a 1080p monitor. Lots of newer games are pushing past 8GB of VRAM now.
 
Same as me; I only care about pure price/performance and don't care about RT or upscalers yet.

Here's an interesting poll with 60k+ voters:
[poll screenshot]
The 1000 series was the best; it lasted the longest of all the generations. I started with a GTX 260 -> GTX 670 -> 970 -> 1070 and haven't upgraded since. My next upgrade is going to be a 5060; I was going to go for the 70 tier, but with how overpriced things are now, I'm aiming for the 5060 instead of the 5070.
 

kevboard

Member
That's an interesting poll from TPU. Only 1.3% care about upscaling.
[poll screenshot]

developers themselves have noted that in games that support it, more than 90% of users turn DLSS on.

so maybe they "don't care" because they already have it; it is by now the default for them, and they don't even really think about it as something they "need".
if something is always there and available for you, of course after a while you stop realising just how much of a default setting it has become.

Imagine if one of the poll answers was "temporal anti-aliasing". I bet it would get even fewer votes, as it is already a given.


the fact that raster performance is above RT performance also tells me a lot.
a 4090 can already run rasterised graphics at such a high level of performance that there's almost nowhere to go from there anymore.

you can literally play many modern games like Death Stranding, Assassin's Creed Valhalla or Hitman 3 at native 8K at 30fps+ on that thing. soon mid-range cards will replace it and such raster performance will be the default.

all in a world where rasterised graphics have basically hit their limit, with mesh shaders, Nanite and similar tech slowly eliminating pop-in and giving us near-infinitely detailed assets.
In such a world, the only way to go from here is raytracing and ML-based reconstruction to help raytracing run faster on current hardware.

so to me these poll results make little sense both from a high end and a low to mid range point of view.

from a high-end point of view, raytracing is the only way forward. and from a low-to-mid-range point of view, ML-based reconstruction tech not only boosts image quality in the immediate future, but also helps keep the card relevant for longer.

TLDR: very weird poll results
 

Bojji

Member
PCMR have gotten soft.

In 2016: "Hahaha, look at you console peasants with your chequerboard upscaling, we in PCMR can run native resolutions and high framerates"

In 2024: "Actually guys, Image upscaling and frame generation is important for me and my $1200 GPU"

Nothing changes; the top GPU right now is more than 2x more powerful than the most powerful console. Anyone can play at native 4K, but why do that when you can have 50% (or more) better FPS with comparable image quality via DLSS?
 

Gaiff

SBI’s Resident Gaslighter
PCMR have gotten soft.

In 2016: "Hahaha, look at you console peasants with your chequerboard upscaling, we in PCMR can run native resolutions and high framerates"

In 2024: "Actually guys, Image upscaling and frame generation is important for me and my $1200 GPU"
Yet, DLSS1 got absolutely shat on and nobody wanted it. DLSS now is almost as good as native while being 30% more performant when using Quality mode. PC gamers are rightfully embracing it, but you want them to be morons by continuing to eschew upscaling because they didn't care about checkerboarding, which was nowhere near as good as DLSS?

Yeah, stupid post. "Don't use DLSS PCMR. Stick to native even though DLSS is almost as good but performs much better!"
 

ap_puff

Member
Yet, DLSS1 got absolutely shat on and nobody wanted it. DLSS now is almost as good as native while being 30% more performant when using Quality mode. PC gamers are rightfully embracing it, but you want them to be morons by continuing to eschew upscaling because they didn't care about checkerboarding, which was nowhere near as good as DLSS?

Yeah, stupid post. "Don't use DLSS PCMR. Stick to native even though DLSS is almost as good but performs much better!"
I think the danger here is devs being lazy and using it as a crutch; we already have default-on FSR in some games that makes them look like utter crap. AMD needs to get its shit together already; I can't believe they waited two years between FSR updates and it's still mostly trash.
 

Gaiff

SBI’s Resident Gaslighter
I think the danger here is devs being lazy and using it as a crutch; we already have default-on FSR in some games that makes them look like utter crap. AMD needs to get its shit together already; I can't believe they waited two years between FSR updates and it's still mostly trash.
Seems to be already happening.
 

kevboard

Member
Yet, DLSS1 got absolutely shat on and nobody wanted it. DLSS now is almost as good as native while being 30% more performant when using Quality mode.

yup. people forget that DLSS1 was worse than FSR2. it was absolutely horrifically bad.

PC players will brutally shit on your tech if it sucks, and embrace it if it works.

DLSS 2.0+ is almost literally free performance at this point. Having to only render 50-75% of the pixels while getting close to or better than native quality is just not something anyone should ignore.

it is of course dependent on the game. in extreme cases like Doom Eternal you can get image quality that beats the 4k image of the console versions while rendering only 25% of the pixels... like... how crazy is that?
so why shouldn't the "PC master race" jump on it?
"I can render 25% of your pixels and still outclass you!" what a flex is that!? work smarter not harder.
 

Buggy Loop

Member
With it drawing so much power, won’t it wear it out faster? It can’t be easy to crank through that much juice?

It's current density; it barely changes design to design, but node to node the density increases.

It also means a huge motherfucking chip, as a bigger area at that current density means more current going into the chip.

As long as they don't go overboard with performance tweaks and the current density stays at the design standard, we're just seeing the design's inherent load.

Now for that current to go through power cables… that’s something else entirely 😅
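
Read literally, the scaling point above is just that total current grows with die area at a fixed current density; as a rough sketch (symbols are mine, not from the post): I_total = J × A_die, so at the same current density J, doubling the die area doubles the total current the chip draws.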
 
Yet, DLSS1 got absolutely shat on and nobody wanted it. DLSS now is almost as good as native while being 30% more performant when using Quality mode. PC gamers are rightfully embracing it, but you want them to be morons by continuing to eschew upscaling because they didn't care about checkerboarding, which was nowhere near as good as DLSS?

Yeah, stupid post. "Don't use DLSS PCMR. Stick to native even though DLSS is almost as good but performs much better!"
Yeah, it's still funny how goalposts change.

Consoles were shit on before for using upscalers to run at higher resolutions than optimal with good enough performance. Nowadays, PC gamers are doing the same thing. It's just funny how things swing around.

I'm not saying not to use DLSS. Use it if you want. It shouldn't be needed though.
 

diffusionx

Gold Member
Yeah, it's still funny how goalposts change.

Consoles were shit on before for using upscalers to run at higher resolutions than optimal with good enough performance. Nowadays, PC gamers are doing the same thing. It's just funny how things swing around.

I'm not saying not to use DLSS. Use it if you want. It shouldn't be needed though.
I agree it shouldn't be needed, but when you have GPUs that come out every two years, and with massive price increases, then it's tough. And yeah, it's by design.

Like, what if this year we had a 6070 that ran as fast as the 4090 for $300? There'd be no real need for DLSS if that were the case.
 

Gaiff

SBI’s Resident Gaslighter
Yeah, it's still funny how goalposts change.

Consoles were shit on before for using upscalers to run at higher resolutions than optimal with good enough performance. Nowadays, PC gamers are doing the same thing. It's just funny how things swing around.
They haven't, and DLSS1 being ridiculed disproves the false narrative you're peddling. The goalposts haven't shifted. No one wanted DLSS1 because it sucked. People want DLSS 2 and 3 because they're good. You wanted people to embrace garbage tech?
 

nemiroff

Gold Member
That's an interesting poll from TPU. Only 1.3% care about upscaling.
[poll screenshot]
Eh, no. The voters could only vote for the one thing they care about most. So unless they were able to vote on multiple items, it does not mean they don't care about upscaling.

PCMR have gotten soft.

In 2016: "Hahaha, look at you console peasants with your chequerboard upscaling, we in PCMR can run native resolutions and high framerates"

In 2024: "Actually guys, Image upscaling and frame generation is important for me and my $1200 GPU"
That's not what happened at all. You have a disingenuous or faulty memory. Obviously, AI upscaling is a relatively mature technology now.
 

Bluntman

Member
5080 still gonna be 16GB for sure.

It will be, but not because Nvidia is cheap; it's because otherwise AI power users would buy them up. Like in the crypto-mining era, gamers wouldn't be able to buy GPUs.

That's why the 5090 will be in an even more completely different category than the 4090 before it.

And it's going to be extremely expensive, because Nvidia doesn't want to lose out on profits from pro users buying this instead of the $10k pro products.
 
Same as me; I only care about pure price/performance and don't care about RT or upscalers yet.

Here's an interesting poll with 60k+ voters:
[poll screenshot]
As of today, there are more than 100 games with RT support, and new RT games are being added every week. People who buy a new GPU right now should think about RT performance, because what's the point of buying a new PC for old games when even an old and cheap GPU will run those games? Some games no longer even support pure raster, and Nvidia cards are crushing AMD when it comes to RT performance.

For example, I compared the performance of my 4080S to the 7900 XTX in Black Myth: Wukong, and I get 240% better performance with RT in this game.
 
yup. people forget that DLSS1 was worse than FSR2. it was absolutely horrifically bad.

PC players will brutally shit on your tech if it sucks, and embrace it if it works.

DLSS 2.0+ is almost literally free performance at this point. Having to only render 50-75% of the pixels while getting close to or better than native quality is just not something anyone should ignore.

it is of course dependent on the game. in extreme cases like Doom Eternal you can get image quality that beats the 4k image of the console versions while rendering only 25% of the pixels... like... how crazy is that?
so why shouldn't the "PC master race" jump on it?
"I can render 25% of your pixels and still outclass you!" what a flex is that!? work smarter not harder.
I like to use both DLSS Balanced and DLDSR at the same time. The image quality always looks better than native TAA or DLAA, because DLDSR is basically a downsampled image, but without the performance penalty thanks to the DLSS Balanced upscaling. If the game has a good DLSS implementation, even DLSS Ultra Performance + DLDSR looks better than native TAA, and I get 2x better performance in such games.
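
As a rough illustration of how that combo works out in pixel terms, here's a sketch with hypothetical example numbers (a 1440p monitor, the 2.25x DLDSR factor, and an assumed 0.58 per-axis scale for DLSS Balanced; actual factors vary by setup):

```python
import math

def dldsr_plus_dlss(native_w, native_h, dldsr_factor=2.25, dlss_scale=0.58):
    # DLDSR sets a target resolution with `dldsr_factor` times the native pixel
    # count and downsamples it to the monitor; DLSS then feeds that target from
    # a lower internal render resolution.
    axis = math.sqrt(dldsr_factor)
    target_w, target_h = round(native_w * axis), round(native_h * axis)
    internal_w, internal_h = round(target_w * dlss_scale), round(target_h * dlss_scale)
    return {
        "DLDSR target": (target_w, target_h),
        "DLSS internal render": (internal_w, internal_h),
        "internal pixels vs native": (internal_w * internal_h) / (native_w * native_h),
    }

print(dldsr_plus_dlss(2560, 1440))
```

With those assumed numbers, the game is presented from a downsampled 3840x2160 target while only ~2227x1253 (about three quarters of the native 1440p pixel count) is actually rendered, which is the "downsampling without the usual performance penalty" effect the post describes.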
 

Thebonehead

Gold Member
PCMR have gotten soft.

In 2016: "Hahaha, look at you console peasants with your chequerboard upscaling, we in PCMR can run native resolutions and high framerates"

In 2024: "Actually guys, Image upscaling and frame generation is important for me and my $1200 GPU"
[How Dare You Greta GIF]
 

SweetTooth

Gold Member
Nothing changes; the top GPU right now is more than 2x more powerful than the most powerful console. Anyone can play at native 4K, but why do that when you can have 50% (or more) better FPS with comparable image quality via DLSS?
For 4x the price... fantastic value 🤣
 

FireFly

Member
Yeah, it's still funny how goalposts change.

Consoles were shit on before for using upscalers to run at higher resolutions than optimal with good enough performance. Nowadays, PC gamers are doing the same thing. It's just funny how things swing around.

I'm not saying not to use DLSS. Use it if you want. It shouldn't be needed though.
It's coming back around with PSSR.
 

Bojji

Member
For 4x the price... fantastic value 🤣

I never said that the 4090 is good value :)

Where I live, I can buy a Pro for:

[PS5 Pro price screenshot]


A 4090 for:

[4090 price screenshot]


That's 2.4x more. The performance difference between the 4090 and the 7700 XT is 2.2x:

[performance comparison chart]


Between the 4090 and the 4070 (you could say the Pro has more feature parity with this one), it's exactly 2x.

Still bad value, but not 4x. It probably depends on where you live.
 

Rivdoric

Member
What I do wonder is how they managed to squeeze a 600W GPU into a 2-slot cooler when they used a 3-slot one for the 450W 4090.

I know they projected a 600W GPU for the 4090, went with 450W in the end, and didn't change the cooling design, but the question still stands.
 

SweetTooth

Gold Member
I never said that the 4090 is good value :)

Where I live, I can buy a Pro for:

[PS5 Pro price screenshot]


A 4090 for:

[4090 price screenshot]


That's 2.4x more. The performance difference between the 4090 and the 7700 XT is 2.2x:

[performance comparison chart]


Between the 4090 and the 4070 (you could say the Pro has more feature parity with this one), it's exactly 2x.

Still bad value, but not 4x. It probably depends on where you live.

Oh... I didn't know you could play games on the GPU alone! News to me.

Thanks
 

Dr.D00p

Member
This is a scary response to my comment. Things are getting weird.

Why?

The 90 class is meant for people for whom dropping $2K+ on a GPU is no more significant than you or me dropping $5 on a cheeseburger.

It sucks that PC gaming is like that now but it is what it is.

..and besides, despite all the hype, 90 class sales only represent 1-2% of the GPU market.
 

Bojji

Member
Oh... I didn't know you could play games on the GPU alone! News to me.

Thanks

The discussion was about the GPU alone so far.

For someone that already has a PC with a good enough CPU, only a GPU is needed for an upgrade (plus you get money from selling the old GPU).

Anyway, the xx90 series is not for people who seek value. The 5090 will have an even more ridiculous price.
 

SweetTooth

Gold Member
The discussion was about the GPU alone so far.

For someone that already has a PC with a good enough CPU, only a GPU is needed for an upgrade (plus you get money from selling the old GPU).

Anyway, the xx90 series is not for people who seek value. The 5090 will have an even more ridiculous price.

The bolded can be applied to the PS5 for people upgrading to the Pro?! So my original point is still valid; anyway, whatever suits you, man.

Also, I can apply your end quote to the PS5 Pro and say that it's not for people who seek value too!!

It's crazy to me that you reached this conclusion and yet you are in every Pro thread crying about its price 🤣
 

Bojji

Member
The bolded can be applied to the PS5 for people upgrading to the Pro?! So my original point is still valid; anyway, whatever suits you, man.

Also, I can apply your end quote to the PS5 Pro and say that it's not for people who seek value too!!

It's crazy to me that you reached this conclusion and yet you are in every Pro thread crying about its price 🤣

You can't change just one or two things inside a PS5 to get a Pro; you need to change the whole console and buy a disc drive to reach feature parity with the old system.

Check out the PS5 price discussion here; most people don't like the price of this thing at all.
 

Gaiff

SBI’s Resident Gaslighter
Oh, look, another thread is devolving into the usual PS5 Pro vs PC pissing match.
 

Bojji

Member
Oh, look, another thread is devolving into the usual PS5 Pro vs PC pissing match.

It's inevitable; this forum is like ~70% PlayStation owners, so every thread will always get some console elements in it.

Back to the topic: what's the predicted price for the 5080, $1,000 or $1,200? I don't expect anything lower from a company that doesn't have any competition and is focused on the AI market right now (GPUs are like their hobby at this point, hahaha).
 

MarV0

Member
Why?

The 90 class is meant for people for whom dropping $2K+ on a GPU is no more significant than you or me dropping $5 on a cheeseburger.

It sucks that PC gaming is like that now but it is what it is.

..and besides, despite all the hype, 90 class sales only represent 1-2% of the GPU market.
You couldn't be further from the truth.

Taking the average US salary of $60k as a baseline, for $2,000 to feel like $5 someone would need an annual salary of $24 million. Very few people earn that, and I can assure you they are not into video games and graphics cards.
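
As a quick check of that figure using the numbers above: $2,000 / $5 = 400, and 400 × $60,000 = $24,000,000.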

Most people buying 4090s and 5090s are loading up their credit cards. That is the sad reality.
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
Why?

The 90 class is meant for people for whom dropping $2K+ on a GPU is no more significant than you or me dropping $5 on a cheeseburger.

It sucks that PC gaming is like that now but it is what it is.

..and besides, despite all the hype, 90 class sales only represent 1-2% of the GPU market.

It's scary for the part that I bolded. It's the fact that you so easily accept it that's scary. $2,000 for a GPU should never be okay.
 
No way, LOL! Are you serious or are you being funny?

If it's the cut that has 32 GB, yeah it's going to be more. I'd say $2499.

The earlier rumors speculated that they were going to use a deeper-cut die with the memory bus also cut down to 448-bit. They might still use that in some other product. That one, I think, would be $1999.

Do keep in mind that most of the GB202 sales are going to be workstation/AI products, which will be double or triple the price of the 5090.
 
Same as me; I only care about pure price/performance and don't care about RT or upscalers yet.

Here's an interesting poll with 60k+ voters:
[poll screenshot]

That actually gels with half of Sony's user base being on the PS4. They aren't playing the latest games; they're playing Fortnite and GTA V.
 