
GeForce RTX 5090 is $1,999 5080 $999 5070 Ti $749 5070 $549 (Availability Starting Jan 30 for RTX 5090 and 5080)

SlimySnake

Flashless at the Golden Globes
Sigh, unfortunately I'm going to have to sit this one out.
I'm trying to save for a house this year and so sacrifices must be made.
DLSS is going to have to carry my RTX 3090.
Think of me when you guys are gaming at 4K 240hz.

snake-mgs.gif
Congrats in advance.
 

Bulletbrain

Member
Sounds like an absolute load of horse bollocks frankly. Magical power of 12GB?
It's just the effect of MFG vs FG. Scenario: game A runs 60fps native, 120fps with FG on a 4090. The same game at the same settings on a 5070 maybe runs 30fps native (half the frame rate), but will still reach 120fps with MFG.
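The arithmetic behind that scenario can be sketched like this (assuming FG inserts 1 generated frame per rendered frame and MFG inserts 3, and ignoring frame-generation overhead, which in practice reduces the native rate a little):

```python
def displayed_fps(native_fps: float, generated_per_rendered: int) -> float:
    """Displayed frame rate when each rendered frame is followed by
    `generated_per_rendered` AI-generated frames (overhead ignored)."""
    return native_fps * (1 + generated_per_rendered)

# Hypothetical 4090 with 2x frame generation: 60fps native -> 120fps
print(displayed_fps(60, 1))  # 120.0
# Hypothetical 5070 with 4x multi frame generation: 30fps native -> 120fps
print(displayed_fps(30, 3))  # 120.0
```

Both land on the same displayed 120fps even though one card is rendering half as many real frames, which is the whole trick.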
 
Last edited:

sendit

Member
With multi frame gen, what is the point of a 5090? If I can get a consistent frame-locked 120FPS at 4K with all the bells and whistles on, I would be happy.
 
Last edited:

Alexios

Cores, shaders and BIOS oh my!
Titan X launched @ $999 in 2015 ($1,305 with inflation, as if wages adjusted lol) and was the overkill card you didn't need (only for the hardcorest, since you'd expect the next high-end model to hit that performance for 40% off the cost). Now that pricing is embedded in the regular tiers you need for decent perf.
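The inflation figure roughly checks out (a sketch assuming ~30.6% cumulative inflation from 2015 to 2025; the exact factor depends on which CPI series and end month you use):

```python
def adjust_for_inflation(price: float, cumulative_inflation: float) -> float:
    """Scale a historical price by cumulative inflation, e.g. 0.306 = 30.6%."""
    return price * (1 + cumulative_inflation)

# Titan X launch price, assumed ~30.6% cumulative inflation since 2015
print(round(adjust_for_inflation(999, 0.306)))  # 1305
```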
 
Last edited:

Mikado

Member
That's not nearly as bad as I was afraid of. I'm still rocking a 2070 Super that demolishes everything thrown at it (granted, at only 1080p). Looks like the 5070 Super is in the right range price-wise, so long as it's - you know - not crap.
 
Last edited:
Still not paying $550 for a GPU with 12GB of VRAM.

Dammit, I was thinking maybe upgrade my 6900XT but not for a card with 12gb.

Yeah, the prices are still rip-offs; they just didn't make them as overpriced as people feared, and that's how they'll get people to cheer for it. It's a good trick.

The RTX 5070 at $550 should really have had 16GB. 12GB is low end in 2025 and should have been reserved for something like a 5060 at $300 to $350 max.
 

Bojji

Member
That's literally what they did with the 3070, which was released for $499 and performed the same as a $1,200 2080 Ti.

But that was actual raster/RT performance across the board. This time, to reach that, they need to generate multiple frames IN GAMES THAT SUPPORT IT.

From their own slides we can work out the actual 5070 performance, which is probably only slightly more than half of a 4090.
 

Gubaldo

Member
According to NVIDIA's slide, the 5070 looks to be ~45% faster than the 4070 in A Plague Tale: Requiem.
That would put it close to a 4080, at around 96fps.
RT_APTR-p.webp
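Back-solving the baseline implied by those two numbers (a sketch; the 4070 figure below is inferred from the post's claim, not read off the slide):

```python
fps_5070 = 96    # claimed 5070 result in A Plague Tale: Requiem
uplift = 1.45    # "~45% faster than 4070" per the slide

# Implied 4070 baseline if both numbers are taken at face value
fps_4070 = fps_5070 / uplift
print(round(fps_4070, 1))  # 66.2
```

So the claim is internally consistent with a 4070 running the game in the mid-60s.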
 
Last edited:

Xyphie

Member
5090 price is surprisingly good I have to say. 5090 only being 2x the 5080 is really good considering it's essentially "2x 5080" in terms of hardware; if it scales well I might actually eat noodles for a month. Otherwise the 5070 Ti is probably the smart buy before Super models with 24Gbit GDDR7 hit, as it should be 70SM/84SM = ~83% of the 5080 perf at 75% of the price.
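The ratio math works out as claimed (a sketch assuming performance scales linearly with SM count, which it rarely does exactly; clocks and bandwidth also matter):

```python
sm_5070ti, sm_5080 = 70, 84          # SM counts from the post
price_5070ti, price_5080 = 749, 999  # announced USD prices

perf_ratio = sm_5070ti / sm_5080     # naive linear-scaling estimate
price_ratio = price_5070ti / price_5080
print(f"{perf_ratio:.0%} of the perf at {price_ratio:.0%} of the price")
# 83% of the perf at 75% of the price
```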

5090 is going to have 1080 Ti-like longevity given it will likely have equal or more VRAM than a higher-end PS6 (256-bit, 24-32GB?) and higher shader perf.
 
Last edited:

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Don't know what Kepler is trying to say. Those are red words; plans have changed multiple times.
For those who don't know, white text on MLID's charts is not confident info.

MLiD throws shit at the wall, sees what sticks, then comes out and says "ohh, see, I was right" that one time out of 100.
He doesn't get briefed about dick.
What he reports as confident is literally what the likes of Kepler, Mega and Kopite post, then he tries to take credit for it.
 

Heisenberg007

Gold Journalism
5090 price is surprisingly good I have to say. 5090 only being 2x the 5080 is really good considering it's essentially "2x 5080" in terms of hardware; if it scales well I might actually eat noodles for a month. Otherwise the 5070 Ti is probably the smart buy before Super models with 24Gbit GDDR7 hit, as it should be 70SM/84SM = ~83% of the 5080 perf at 75% of the price.

5090 is going to have 1080 Ti-like longevity given it will likely have equal or more VRAM than a higher-end PS6 (256-bit, 24-32GB?) and higher shader perf.
When do you think the Super models will drop?
 

Buggy Loop

Gold Member
I notice AMD canceled the announcement of their new GPU today :ROFLMAO:

I wonder if it's because they caught wind of Nvidia lowering the prices and quickly decided to withdraw rather than be humiliated again

Lisa and Jensen are from the same family

Maybe he whispered what was coming at CES at holiday dinner :messenger_smirking:

If I was AMD tonight, I think I would collapse on the floor

They've basically redone the whole fucking pipeline of how GPUs work in games: AI, shaders, BVH structure, path tracing, texture management.

Path tracing on Nvidia cards with the above upgrades will suddenly get way faster, and ALL RTX cards will get it from what I read. Neural Shaders is the one I'm not too sure about, but RTX Mega Geometry and Neural Radiance Cache will already help tremendously.
 
When you compare the CUDA core counts between last gen and this gen, you get a big bump from 4090 to 5090 (16,384 to 21,760) but only a very small bump from 4080 to 5080 (9,728 to 10,752) and 4070 to 5070 (5,888 to 6,144).
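Putting percentages on those core counts makes the gap obvious (raw counts only; clocks and architectural changes also factor into real performance):

```python
# CUDA core counts: (previous gen, new gen), from the announced specs
pairs = {
    "4090 -> 5090": (16384, 21760),
    "4080 -> 5080": (9728, 10752),
    "4070 -> 5070": (5888, 6144),
}

for name, (old, new) in pairs.items():
    print(f"{name}: +{new / old - 1:.1%}")
# 4090 -> 5090: +32.8%
# 4080 -> 5080: +10.5%
# 4070 -> 5070: +4.3%
```

The flagship got roughly a third more cores while the mid tiers got single-digit bumps, which is why the marketing leans so hard on frame generation.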
 

Buggy Loop

Gold Member
This is all new

RTX Neural Shaders​

Train and deploy neural networks within shaders to unlock new compression and approximation techniques for next-generation asset generation.

RTX Neural Texture Compression​

Use AI to compress textures with up to 8x VRAM improvement at similar visual fidelity to traditional block compression at runtime.

RTX Texture Filtering​

Randomly samples textures after shading and filters difficult volumes, reducing artifacts and improving image quality.

RTX Neural Materials​

Use AI to compress shader code of complex multi-layered materials for up to 8X faster material processing to bring real-time performance to film-quality assets.

RTX Mega Geometry​

Accelerate BVH building for cluster-based geometry systems, enabling up to 100x more ray-traced triangles and better performance in heavily ray-traced scenes.

RTX Global Illumination (RTXGI)​

Scalable solution to compute multi-bounce indirect lighting.
  1. Neural Radiance Cache (NRC): Use AI to predict the amount of light that's emitted from or passing through a specific area
  2. Spatial Hash Radiance Cache (SHaRC): Fast and scalable algorithm to compute light in a given area
  3. Dynamic Diffuse Global Illumination (DDGI): Probe-based solution that delivers multi-bounce indirect lighting without lightmaps or baking.
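To put the Neural Texture Compression claim in perspective, here is the headline arithmetic (a sketch of the quoted best-case 8x figure; real savings depend on content and are unlikely to hit the maximum everywhere):

```python
def vram_after_compression(texture_mb: float, ratio: float = 8.0) -> float:
    """Texture memory footprint after an assumed `ratio`x compression win."""
    return texture_mb / ratio

# A hypothetical 4 GB texture pool at the claimed best-case 8x ratio
print(vram_after_compression(4096))  # 512.0 (MB)
```

If anything close to that holds in shipping games, it would take a lot of pressure off 12GB and 16GB cards.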

Insanity
 

hlm666

Member
So… nobody cares for VRAM anymore?
Well, there is a card in the stack with 32GB if that is your most important feature. If that's a no-go, maybe a second-hand 4090 will carry you with its 24GB.

We are at a point where just throwing VRAM at the problem and cranking texture resolution is not getting you the best-looking image on screen. There are a few features in there that could alleviate some of the VRAM pressure as well: Mega Geometry seems to reduce the BVH footprint, and neural textures could be a big one, but we have to wait and see how that pans out.
 

peish

Member
The FE model looks so classy. The AIB models look like they'd cool better, but the size difference is pretty large and they're nowhere near as sleek.

Looks like 2 coffee stains ..


Sooo, when do we get reviews and tech spec analysis?
 

Bojji

Member
This is all new

RTX Neural Shaders​

Train and deploy neural networks within shaders to unlock new compression and approximation techniques for next-generation asset generation.

RTX Neural Texture Compression​

Use AI to compress textures with up to 8x VRAM improvement at similar visual fidelity to traditional block compression at runtime.

RTX Texture Filtering​

Randomly samples textures after shading and filters difficult volumes, reducing artifacts and improving image quality.

RTX Neural Materials​

Use AI to compress shader code of complex multi-layered materials for up to 8X faster material processing to bring real-time performance to film-quality assets.

RTX Mega Geometry​

Accelerate BVH building for cluster-based geometry systems, enabling up to 100x more ray-traced triangles and better performance in heavily ray-traced scenes.

RTX Global Illumination (RTXGI)​

Scalable solution to compute multi-bounce indirect lighting.
  1. Neural Radiance Cache (NRC): Use AI to predict the amount of light that's emitted from or passing through a specific area
  2. Spatial Hash Radiance Cache (SHaRC): Fast and scalable algorithm to compute light in a given area
  3. Dynamic Diffuse Global Illumination (DDGI): Probe-based solution that delivers multi-bounce indirect lighting without lightmaps or baking.

Insanity

Most of the "RTX something, something" tech is not really used in games. There are only a few games with the RTX storage solution and RTXGI. Most devs use stuff that works on consoles too.
 

Killer8

Member
Pricing way better than I expected. I'm shocked. Most of the DLSS4 improvements are being backported to previous generations so I'm good with my 4070 Super purchase for now. Just a great presentation all round. Nvidia's near monopoly on the GPU space is not undeserved.
 

Gaiff

SBI’s Resident Gaslighter
That's literally what they did with the 3070, which was released for $499 and performed the same as a $1,200 2080 Ti.
It literally wasn’t what they did. Do you sometimes post things that aren’t false?
 
Last edited:

Buggy Loop

Gold Member
Most of the "RTX something, something" tech is not really used in games. There are only a few games with the RTX storage solution and RTXGI. Most devs use stuff that works on consoles too.

Few games with RTXGI? I mean that's pretty much all ray traced/path traced games in the past years? What do you mean?

RTX I/O I can give to you, basically useless, but so is DirectStorage in general. Seems PCs weren't really I/O limited to begin with; even the games that implemented it barely show a difference.

Alan Wake 2 is already going to update for RTX Mega Geometry. "More" is maybe Neural Radiance Cache.

It's potentially a HUGE performance saver. If a dev implemented path tracing in the last few years, they would be dumb not to upgrade.

DHWtlIp.png


All of it is going to UE5 SDK too
 

Bojji

Member
Few games with RTXGI? I mean that's pretty much all ray traced/path traced games in the past years? What do you mean?

RTX I/O I can give to you, basically useless, but so is DirectStorage in general. Seems PCs weren't really I/O limited to begin with; even the games that implemented it barely show a difference.

Alan Wake 2 is already going to update for RTX Mega Geometry. "More" is maybe Neural Radiance Cache.

It's potentially a HUGE performance saver. If a dev implemented path tracing in the last few years, they would be dumb not to upgrade.

DHWtlIp.png


All of it is going to UE5 SDK too


I meant literally RTXGI, the one with Nvidia branding. Most devs will use DirectStorage or one of the multiple other RT GI implementations.

Nvidia RTX tech is cool but, just like GameWorks, it isn't used in many games because consoles = AMD.
 

Gaiff

SBI’s Resident Gaslighter
The 3070 was neck and neck with the 2080 Ti.
Not what I mean. He says they used the same method to compare the 2080 Ti with the 3070 as they did to compare the 5070 vs the 4090. That's false. The 3070 and 2080 Ti are on par in raster, and the charts showed that. The 4090 and 5070 are on par only when the 5070 uses multi-frame generation. Without that, i.e. in an apples-to-apples comparison, the 4090 destroys it.
 
Last edited:

Buggy Loop

Gold Member
I meant literally RTXGI, the one with Nvidia branding. Most devs will use DirectStorage or one of the multiple other RT GI implementations.

Nvidia RTX tech is cool but, just like GameWorks, it isn't used in many games because consoles = AMD.

What are you even talking about

95% of PC ray-traced games with any worthwhile RT have been sponsored by Nvidia. We're not talking about cute RT shadows or a couple of reflections here; GI. GLOBAL ILLUMINATION.

Anything path traced is directly Nvidia. Not even an if.

AMD is not even in the picture, don’t even think of bringing consoles into this talk lol

And it's an SDK. Not "GameWorks".

The game calls DirectX functions. It's compatible with any GPU that speaks DirectX DXR / Vulkan.
 
Last edited: