MicroCenter lists RDNA 4 RX 9070 series: RX 9070 XT starting at $699, RX 9070 at $649


Draugoth

Gold Member
According to the listing, the ASRock Challenger RX 9070 would cost $649, while the RX 9070 XT Steel Legend would retail at $699. We don't know if these are final prices, but generally speaking, Challenger-series cards are MSRP models, so those figures are likely the closest to actual MSRP.



MicroCenter RX 9070 pricing (placeholder/preliminary):

  • ASUS AMD Radeon RX 9070 XT Prime Overclocked 16GB: $1,049.99
  • PowerColor AMD Radeon RX 9070 XT Reaper Triple Fan 16GB: $1,100.00
  • Gigabyte AMD Radeon RX 9070 XT Gaming Overclocked Triple Fan 16GB: $899.99
  • ASRock AMD Radeon RX 9070 XT SL Triple Fan 16GB: $699.99
  • Sapphire Technology AMD Radeon RX 9070 XT PULSE Triple Fan 16GB: $899.99
  • XFX AMD Radeon RX 9070 XT Swift Triple Fan 16GB: $729.99
  • PowerColor AMD Radeon RX 9070 Reaper Triple Fan 16GB: $1,099.99
  • ASRock AMD Radeon RX 9070 CL Triple Fan 16GB: $649.99
 
If the final MSRPs are anywhere close to the prices in the OP, then Radeon Group:

Paul Bearer Dance GIF by Jason Clarke
 
I really wish AMD were a real competitor. They simply aren't, and that's why Nvidia can keep being assholes and holding a near-monopoly on the PC GPU market. It's tiring.
 
They never fucking learn.

Looks like they're about to squander a huge chance to actually gain market share.
Yep, they'll try to take short-sighted initial profits and then drop prices by something like $200 once Nvidia stocks its lineup properly in a few months (minus the 5090).

It's truly fucking dumb, since at these prices a lot of people will just wait for Nvidia.
 
If the definition of insanity is doing the same thing over and over and expecting a different result, then Jensen Huang is the craziest motherfucker in the cosmos of unlimited multiverses.
 
Agreed. $600 for the XT is where they need to be to have any meaningful impact.
Either they know the performance is good compared to the 5070 Ti, so they're pricing it closer, or they just don't care enough.

I'm voting for the latter: I don't think this thing can compete with the newer cards (or even the older ones, now that DLSS 4 is out).
 
The price will only make sense once we see how it performs against the 5070 Ti.
On second thoughts, it's still likely overpriced. They might be trying to undercut Nvidia by $50 or so, but honestly that's not enough: it probably trades blows with the 5070 Ti in rasterisation, but I doubt it comes close in ray tracing, and I also doubt FSR 4 will be anywhere close to DLSS 4.
 
Either they know the performance is good compared to the 5070 Ti, so they're pricing it closer, or they just don't care enough.

I'm voting for the latter: I don't think this thing can compete with the newer cards (or even the older ones, now that DLSS 4 is out).
That's the thing. Even if raster performance is slightly better than a 5070 Ti's, people will still go with Nvidia if the price is anywhere close, because of path/ray tracing, DLSS, and maybe AI.

They NEED to undercut them by at least 20% to be even considered by most gamers.
 
AMD are going to fuck up this opportunity of a lifetime they've been presented with, aren't they?
 
To be honest, I don't get why people are obsessed with market share for GPUs. With consoles they can sell at a loss and make the money back on software, but with GPUs? They only get paid once...

People want AMD to sell those GPUs with super-thin margins. For what purpose, exactly?
 
PC gaming is in the shitter because both of them know that profits, not market penetration, are what impress shareholders nowadays.

Sony knows this too, hence the $699 underpowered console. And so does MS, hence them literally leaving the console hardware business. Everyone just cares about the fucking profits now. The era of subsidized consoles and cheap PC GPUs is over.
 
To be honest, I don't get why people are obsessed with market share for GPUs. With consoles they can sell at a loss and make the money back on software, but with GPUs? They only get paid once...

People want AMD to sell those GPUs with super-thin margins. For what purpose, exactly?

That's the thing. We don't know the real dollars behind the BOM, let alone the whole process. All we know is what it would take to get our credit cards out.

We're getting them out for $900 5070 Tis, though...
 
To be honest, I don't get why people are obsessed with market share for GPUs. With consoles they can sell at a loss and make the money back on software, but with GPUs? They only get paid once...

People want AMD to sell those GPUs with super-thin margins. For what purpose, exactly?

They can't afford a price war with Nvidia. I don't know why people don't get it. They take good-enough margins and don't start a high-risk pricing piss fight with one of the biggest companies in the world; that almost killed ATI when they tried it. They have their money from consoles. Even in its most competitive form in many gens, RDNA 2 didn't make a dent in market share.

Even if the 9070 XT had been priced aggressively against Nvidia, Nvidia would just drop prices... and the vast majority would still buy Nvidia while saying "finally, competition!". The echo chambers of Reddit, where everyone is on AMD and shits on Nvidia, really don't reflect real-life market share.

An "underdog Disney sports movie" scenario where the team wins in the end is not what would happen here. Both companies love their margins. With fab capacity limited, AMD would also rather sell you a high-margin card in small quantities, while you think you're saving the GPU market with "team red", and spend the rest of its allocation mass-producing AI chips. They're not so different.

Intel has a better chance of breaking into the mid-range market in a couple of iterations, I think, especially if their 18A foundry works out and lets them avoid US tariffs.
 
These will sell out. I used to work at SCAN, and retailers have barely-existent margins on graphics cards; my staff discount didn't even apply because the margin was so poor. No chance you won't see them chancing it.

Aren't 5070s going for a grand?
 
Well, even though you can't realistically buy a 5070 Ti for under $1,000, some in here will still think this is a bad deal.

If there's real demand for these, then I have some bad news for ya: AMD "MSRP" is just as flexible with AIBs, if not more so. I 'member RDNA 2. Chasing a unicorn reference card that AMD didn't even want to manufacture in the first place (they tried to stop making it, and gamers were furious) was a real shitshow. Don't worry, there'll be plenty of PowerColor Dragon Demon OC or some gaming shit like that slapped on the name, and you'll pay premium prices.

The Asus TUF 6800 XT, with the SAME cooler as the TUF 3080, was more expensive on the AMD side during that period, even though the MSRP was technically $50 lower. Figure that out.
 
Scarcity due to AI and slim margins are two different things. I have a hard time believing there are slim margins at an average retail price of $930 for the 5070 Ti and $895 for the 9070 XT. This is scalping.
 
Fuck this. If this is true, I'm waiting a few months until the 5070 Ti gets more common at MSRP, or I'll just settle for a 5070 FE.
It's not likely this will surpass the 5070's ray tracing anyway.
 
Remember folks, if a +X% improvement in performance is paired with a +X% increase in price, there is no generational uptick.

That is stagnation!

Do not let marketing fool you.
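
A quick Python sketch of that point, using made-up numbers purely for illustration (nothing here is a real benchmark or price): performance per dollar only improves when the performance gain outpaces the price increase.

  old_perf, old_price = 100, 500   # hypothetical baseline card
  new_perf, new_price = 120, 600   # +20% performance, +20% price

  print(old_perf / old_price)      # 0.2 performance points per dollar
  print(new_perf / new_price)      # 0.2 again -- identical perf/$, i.e. stagnation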

 
"most gamers by cards under $700"

That's nice, but most by cards under $500, too. I'm not saying this should be under $500. I'm saying I don't accept these prices as the "affordable mid-range."
 
It's been over before, but it's never been this over. The only saving grace is that the majority of games you need a card like this to run are not fun to play.

Going back to my ULTRAKILL replay after the big engine update. Good luck with the puddles, fellas.
 
If these prices are accurate, I'm simply going to hold onto my current card (a 3080) for another gen. Despite everyone saying that 10 GB isn't enough, I've only run into problems with one game (Indiana Jones), and lowering one setting cleared that up. I don't play at 4K, so I guess that helps.
 
If these prices are accurate, I'm simply going to hold onto my current card (a 3080) for another gen. Despite everyone saying that 10 GB isn't enough, I've only run into problems with one game (Indiana Jones), and lowering one setting cleared that up. I don't play at 4K, so I guess that helps.
Yep, that helps a lot. I play at 4K so I have to do more workarounds, but so far there has always been a way to play new games while still looking fantastic; it's just a matter of making sure certain VRAM-heavy effects are toned down.
 