
AMD: Radeon 7900XTX ($999) & 7900XT ($899) Announced | Available December 13th

GymWolf

Gold Member
Does AMD offer the ability to purchase directly from their website?

Unfortunately I feel like we're all going to be ripped off with GPU prices.
I'm obviously not gonna buy a card that costs 1000 dollars for 1500 euros.

1100 euros is the max I can go and I'm already stretching...

And I need to see a lot of independent, non-cherry-picked benchmarks in 4K, with and without FSR3 and with UE5 engine demos, before even thinking about spending that amount of money on a 7900 XTX.
 

KungFucius

King Snowflake
Everything else checks out, but wait for reviews/independent benchmarks. AMD did some shenanigans with their graphs this time (different CPUs for different GPUs), so just to be sure, better to wait; this isn't cheap impulse-buy territory.

Said all that, even 20% lower raster at 4K vs the 4090 when not CPU constrained would still be a great result and an amazing deal if it's actually $1k at retail. Same thing here: don't compare MSRP vs MSRP but the actual price you can get those cards for online or in your local area, which might be way different from MSRP depending on where you live, whether you're lucky with availability, or whether you buy early. For example, here in Europe you can't even get the worst AIB 4090 models below 2500 euros currently; hell, I don't know if you can get them below 3000 euros.
The problem with GPUs is if you wait, it is harder to get them for weeks. I was able to purchase 3 4090s within the first 8 days then there was next to nothing for over 2 weeks. I cancelled 2, kept the Amazon one while still trying for an FE and now that is delayed and there were no FEs in my region (central MD) this week.

There will be a lot of AMD cards available on the 13th of December and maybe a small drop before or right after Xmas, and then it is going to be slow to start the new year. Last gen the AMD AIBs were fucking obnoxious. The reference cards were MSRP but the AIB non-reference ones were ridiculous. The 6800 XT, which was supposed to be $650, was $800+ for the same cooler designs that were on $700 3080s. AMD lets their AIBs be cunts at launch whereas Nvidia makes all AIBs put out some cards at MSRP. Obviously the MSRP is higher, but it looks better when Gigabyte is charging MSRP for an Eagle Nvidia card than it does when they charge MSRP +20% for an Eagle AMD card. If the AIBs price the XTX at 1200 they will completely throw away the pricing advantage and might cause some to buy the 4080 if it is available.
 

KungFucius

King Snowflake
Does AMD offer the ability to purchase directly from their website?

Unfortunately I feel like we're all going to be ripped off with GPU prices.
They do allow you to. It was a shitshow in 2020 and I think now they have random queues that prolong the failure but reduce the frustration of timeouts.
 
These AMD raytracing videos are technical, but can anybody extrapolate how this can be applied to games using raytracing for RDNA 3?






Tom's Hardware article on AMD's hybrid raytracing patent








Sony Patents their own Raytracing method
 

PeteBull

Member
The problem with GPUs is if you wait, it is harder to get them for weeks. I was able to purchase 3 4090s within the first 8 days then there was next to nothing for over 2 weeks. I cancelled 2, kept the Amazon one while still trying for an FE and now that is delayed and there were no FEs in my region (central MD) this week.

There will be a lot of AMD cards available on the 13th of December and maybe a small drop before or right after Xmas, and then it is going to be slow to start the new year. Last gen the AMD AIBs were fucking obnoxious. The reference cards were MSRP but the AIB non-reference ones were ridiculous. The 6800 XT, which was supposed to be $650, was $800+ for the same cooler designs that were on $700 3080s. AMD lets their AIBs be cunts at launch whereas Nvidia makes all AIBs put out some cards at MSRP. Obviously the MSRP is higher, but it looks better when Gigabyte is charging MSRP for an Eagle Nvidia card than it does when they charge MSRP +20% for an Eagle AMD card. If the AIBs price the XTX at 1200 they will completely throw away the pricing advantage and might cause some to buy the 4080 if it is available.
I said wait for reviews/independent benchmarks though; they usually show up the day before the official launch, so zero worries there. You check your favourite tech site/techtuber (ideally 3+ sources, just in case), then you order it if you like the value proposition.
 

GHG

Member
First of all, the difference here for a clearly better card, in the Canadian market at least, is $350 US, not $2000. There are different numbers to swallow depending on the person. If the 4090 is $2000 and AMD is $1000, that's a big gap, and one might say, well, fuck, that is too much now, double the price... is it really worth it? (To you and me probably not, but to others, sure, why not, they shit money.)


However, it's all a matter of opinion at the end of the day and I respect yours. Even if I go by your analogy: if I like bikes, Crayon, and I ride one every day like I do my video card? Bet your ass I will buy the more expensive, better bike. It's a hobby that I love, and it's something I want to enjoy riding. Same reason some people buy a BMW and others buy a Kia Rio.

If I have the money and I enjoy cars... why buy a slower car with fewer features?

This is all in line and basically exactly what you said: different segments of people. But when it comes to $350? I don't think anyone who is going to be spending $1000 on a card is in a different segment from the guy spending $350 on top (at least in Canada; in the US it's a $600 difference and that's really hard to swallow). But then again, if the AMD card is the same level of performance as a 4080, the exact same performance in raster, I'll still buy the Nvidia card for $200 more and get better ray tracing and DLSS support. That's just me.


Funnily enough, once all is said and done the 4090 is likely to be double the price of the 7900 XTX where I live. Nvidia are treated like Apple here so their products are priced accordingly by local resellers. I've always had to import my GPUs from the US for this reason.

I said wait for reviews/independent benchmarks though; they usually show up the day before the official launch, so zero worries there. You check your favourite tech site/techtuber (ideally 3+ sources, just in case), then you order it if you like the value proposition.

Yep we always get reviews prior to the actual release date (and no pre orders) so I don't see any reason for anyone to worry. Just chill for the next month, we will also get news/reviews (and maybe even a repricing if Nvidia have any sense) for the 4080 16GB in the meantime.
 

Loxus

Member
The 375W limit is a physical limit on the card.
A vBIOS flash won't magically bend the laws of physics.
If you wanted to go above that you shouldn't be buying the reference design anyway.

AIBs already have 3x 8-pin cards in the works.
So those have ~500W power limits.

[Image: ASUS RX 7900 TUF]
Apparently, RDNA 3 can indeed exceed 3GHz if this is true.


My guess is cards from Asus, etc. will have 3 GHz+ clocks without overclocking and would most likely be larger than AMD's reference cards, with adequate cooling.
 

octiny

Banned
Can anyone explain to me why they went from 7nm to 5nm and REDUCED the clock speeds to 2.3 GHz? The 6900 XT used to regularly hit 2.5 GHz.

Especially when Nvidia went from 2.0 GHz to 3.0 GHz going from 8nm to 4nm. I was expecting clocks to hit 3.0 GHz and 100 TFLOPS. 61 is impressive but way below what the rumors were indicating. The total board power is also very conservative. If Nvidia is willing to go to 450-550 watts, why are they locking themselves to slower clocks and just 350 watts?

I really wonder what happened here. Poor performance at higher clocks? A logic failure? Poor thermals? Even at 2.5 GHz they could've beaten the 4090.

The Level1Techs live stream said AMD hinted to them that AIBs are going to be the ones that really push these cards from the factory, as they have actual headroom.

Regardless, it doesn't matter if the vBIOS limit is 355W on the reference model. You'll be able to easily bypass it to 450W+ via Igor's MorePowerTool (MPT), which overrides the PowerPlay limits via the registry regardless of its 2x 8-pin, just like on the 6900 XT. Think of MPT as a temporary vBIOS edit. 2x 8-pin + the PCIe bus can handle it no problem w/ standard-gauge wires. Plenty on Overclock.net have pushed their 6900s to 500W 24/7 w/ no issues on dual 8-pin, myself included when I had one. PEG has said 8-pin cables can safely handle up to 288W. Why they rated them at only 150W had more to do w/ accounting for the below-par PSUs flooding the market at the time, specifically ones w/ multiple 12V rails equating to lackluster amperage & shoddy wiring.
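To put rough numbers on that cabling point, here's a minimal sketch of the power-budget arithmetic; the per-connector figures are the spec rating and the 288W claim quoted above, not anything measured:

```python
# Rough power-budget arithmetic for a dual 8-pin reference card.
# 150W per 8-pin and 75W from the slot are the official ratings;
# 288W per 8-pin is the "can safely handle" claim cited above.

PCIE_SLOT_W = 75
EIGHT_PIN_SPEC_W = 150
EIGHT_PIN_CLAIMED_W = 288
CONNECTORS = 2

spec_budget = PCIE_SLOT_W + CONNECTORS * EIGHT_PIN_SPEC_W          # 375 W
claimed_headroom = PCIE_SLOT_W + CONNECTORS * EIGHT_PIN_CLAIMED_W  # 651 W

print(f"By-the-spec limit: {spec_budget} W")       # matches the 375W reference limit
print(f"Claimed headroom:  {claimed_headroom} W")  # why 450-500W via MPT is argued to be fine
```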

Furthermore, the 6900 XT reference has a 2.15GHz game clock & 2.25GHz boost clock, w/ actual in-game clocks around 2.37GHz. AMD is playing conservative; I'd expect somewhere around 2.6-2.7GHz out of the box on the XTX reference edition in games. Either way it's higher, not lower, than the 6900 XT, as the core clock is now rated at 2.5GHz instead of 2.25GHz. I wouldn't be surprised to see it push past 3.1GHz OC w/ an extra 100W supplied to it. The 6900 XT (XTXH version) & 6950 XTs could hit 2850MHz via MPT w/ 100W more. Also, the 4090's in-game clocks are around 2.7GHz, not 3GHz (only when OC'd); rated speeds are 2.23-2.52GHz.

Saw this in the JayzTwoCents video. Didn't realize they had a 50% IPC gain for ray tracing performance. The RE8 results don't really indicate that level of performance uplift. In the castle, it runs at around 110-120 fps; outdoors, 80 fps. Are they using the outdoor comparisons or indoor?

Definitely need to see what's going on here, but I'm hopeful again. Especially if both Sony and MS use the RDNA3 design for their mid-gen refresh.


It's on par w/ a 3090 Ti in RT, give or take a little. Obviously one can only extrapolate the numbers so much w/ what's been shown so far. The 6950 XT had slightly better than 3070 Ti RT, so if it's 50% better that puts it around 3090 Ti territory, using 3DMark's newest RT benchmark, Speed Way, as a reference (partial to both). AMD has been pretty conservative when it comes to IPC claims since Ryzen & RDNA were introduced, either meeting or sometimes exceeding them. We'll see.
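As a sanity check on that chain of reasoning, here's the arithmetic spelled out; the indices below are invented, normalised placeholders, not actual Speed Way scores:

```python
# Hypothetical relative RT indices, normalised so the 6950 XT = 1.00.
# None of these are real benchmark results; they just illustrate the logic:
# 6950 XT slightly ahead of a 3070 Ti, then a ~50% uplift lands near a 3090 Ti.

rt_index = {
    "RTX 3070 Ti": 0.95,   # placeholder: a bit behind the 6950 XT
    "RX 6950 XT":  1.00,   # baseline
    "RTX 3090 Ti": 1.50,   # placeholder for roughly where it sits in heavy RT
}

projected_7900xtx = rt_index["RX 6950 XT"] * 1.5   # AMD's claimed ~50% RT improvement
print(f"Projected 7900 XTX index: {projected_7900xtx:.2f}")  # ~1.50 -> 3090 Ti territory
```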

I know. I think RDNA3 is comparable only to 8nm RTX3000 series. And we will see it shortly.

Do you mean RT performance relative to the 3090 series? If yes, then I agree. If you are talking rasterization then you're talking out of your ass as the 6950 XT was already damn close to the 3090 Ti.

Also regarding your other posts, if you're going to talk about potential power savings on a 4090 then you need to do the same on the 7900 XTX/XT. You could easily save upwards of 100w undervolting w/ an OC via MPT @ 4K while maintaining a higher in-game boost clock than stock on the 6000 series. Besting the 3000 series w/ ease in efficiency. I don't see that changing w/ RDNA 3.

Funny enough, it also makes you wonder why Nvidia pushed the 4090 to 450W vs 350W if you could simply lower the power limit (not undervolting) and retain most of its performance. My guess is they knew what AMD had & didn't want to lose the rasterization battle the same way the 3090 did to the 6900 XT, so they brute-forced the extra 5%-10%.

Yes, the RX 7900 XT is really not attractive enough for 100 USD less. Wish they would have scrapped that completely and used their production capacity only for the XTX. Now we get less stock, because they also have to build the other card.

Agreed, but it comes down to yields. The defective 7900 XTX dies need to be used somehow instead of artificially disabling more of the die & calling it a 7800 XT w/ lower margins. That is, until yields improve of course. IMO, I think they will drop the XT to $849 before launch once feedback comes in, or quickly after. $100 to upgrade to the XTX is a no-brainer; $150 makes you think a little. With the eventual 7800 XT coming in at $699.

AMD cards are trash both hardware and software wise, don't buy that.



Why would they launch a supposed flagship card in December only to outclass it a few months later?

I mean, there's precedent w/ both companies, but I agree, 3 months later is not happening.

However, my spider sense tells me we'll get a 3D 7950 XT if the 4090 Ti comes mid next year & further 3D refreshes down the product stack this time next year trickling out.

And when I say spider sense, I mean journalists who attended the Q&A/briefing where AMD might or might not have alluded to something of that sort being a possibility (NDA).

Is it easy to remove what they did on the vanilla card? Like, do I need to physically do something on the card, or just unlock something via software?

No, you do not need to shunt mod it or anything. Yes, it'll be easy. I don't see it being any different than RDNA 2 in regards to removing PowerPlay power limits w/ Igor's MPT (supports both RDNA 1 & 2). The 6900 XT/6950 XT allowed unlimited wattage w/ the 6800 XT maxing out at 375w on reference (280w stock). The best part about MPT is how easy it is to reverse anything you do if you're feeling uncomfortable.

Refer to my 2nd paragraph at the very beginning of this post for more info on it.

Using MPT is also the only way to truly undervolt RDNA. If you undervolt while overclocking at the same time via AMD's Adrenalin software, it'll ignore the undervolt & apply the normal voltage curve, even if it shows the correct UV voltage in Adrenalin. You can only do one or the other in Adrenalin. MPT allows you to actually limit the voltage & still OC. For example, I could hit around a 2550MHz in-game clock @ 4K @ just 1025mV via MPT, but only 2375MHz using Adrenalin w/ no OC. So what I & others who know about this do is simply undervolt via MPT (you only have to do it once). You then change the core clocks to whatever you want in Adrenalin, & bam, no more resetting of the voltage curve when OCing while undervolting. You are now consuming 75W less than stock w/ 175MHz higher core clocks.
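For anyone wondering how an undervolt plus a higher clock can still draw less power, the first-order relation is that dynamic power scales roughly with voltage squared times frequency. A quick sketch using the clocks and UV voltage from the post above; the stock voltage here is an assumed figure for illustration, not a measured one:

```python
# First-order dynamic power model: P ~ C * V^2 * f (the capacitance term cancels
# when comparing ratios). 1025 mV / 2550 MHz and 2375 MHz come from the post above;
# the 1175 mV "stock" voltage is an assumption for illustration only.

def rel_dynamic_power(voltage_mv: float, clock_mhz: float) -> float:
    return (voltage_mv ** 2) * clock_mhz

stock = rel_dynamic_power(voltage_mv=1175, clock_mhz=2375)        # assumed stock point
undervolted = rel_dynamic_power(voltage_mv=1025, clock_mhz=2550)  # MPT undervolt + OC

print(f"Undervolted vs stock draw: {undervolted / stock:.2f}x")   # ~0.82x: meaningfully less
# power despite the higher clock, which is the effect described above.
```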
 
Sorry if old news but I just read this. FSR3 might work on older GPUs as well. DLSS3 would probably work on older ones too if Nvidia cared to add it.

https://videocardz.com/newz/amd-con...3-may-be-supported-by-pre-rdna3-architectures

PCWorld was also interested to know more about FSR3, the next-gen upscaling technology from AMD. It was announced during the RDNA3 showcase. AMD confirmed this technology will come out in 2023, but nothing specific on timing or GPU support was said. So another burning question from the community was whether FSR3 will be supported by other architectures than RDNA3. Frank Azor confirmed AMD wants FSR3 to be supported by more than just RDNA3:

[AMD FSR3] is not a reaction or a quick thing [to DLSS3], it is absolutely something we have been working on for a while. Why is it taking a little bit longer for it to come out than you'd probably hoped for? The key thing to remember about FSR is the FSR philosophy: FSR until now did not just work on RDNA2 or RDNA1, it works on other generations of AMD graphics cards. They also work on competitors' graphics cards. It is exponentially harder than if we just made it work on RDNA3. […] We really do want to work on more than just RDNA3.
— Frank Azor to PCWorld
Azor goes on to say that supporting one architecture is easy (hinting at DLSS3 on RTX 40). But AMD made a promise with FSR technology to support many architectures, and this takes time. The company cannot yet confirm 100% that it will be able to fulfil the promise to support more architectures with FSR 3, but they are 'trying to'.
 
I've never used DLSS firsthand and I only have one game with FSR 2.1, but I must say the image in that game at 1440p Quality is great.

Yeah. With the starting price on the 7900XTX it seems like the 7700 line will still be priced reasonably. That might be the one that makes me jump to team red.
 

SolidQ

Member
RT will be improved with drivers, maybe up to 15%. More interesting: the diagram shows up to 1.8x at 2.5GHz; I wonder how much performance it will give at 2.7/2.8 GHz.
 
I don't think AMD confirmed they're using new hardware to accelerate their RT pipeline. I'm guessing it's still a similar approach to what they did with RDNA 2 and repurposed TMUs, and a lot of the gains in RT are coming from brute force in shaders and such.
 

//DEVIL//

Member
Funnily enough, once all is said and done the 4090 is likely to be double the price of the 7900 XTX where I live. Nvidia are treated like Apple here so their products are priced accordingly by local resellers. I've always had to import my GPUs from the US for this reason.
Exactly why I said in Canada. Every country is different. I was comparing having to import the reference AMD card, because it's not available in Canada, vs the FE, which is something of a unicorn but you can still buy it in Canada.

My brother lives in Dubai, where the 4090 is $2500 US for a generic card. That is way too much, so he imports it from Canada or the US as well.
 

hlm666

Member
RT will be improved with drivers, maybe up to 15%. More interesting: the diagram shows up to 1.8x at 2.5GHz; I wonder how much performance it will give at 2.7/2.8 GHz.
Not sure where you're getting 15% more RT performance from driver improvements, but sure, okay, why not. So we may as well remember Nvidia getting up to 30% more RT performance once stuff starts coming with SER support as well.
 
Not sure where you're getting 15% more RT performance from driver improvements, but sure, okay, why not. So we may as well remember Nvidia getting up to 30% more RT performance once stuff starts coming with SER support as well.
I'm curious how much developer time will be needed to support SER.

Kudos to Nvidia because it's a cool feature; they also pioneered mesh shaders, which are a game changer, but developers have been pretty lazy about implementing them.
 

FireFly

Member
How so many people will be disappointed... didn't AMD say the performance is UP TO 1.7x? It means it can be 1.7x on Ninja Turtles while 1.2x on Cyberpunk. Why is everyone assuming all the games now are 1.5x to 1.7x performance?
1.7x was for Cyberpunk in rasterization. Other titles were 1.5x.

https://www.anandtech.com/Gallery/Album/8202#31

Obviously different benchmarks will give different results, so it's dubious to apply AMD's quoted increases to random 4090 reviews.
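To make the caveat concrete, this is the kind of back-of-envelope projection people are doing; every fps value here is an invented placeholder, not a benchmark result:

```python
# Why applying AMD's multipliers to other outlets' numbers is shaky.
# All fps values are invented placeholders for illustration only.

outlet_a_6950xt_fps = 60.0            # hypothetical 4K result from outlet A
amd_claimed_uplift = 1.5              # "up to 1.5x" (1.7x was Cyberpunk specifically)
projected_7900xtx_fps = outlet_a_6950xt_fps * amd_claimed_uplift   # 90 fps

outlet_b_4090_fps = 105.0             # hypothetical 4K result from outlet B

# The comparison below only means something if both numbers came from the same
# scene, settings and CPU, which is exactly the problem being pointed out.
print(f"Projected 7900 XTX: {projected_7900xtx_fps:.0f} fps vs 4090: {outlet_b_4090_fps:.0f} fps")
```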
 

LiquidMetal14

hide your water-based mammals
I'm honestly more eager to get to the release and reviews just so the hyperbole and misinformation cease and we can see what we can do with overclocking and stuff like thermals.

If the extra 13 TFLOPS OC on my 4090 is anything to go by, then it will be fun to see how the XTX will do.
 

Panajev2001a

GAF's Pleasant Genius
Having relatively close to RTX4090 performance overall at 100W less which may avoid the need to change the power supply, $600 less, a much smaller card which means less of a need to change your case (reducing cost and being less of a pain), etc… these cards seem quite well positioned.
 

LiquidMetal14

hide your water-based mammals
Having relatively close to RTX4090 performance overall at 100W less which may avoid the need to change the power supply, $600 less, a much smaller card which means less of a need to change your case (reducing cost and being less of a pain), etc… these cards seem quite well positioned.
Keep in mind, the more you inform yourself about how Nvidia overspec'd the cooler, the more you can see that the wattage isn't exactly what it seems.

Granted, you can OC and add more power and hit the 100 TFLOPS that wccftech did, but it's the user's choice and the silicon lottery as to what you can get out of each.

My card is OC'd: I add another 190MHz to the core and 1500MHz to the mem and it runs around 450W average.

I can run it at 100W less while sacrificing performance, but that's not why I bought this card.
 

twilo99

Member
Sure, and be tri-dash all day every day. Been with AMD since the ATI days, so unlike you I have history.

I’m aware they’ve had driver issues in the past, and they may have some in the future, but as of today, their drivers are rock solid, at least on my system running a 6800xt.
 

twilo99

Member
Seeing how hard they are pushing FSR, does that mean we'll start to see more FSR 2.1 games on consoles? They would benefit a lot from it.
 

winjer

Gold Member
Seeing how hard they are pushing FSR, does that mean we'll start to see more FSR 2.1 games on consoles? They would benefit a lot from it.

They already made the software open source. If devs don't use it it's because they don't want to.
UE4 has TAAU and UE5 has TSR.
But most game engines have nothing to upscale on consoles. So FSR 2.2 would be great to get some extra performance out of consoles.

There are already several games that have FSR 2.x on PC, but the console versions have nothing. Really strange that some devs have done all the work but fail to take that one extra step.
I think only Scorn is using FSR 2.0 on consoles at this moment. Sadly.
 

Haint

Member
Apparently, RDNA 3 can indeed exceed 3GHz if this is true.


My guess is cards from Asus, etc. will have 3 GHz+ clocks without overclocking and would most likely be larger than AMD's reference cards, with adequate cooling.

Why in the world would AMD sit on 1Ghz of overhead when such clocks would (supposedly) put them well ahead of 4090 raster across the board without breaking a sweat? Even if it was just AIB models, this is the kind of thing they would have put banners on rooftops for. If it was only binned chips, they would have added another product tier to the stack using the best chips at like $1400+. Some of you guys live in Candyland. In reality, it's probably a limitation of the chiplet design.
 

Loxus

Member
Why in the world would AMD sit on 1Ghz of overhead when such clocks would (supposedly) put them well ahead of 4090 raster across the board without breaking a sweat? Even if it was just AIB models, this is the kind of thing they would have put banners on rooftops for. If it was only binned chips, they would have added another product tier to the stack using the best chips at like $1400+. Some of you guys live in Candyland. In reality, it's probably a limitation of the chiplet design.
Take it up with AMD, they're the ones that supposedly said RDNA 3 exceeds 3GHz.


Who knows, maybe they have a 7950XT that exceeds that 3GHz mentioned.
 

Crayon

Member


RT performance seems fine, especially when drivers will add like 10-15% more performance.


That actually looks in line with the price. Not mad at that. I'm skeptical of these projections tho. They should be in the ballpark but I don't know.

Where are our midrange cards from previous generations? Even with inflation, these $899, $999 and $1599 (Nvidia) cards are out of reach for the vast majority of buyers. I want to see a 4060 or 7880xtx at 40 TF for $399 or $499.

Hopefully the prices will be good on the mid range. Ampere and RDNA2 are going to be out there for a while though, and the value of those has to be considered. I think there are 6700 XTs for the mid $300s and 3060 Tis for the mid $400s right now. So the various lower-tier cards coming are going to have to play against those. When the 7600 XT comes out, it could be up against a $300 6700 XT, so what then?

I'm thinking about this stuff too much lately lol. I should take a break till the release.


Apparently, RDNA 3 can indeed exceed 3GHz if this is true.


My guess is cards from Asus, etc. will have 3 GHz+ clocks without overclocking and would most likely be larger than AMD's reference cards, with adequate cooling.

This is such a big X factor. If AMD really did leave that much on the table for the reference card, that is really interesting. I think it's smart to bring out a very nice chip in a sensible package. $999 is still crazy expensive, but 80% of a 4090 is crazy too. And critically, it may still have a healthy margin.

AMD already said with the product, if not with words, "we're not going there" in terms of the price/power/size of the 4090.

But if AIBs *can* go there with it? If AIBs can really push closer to 3GHz then they can actually make their $1,300, 500W, shoebox-sized versions and have something to show for it. If AMD does fine at $999, they would be happy as pigs in shit to sell for $100+ more. That would allow AIBs to compete with each other snapping at the heels of the 4090, because they would have room to meaningfully wring out the chip.

So if it really goes up to 3GHz, I think that could actually turn out to be pretty smart.

Having relatively close to RTX4090 performance overall at 100W less which may avoid the need to change the power supply, $600 less, a much smaller card which means less of a need to change your case (reducing cost and being less of a pain), etc… these cards seem quite well positioned.

The idea that Nvidia wins because AMD didn't come out the gate with a $1,600 card is weird. The 4090 is extraordinary and I think Nvidia has every right to charge a hefty premium for it, but it's overboard in several ways. The 7900 XTX is coming out against the 4080 and it looks really good there.
 

CrustyBritches

Gold Member
What does Nvidia offer at the $999 price point? I know they were planning on selling the “4080” 12GB at $899. Wonder how that would have matched up against the 7900XT? Probably not well and that’s why they really pulled it.
 

rnlval

Member
I don't think AMD confirmed they're using new hardware to accelerate their RT pipeline. I'm guessing it's still a similar approach to what they did with RDNA 2 and repurposed TMUs, and a lot of the gains in RT are coming from brute force in shaders and such.
AMD has confirmed RDNA 3 has new ray traversal and box-checking features.

Be aware RX 7900 XTX has about 61 TFLOPS while RTX 4090 FE has about 82 TFLOPS.
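For anyone wondering where those headline numbers come from, peak FP32 is just shader lanes × 2 ops per clock (FMA) × clock; a quick sketch with the publicly listed specs (note that RDNA 3's dual-issue doubles the lane count on paper, which is one reason raw TFLOPS don't map directly onto game performance):

```python
# Peak FP32 throughput: lanes * 2 ops/clock (fused multiply-add) * clock in GHz.
def peak_tflops(fp32_lanes: int, boost_ghz: float) -> float:
    return fp32_lanes * 2 * boost_ghz / 1000

# RX 7900 XTX: 6144 ALUs counted twice thanks to RDNA 3 dual-issue, ~2.5 GHz boost
print(f"7900 XTX: {peak_tflops(12288, 2.50):.1f} TFLOPS")   # ~61.4
# RTX 4090: 16384 CUDA cores, ~2.52 GHz boost
print(f"RTX 4090: {peak_tflops(16384, 2.52):.1f} TFLOPS")   # ~82.6
```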
 
I desperately want AMD to be competitive again. I'm tired of PC gaming being essentially held hostage because only one chip manufacturer is any good yet charges extortionate prices.

Give it time, AMD and Intel are getting there. I know this is a cliche thing to say, but RDNA 3 has made significant progress and it is nothing to laugh at. ARC is not that bad, and it handles raytracing pretty well and their version of DLSS is even better than FSR 2.0 (don't know about 2.1 or 2.2)

If Intel and AMD can offload raytracing work to their next-gen CPU products, you can dismantle and negate any proprietary technology NVIDIA has to offer. There are many ways to approach raytracing (or even upscaling for that matter), and NVIDIA's way won't last long in my honest opinion.
 

thuGG_pl

Member
So if the predicted performance holds true, then the XTX is probably my next card.
Honestly I feel there aren't good high-end options at this point; Nvidia went batshit crazy with prices and I won't play along.
If AMD turns out to be disappointing then I have no clue what to do, maybe buy a cheap used 3080/3090 and wait for better options in the future.

Also I find it funny that a $1000 card is being celebrated, imagine that a generation ago :)
 
But if AIBs *can* go there with it? If AIBs can really push closer to 3GHz then they can actually make their $1,300, 500W, shoebox-sized versions and have something to show for it. If AMD does fine at $999, they would be happy as pigs in shit to sell for $100+ more. That would allow AIBs to compete with each other snapping at the heels of the 4090, because they would have room to meaningfully wring out the chip.

Absolutely. Plus, it lets the silicon lottery play out.

AIBs can use chips that just meet the specs in non-OC models, hopefully paired with smaller coolers and card sizes, while the really good chips that have a lot of headroom can be used for the OC models.
 
Why in the world would AMD sit on 1Ghz of overhead when such clocks would (supposedly) put them well ahead of 4090 raster across the board without breaking a sweat? Even if it was just AIB models, this is the kind of thing they would have put banners on rooftops for. If it was only binned chips, they would have added another product tier to the stack using the best chips at like $1400+. Some of you guys live in Candyland. In reality, it's probably a limitation of the chiplet design.
I think the strategy is clear. They want to be way cheaper while being pretty close to the 4090. This way they will not only compete with the 4090 with better features like DisplayPort 2.1, but will also combat the 4080 and 4070 with a price of $999. They will destroy the 4080 and 4070 while being close to the 4090 for only $999. And they want the wattage advantage. They know how ridiculous the 4090 is in size, which needs a new case and PSU for the owner. This way they will be way cheaper, and their third-party card makers will take care of the more expensive cards that will be more powerful than a 4090 with clocks around 3 GHz. Actually a great strategy: with one card they covered three other Nvidia cards.
 

Ironbunny

Member
Yeah, it's always possible that they will be squirreling away the best chips here and saving them for a future variant, maybe with 3D cache or something like that.

Perhaps something like that. I believe there were rumours of AMD releasing the Ryzen 3D chips at CES, so a 7950XT (3D?) could be a nice addition to that Lisa Su keynote. Would coincide nicely with the new monitors.
 
Why in the world would AMD sit on 1Ghz of overhead when such clocks would (supposedly) put them well ahead of 4090 raster across the board without breaking a sweat? Even if it was just AIB models, this is the kind of thing they would have put banners on rooftops for. If it was only binned chips, they would have added another product tier to the stack using the best chips at like $1400+. Some of you guys live in Candyland. In reality, it's probably a limitation of the chiplet design.
The answer is power.
The V/f curve probably isn't where they want it, and they want to stick aggressively to their Power Target - at least with reference boards.

In spite of being about 500MHz shy of the target, it still gets spooky close to the 4090 in terms of rasterisation, at 95W less power, and $600 less.
Absolutely smokes the 4080 in terms of price and performance too.
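Putting rough numbers on that value argument (the relative-performance figure is an assumed placeholder, not a measured result; power and price are the board-power and MSRP figures being discussed):

```python
# Back-of-envelope perf-per-watt and perf-per-dollar comparison.
# The 0.85x relative raster figure is an assumption for illustration only;
# 450W/355W and $1599/$999 are the advertised board power and MSRP numbers.

cards = {
    #               rel. perf, board W, MSRP $
    "RTX 4090":    (1.00, 450, 1599),
    "RX 7900 XTX": (0.85, 355, 999),
}

for name, (perf, watts, price) in cards.items():
    print(f"{name}: perf/W = {perf / watts:.4f}, perf/$ = {perf / price:.5f}")
# On these assumed numbers the XTX comes out ~8% ahead on perf/W and ~36% ahead on perf/$.
```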
 

Haint

Member
I think the strategy is clear. They want to be way cheaper while being pretty close to the 4090. This way they will not only compete with the 4090 with better features like DisplayPort 2.1, but will also combat the 4080 and 4070 with a price of $999. They will destroy the 4080 and 4070 while being close to the 4090 for only $999. And they want the wattage advantage. They know how ridiculous the 4090 is in size, which needs a new case and PSU for the owner. This way they will be way cheaper, and their third-party card makers will take care of the more expensive cards that will be more powerful than a 4090 with clocks around 3 GHz. Actually a great strategy: with one card they covered three other Nvidia cards.
That was my point: if such clocks were possible and such AIBs existed, they would have shouted it from the rooftops at their event. Or failing that, AIBs would have taken up the baton and would be doing so right now. What you guys are proposing makes no logical sense, letting people buy 4090s for a month while they sit on secret 3GHz cards.

The answer is power.
The V/f curve probably isn't where they want it, and they want to stick aggressively to their Power Target - at least with reference boards.

In spite of being about 500MHz shy of the target, it still gets spooky close to the 4090 in terms of rasterisation, at 95W less power, and $600 less.
Absolutely smokes the 4080 in terms of price and performance too.

AIBs have already revealed triple 8-pin 500+W cards with no mention of these monster 4090-beating clocks.
 
I just realized how small the 7900 XT is. I could just about cram that into my little ITX case. It might actually fit; I have just a little space beyond the dual slot, but not much, so it would be tight. A 7800 XT might fit though. I had figured there would never be any more cards that far up the line that would fit.
 
That was my point: if such clocks were possible and such AIBs existed, they would have shouted it from the rooftops at their event. Or failing that, AIBs would have taken up the baton and would be doing so right now. What you guys are proposing makes no logical sense, letting people buy 4090s for a month while they sit on secret 3GHz cards.



AIBs have already revealed triple 8-pin 500+W cards with no mention of these monster 4090-beating clocks.
Which means that the V/f curve isn't where they want it to be. They're doing an awful lot of new shit all at once: MCM, a new node, a completely rearchitected shader design and the associated physical design, so things probably didn't go entirely as planned.
As I've said though, they are still getting spookily close to the 4090 in terms of rasterisation with a significantly smaller GPU, with lower than expected clocks. They realised it's not as fast as they'd originally designed, so they've priced accordingly, which is perfectly fine.

I would hazard a guess that they will probably respin it with some optimisations like they did with RV570 back in the day, or like how Nvidia did with Fermi between the 480 and 580. And then release that for the 7950XT and XTX.
Navi 33 is next up, but it's monolithic, on a node they have a lot of manufacturing experience with, so the physical design should, in theory, not be too bad. And given how small and lightweight it should be, it should be able to fly in terms of clockspeed.
Navi 32 is very far out at this stage, so I imagine a lot of lessons learned from N31 will be taken back into it before it tapes out and launches.

This is all just guesswork, but based on some reasonable assumptions and information that is partially out there already.
 