AMD Radeon 9070XT review thread

winjer

Gold Member


I haven't seen a color fight since the ATI Mach series in the mid-'90s, FFS


I think the reason some people see that difference is that AMD always sets RGB to Full, while Nvidia sometimes sets it to Limited.
So it's recommended that people with Nvidia check in the control panel whether it's set to Full or Limited.

There might be another reason: the AMD control panel allows an sRGB clamp, while Nvidia's doesn't. People would need to use the novideo_srgb app.
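For anyone unsure what the Full-vs-Limited mismatch actually does to the picture, here's a minimal sketch (illustrative of standard video levels, not any driver's actual code):

```python
# "Full" RGB uses the whole 8-bit range (0-255); "Limited" uses 16-235.
# If the GPU outputs Limited but the display expects Full, the image
# gets the classic washed-out look.

def full_to_limited(v: int) -> int:
    """Map a full-range 8-bit value (0-255) into limited range (16-235)."""
    return round(16 + v * (235 - 16) / 255)

print(full_to_limited(0))    # 16: black is no longer true black
print(full_to_limited(255))  # 235: white is no longer peak white
```

So a mismatch lifts blacks and dulls whites, which is exactly the "worse colors" people report.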
 
Last edited:

MikeM

Member
I think the reason some people see that difference is that AMD always sets RGB to Full, while Nvidia sometimes sets it to Limited.
So it's recommended that people with Nvidia check in the control panel whether it's set to Full or Limited.

There might be another reason: the AMD control panel allows an sRGB clamp, while Nvidia's doesn't. People would need to use the novideo_srgb app.
My recent experience on my LG C1 showed better colors with AMD than Nvidia (5070 Ti). Yes, I enabled 10-bit, RGB, Enhanced, etc., and it still didn't touch what AMD GPUs offer out of the box. I'm wondering if it's a compression thing on Nvidia cards.

And apparently if you tell the Nvidia app to stop optimizing your games (i.e. messing with your settings) it will disable HDR.
 

AGRacing

Member
My recent experience on my LG C1 showed better colors with AMD than Nvidia (5070 Ti). Yes, I enabled 10-bit, RGB, Enhanced, etc., and it still didn't touch what AMD GPUs offer out of the box. I'm wondering if it's a compression thing on Nvidia cards.

And apparently if you tell the Nvidia app to stop optimizing your games (i.e. messing with your settings) it will disable HDR.
After enabling 10-bit color, turn off the Windows setting "Automatically manage color for apps" under System > Display > Color management. It will already be off in 8-bit and will only activate after switching to 10-bit first.
 

Bojji

Member
My recent experience on my LG C1 showed better colors with AMD than Nvidia (5070 Ti). Yes, I enabled 10-bit, RGB, Enhanced, etc., and it still didn't touch what AMD GPUs offer out of the box. I'm wondering if it's a compression thing on Nvidia cards.

And apparently if you tell the Nvidia app to stop optimizing your games (i.e. messing with your settings) it will disable HDR.

This is not true.

My experience with AMD and Nvidia is that colors are the same with the same settings.
 

MikeM

Member
After enabling 10-bit color, turn off the Windows setting "Automatically manage color for apps" under System > Display > Color management. It will already be off in 8-bit and will only activate after switching to 10-bit first.
What does it do?
This is not true.

My experience with AMD and Nvidia is that colors are the same with the same settings.
My eyes didn’t lie to me. What HDR panel are you using to compare?
 

MikeM

Member
LG B2. With these settings.

In 2023/24 I had a 5700 XT, 3060 Ti, 6800, and 3080 Ti.

The image looked the same on all of them, just like it looks the same from PS5 and XSX.
My picture on PC is better than on my Pro, and I used the same settings. For example, Warzone looks noticeably flatter on the Pro than on my PC.

I'll verify tonight, but my TV settings shouldn't change the picture between different GPUs.
 

Wolzard

Member

Bojji

Member
My picture on PC is better than on my Pro, and I used the same settings. For example, Warzone looks noticeably flatter on the Pro than on my PC.

I'll verify tonight, but my TV settings shouldn't change the picture between different GPUs.

Consoles use basic standard output settings, just like GPUs. The same is true for BD players and other devices.

PS5 outputs 12-bit/RGB 4:4:4/Full range on default settings; Xbox has more options, but it's generally the same stuff over HDMI 2.1.

The difference is that the TV will treat a PC and a console differently: with a PS5 you will get 4:2:2 no matter what the console is outputting, while with a PC the TV shows the full 4:4:4 picture.

You can of course set the TV to show full 4:4:4 with consoles by:

- changing the input name to "PC" for the console (old firmware)
- changing this option in the HDMI settings (newer firmware; I'm not sure the C1 got this):

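For anyone wondering what 4:4:4 vs 4:2:2 actually means, here's a rough sketch (illustrative values only): luma is kept per pixel, but the chroma channels are stored once per horizontal pixel pair.

```python
# 4:2:2 chroma subsampling: each horizontal pair of chroma (Cb/Cr)
# samples is averaged into one, halving horizontal chroma resolution.
# Luma (Y) is untouched, which is why brightness detail survives.

def subsample_422(chroma_row):
    """Average each horizontal pair of chroma samples (4:4:4 -> 4:2:2)."""
    return [(chroma_row[i] + chroma_row[i + 1]) / 2
            for i in range(0, len(chroma_row), 2)]

row = [100, 200, 50, 50]   # 4 chroma samples (4:4:4)
print(subsample_422(row))  # [150.0, 50.0] -> 2 samples (4:2:2)
```

Note how the sharp 100/200 chroma edge collapses into a single 150: that's why small colored text is where 4:2:2 is most visible.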
 

MikeM

Member
It washes out the picture.
Ah shit. I didn’t know I had to do that too. Guess I missed that in the manual. Never had to do any of that on AMD cards.
Consoles use basic standard output settings, just like GPUs. The same is true for BD players and other devices.

PS5 outputs 12-bit/RGB 4:4:4/Full range on default settings; Xbox has more options, but it's generally the same stuff over HDMI 2.1.

The difference is that the TV will treat a PC and a console differently: with a PS5 you will get 4:2:2 no matter what the console is outputting, while with a PC the TV shows the full 4:4:4 picture.

You can of course set the TV to show full 4:4:4 with consoles by:

- changing the input name to "PC" for the console (old firmware)
- changing this option in the HDMI settings (newer firmware; I'm not sure the C1 got this):
Yeah, both my PC and consoles are all set to PC mode. Isn't the PS5 limited to 4:2:2 because of the 32 Gbps HDMI out? When I play COD in the 120 fps mode the TV shows 4:2:2, but then it swaps back to "RGB 12B 4L8 HDR10" in 60 fps modes.
 

Bojji

Member
Ah shit. I didn’t know I had to do that too. Guess I missed that in the manual. Never had to do any of that on AMD cards.

Yeah, both my PC and consoles are all set to PC mode. Isn't the PS5 limited to 4:2:2 because of the 32 Gbps HDMI out? When I play COD in the 120 fps mode the TV shows 4:2:2, but then it swaps back to "RGB 12B 4L8 HDR10" in 60 fps modes.

Yeah, it's 4:2:2 in 120 Hz modes on PS5.

For COD, that could explain the differences you are seeing on PC vs. PS5. But the difference in chroma outside of small text is not really that big; 4:2:2 is good enough for most stuff (most movies are even 4:2:0).
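A back-of-envelope check shows why the PS5's ~32 Gbps link can't carry 4K120 at full chroma. This counts active pixels only; real HDMI links also carry blanking and encoding overhead, so actual requirements are higher:

```python
# Rough data-rate estimate for a video mode: pixels/s * bits per pixel.
# 10-bit RGB/4:4:4 = 30 bits per pixel; 10-bit 4:2:2 = 20 bits per pixel.

def data_rate_gbps(width, height, hz, bits_per_pixel):
    return width * height * hz * bits_per_pixel / 1e9

print(data_rate_gbps(3840, 2160, 120, 30))  # ~29.9: over 32 Gbps once overhead is added
print(data_rate_gbps(3840, 2160, 120, 20))  # ~19.9: 4:2:2 fits comfortably
```

So at 4K120 the console drops to 4:2:2 to stay within the link budget, while at 60 Hz full RGB fits fine.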
 

M1987

Member
Anyone else grab the latest Optiscaler? In the setup .bat file there is now a third question asking if you are going to use DLSS inputs. Not sure what it does exactly, but in Outlaws I said yes. Outlaws saw my 9070 XT as a 4090, I used DLSS, and it worked fantastic.
Cyberpunk also sees my 9070XT as a 4090 in benchmarks
 
Last edited:
I think the reason some people see that difference is that AMD always sets RGB to Full, while Nvidia sometimes sets it to Limited.
So it's recommended that people with Nvidia check in the control panel whether it's set to Full or Limited.

There might be another reason: the AMD control panel allows an sRGB clamp, while Nvidia's doesn't. People would need to use the novideo_srgb app.
You are correct. I saw these problems on my Nvidia GPUs as well. Sometimes the full RGB setting in the NV control panel changes to Limited (but only when I use an HDMI connection), and I also use the novideo_srgb app to get accurate sRGB colors on my PC monitor.
 

CrustyBritches

Gold Member
Cyberpunk also sees my 9070XT as a 4090 in benchmarks
When I use DLSS inputs for Cyberpunk it has a shimmering/pixelated look. Any clue why it might be doing that to me? Robocop seems to work fine with it. I've just been using FSR3 inputs for Cyberpunk and it seems to work fine.

Only negative about my 9070 XT is that it is damn loud while in game
What card do you have?
 
Last edited:

StereoVsn

Gold Member
I got an ASRock at Microcenter because that was the only MSRP card available. The only other option was Gigabyte. Strangely, these two had the most stock.

Is it coil whine or fans?

My card is decently quiet, but I played with the fan curve a bit. The fans will still spin up and can get fairly loud.
 
Last edited:

dgrdsv

Member
What does it do?
It forces apps that aren't using Windows color management (which is like 99.9% of apps, including games) to the sRGB color space, even on WCG display devices.
If an app does its own color management, it shouldn't do anything to it. It also doesn't work when the display is in HDR mode.
So, all in all, it forces apps that weren't made WCG-aware to show proper colors on WCG displays in SDR.
Otherwise such apps show varying degrees of oversaturation (usually in reds and greens), as they map their output to the display's wide gamut while expecting it to be sRGB.
Generally I would advise against disabling it, as proper colors are better than overblown, oversaturated ones. But YMMV, as it also depends on the display you're using.
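A rough sketch of the kind of remapping being described, using the commonly published linear sRGB to Display-P3 conversion matrix (approximate values; this is an illustration of the idea, not how Windows implements it internally):

```python
# Without color management, an app's sRGB (1, 0, 0) drives the wide-gamut
# panel's native red directly -> oversaturation. A managed path remaps the
# value into the panel's gamut so the intended (less saturated) red shows.

SRGB_TO_P3 = [  # linear sRGB -> linear Display-P3, approximate
    [0.8225, 0.1774, 0.0000],
    [0.0332, 0.9669, 0.0000],
    [0.0171, 0.0724, 0.9108],
]

def srgb_to_p3(rgb):
    """Apply the 3x3 gamut conversion to a linear RGB triple."""
    return [sum(row[i] * rgb[i] for i in range(3)) for row in SRGB_TO_P3]

print(srgb_to_p3([1.0, 0.0, 0.0]))  # ~[0.82, 0.03, 0.02]
```

Note the managed output mixes in a little green and blue: that's exactly the "taming" of reds and greens that disappears when the setting is off.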
 

Topher

Identifies as young
I got an ASRock at Microcenter because that was the only MSRP card available. The only other option was Gigabyte. Strangely, these two had the most stock.

Is it coil whine or fans?

My card is decently quiet, but I played with the fan curve a bit. The fans will still spin up and can get fairly loud.

Definitely fan noise. It is only loud when it spins up during games, more noticeably than my Asus 4080 Super. Not a big deal since I play with headphones on. I may check the fan curve and see if the RPMs are crazy high for some reason.
 

M1987

Member
When I use DLSS inputs for Cyberpunk it has a shimmering/pixelated look. Any clue why it might be doing that to me? Robocop seems to work fine with it. I've just been using FSR3 inputs for Cyberpunk and it seems to work fine.


What card do you have?
I used FSR3 in the game settings, then FSR 3.1.3 in Optiscaler.

 
Last edited:
Whilst we're on the subject, thoughts on the color enhancement setting? I've always wondered about Vivid Gaming but worried it crushes blacks, which is half the point of having an OLED.
 

Boo Who?

Member
Cyberpunk also sees my 9070XT as a 4090 in benchmarks
I updated CP2077 to the new Optiscaler and it also shows my card as a 4090. It also now lets me use the DLSS Transformer model. Huge improvements in performance. Running everything on High with Psycho RT, consistently in the 90s fps-wise (no frame gen). Turning on path tracing gets me into the 60s. I'm running 1440p UW.
 

winjer

Gold Member

Yeston's social media accounts have alerted potential customers to restocks and related developments. Their latest bulletin hints at an improved situation, following another swift depletion of refreshed stock: "hello everyone! Thank you for the support! We have received a lot of messages and would love to inform you now the supply is unstable, but we will restock every week. Please don't be frustrated if you didn't get it. The supply will become stable and continue to be available after April." Interestingly, this morning's message did not touch on the controversial topic of price hikes.

At launch, Yeston's latest Navi 48 GPU-based offerings conformed to, or floated just above, Team Red's baseline MSRP (including VAT): 4999 RMB (~$686 USD) for the XT and 4499 RMB (~$617 USD) for the non-XT, likely boosting demand around that time.

Last week, AMD board partners in Japan expressed concerns about current supply constraints. GPU market share in that region had climbed to ~45% due to the popularity of the RX 9070 Series graphics cards; Team Red could lose ground if GPU allocation limitations continue.
 

StereoVsn

Gold Member
Anecdotally, I have seen more restocks of Nvidia 5000-series cards at my local Microcenter than AMD 9070 XTs since I was able to grab my card a couple weeks back.

Edit: Forgot to add that Nvidia cards generally haven't been at MSRP, and the MSRP cards sell out immediately. That's also just for the 5070 through 5080; the 5090 is nowhere to be found.
 
Last edited:

MikeM

Member
Definitely fan noise. It is only loud when it spins up during games, more noticeably than my Asus 4080 Super. Not a big deal since I play with headphones on. I may check the fan curve and see if the RPMs are crazy high for some reason.
Silent BIOS switch? I did that and it's basically silent.
 

MikeM

Member
Silent bios switch. What is that?
Some cards have a Performance BIOS and Silent BIOS switch. My Asus TUF has the switch; I flipped it on the card and now it's far quieter.

 
Last edited:

Topher

Identifies as young
Some cards have a Performance BIOS and Silent BIOS switch. My Asus TUF has the switch; I flipped it on the card and now it's far quieter.


Ok.....googling my card, I'm seeing sites saying it has a BIOS switch. So.....silent vs. performance: does that just mean lower RPMs, which results in higher temps, which could affect performance?
 

Reizo Ryuu

Gold Member
does that just mean lower RPMs which results in higher temps which could affect performance?
No, it should just be a bit slower than the OC BIOS. You can also just adjust the fan curve in the AMD control panel yourself.
Also, it seems you just got the loudest option:
All 9070 XTs with the silent BIOS should be really quiet. PowerColor, Sapphire, the Asus TUF, and the ASRock Taichi are really silent. Gigabyte seems a little bit louder, and XFX has the loudness crown.
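For anyone tweaking the curve manually, a fan curve is just interpolation between (temperature, fan %) points. A minimal sketch with made-up example points (not recommendations for any particular card):

```python
# Linear interpolation between fan-curve points, like the editor in
# AMD's control panel. Below the first point the fan idles; above the
# last point it runs at that point's speed.

CURVE = [(40, 0), (60, 30), (75, 55), (90, 100)]  # (deg C, fan %)

def fan_percent(temp_c, curve=CURVE):
    if temp_c <= curve[0][0]:
        return curve[0][1]
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]

print(fan_percent(50))  # 15.0: halfway between the 40C and 60C points
```

Flattening the low-temperature end of the curve is what keeps the card quiet at idle and light loads.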
 

MikeM

Member
Ok.....googling my card, I'm seeing sites saying it has a BIOS switch. So.....silent vs. performance: does that just mean lower RPMs, which results in higher temps, which could affect performance?
For my card, it still pulls the same 330W. It just runs a bit warmer. It's still far below max spec, so I'll take the quiet, considering my tower is right beside me.
 

Topher

Identifies as young
For my card, it still pulls the same 330W. It just runs a bit warmer. It's still far below max spec, so I'll take the quiet, considering my tower is right beside me.

How are you getting the wattage number? Mine is only pulling around 300W per Afterburner.
 

CrustyBritches

Gold Member
I uninstalled Optiscaler from the Cyberpunk folder and reinstalled it with the 'use DLSS inputs' option. Previously when I used that option I selected the CNN DLSS model in-game and it was glitchy, like sparkly film grain covering everything. This time I selected the Transformer model and it doesn't do that anymore. What's weirder is that now even the CNN model option works fine. I brought up the Optiscaler menu with the 'insert' key, and it looks like it might be using XeSS with DLSS inputs. When I switched it to FSR it crashed, so I guess you can't use DLSS inputs with FSR output in this case? FSR 3 input and FSR 4 output work fine, of course. I didn't have a chance to test them and see if there's any difference.
 

M1987

Member
I updated CP2077 to the new Optiscaler and it also shows my card as a 4090. Also now lets me use DLSS Transformer mode. Huge improvements in performance. Running everything on High with Psycho RT. Consistently in the 90's fps wise (no frame gen). Turning on path tracing gets me in the 60's. I'm running 1440p UW.
On a 9070XT?
 