AMD Radeon 9070XT review thread

MikeM

Member
Ok guys, hear me out: does Nvidia have worse colors than AMD? Games looked far more vibrant on the 9070 XT than the 5070 Ti.
 

bbeach123

Member
Ok guys, hear me out: does Nvidia have worse colors than AMD? Games looked far more vibrant on the 9070 XT than the 5070 Ti.
I upgraded from an R9 270X to a 1060 back then and noticed the colors looked a bit washed out too. I heard it had something to do with HDMI and the driver's default dynamic range.
 

Reizo Ryuu

Gold Member
Ok guys, hear me out: does Nvidia have worse colors than AMD? Games looked far more vibrant on the 9070 XT than the 5070 Ti.
The last time I used an Nvidia card must've been 20 years ago, but I do see Reddit and forum threads about this, while others say there's no difference - so maybe it's placebo, or maybe it's something that's actually happening?
In one of the Reddit threads it was fixed when the OP switched to "full range" in the Nvidia CP - is this something you checked?
 

Bojji

Member
Ok guys, hear me out: does Nvidia have worse colors than AMD? Games looked far more vibrant on the 9070 XT than the 5070 Ti.

Must be settings. Most devices are pretty standard with this stuff nowadays.

Check HDMI black level (if you are using HDMI), RGB vs. YCbCr 4:4:4, 8/10/12-bit output, etc. And check your display settings.
 

MikeM

Member
Never heard of that one

Must be some settings in control panel. You didn’t DDU?

I upgraded from an R9 270X to a 1060 back then and noticed the colors looked a bit washed out too. I heard it had something to do with HDMI and the driver's default dynamic range.
Found the issue, it seems: the Nvidia Control Panel was on "default settings." I had to manually set 10-bit, RGB, and full dynamic range.

Wtf Nvidia…
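(For context on why that fixes it: "limited" range maps the full 0-255 signal into 16-235, the old TV convention. If the driver sends limited range while the display expects full range, black is shown as dark grey and white as dull grey-white - the classic washed-out look. A toy Python sketch of the mapping, purely illustrative:)

```python
# Illustrative sketch: full-range (PC, 0-255) vs limited-range (TV, 16-235) RGB.
# A mismatch between GPU output and display expectation causes the washed-out look.

def full_to_limited(v: int) -> int:
    """Squeeze a full-range 8-bit value (0-255) into limited range (16-235)."""
    return round(16 + v * 219 / 255)

def limited_to_full(v: int) -> int:
    """Expand limited range back to full range (what a correctly set display does)."""
    return round((v - 16) * 255 / 219)

for v in (0, 128, 255):
    lim = full_to_limited(v)
    print(f"full {v:3d} -> limited {lim:3d} -> expanded back to {limited_to_full(lim):3d}")
# A display that skips the expansion shows 0 as 16 (greyish black) and
# 255 as 235 (dull white) -- less contrast, less perceived vibrancy.
```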
 

AGRacing

Member
5070 Ti with OC and a 5800X3D for reference.

 
I mainly play at 4K and preferably 4K/120. So 5080 at MSRP is pretty attractive, especially if you take RT into account.

That said, not sure it’s $400 more attractive.
Certainly not. I use a 42-inch LG C2 as a monitor; my old 7900 XT handled anything I threw at it with ease, and my 9070 XT is 20% faster.

All these cards can handle 4K; this isn't 2019 anymore. I've never played a game that came close to 20 GB of VRAM usage.
 

Akuji

Member
Got a 9070 XT and ran some 3D benchmarks.


The VRAM seems pretty unstable as soon as I touch it.
Overall it doesn't boost as well as other cards I've seen.
It's an OK card, but the fun of OCing is squeezing out that last percent, and it seems like my card won't give it.
 
Games looked far more vibrant on the 9070 XT than the 5070 Ti.
Might be a colour profile bug or digital enhancement fuckery going on. I had this happen when I tried using HDR on my monitor - everything looked wrong, washed out at first and then excessively colourful (I disabled/enabled HDR a few times), so I checked "Display" in the AMD driver settings and "Custom Colour" was ticked on for some reason... Weirdly enough, all the sliders were at 0, but just by being enabled it affected the colours.
 

CrustyBritches

Gold Member
New 9070 XT Steel Legend came today. Really happy there's no coil whine, that shit drives me nuts. Performance and thermals are good. Now I need a white motherboard and maybe white AIO CPU cooler to complete the look. Gonna test it another day then sell my 4060 Ti 16GB to offset the purchase cost.
[photos of the Steel Legend 9070 XT build]
 

Akuji

Member
New 9070 XT Steel Legend came today. Really happy there's no coil whine, that shit drives me nuts. Performance and thermals are good. Now I need a white motherboard and maybe white AIO CPU cooler to complete the look. Gonna test it another day then sell my 4060 Ti 16GB to offset the purchase cost.
[photos of the Steel Legend 9070 XT build]
Looks nice, but doesn't the GPU feed its own hot air back into itself?
 

SighFight

Member
EU MSRP is/was ~690 euros, so ~$750.

900 euros for cards that should be ~700 is fucked up.
There are news reports now that the original MSRP was subsidized and likely isn't coming back... Basically it was just a scam to get good press. At ~700€ and ~850€ these cards are no better a deal than what has been around for a while with the 4070 Ti and 7900 XT.


All it seems to be about for the last two generations is taking cheaper models off the market and replacing them with equally performing cards at higher prices. I repeat myself, but I think Nvidia and AMD are killing off PC gaming and will be left wondering where their market has gone.
 

CrustyBritches

Gold Member
looks nice, but doesnt the GPU feed its own hot air back into itself?
Thanks.

Admittedly, I'm not really a pro when it comes to this stuff. The case wall is open mesh and I can feel cool air flowing in on that side; the warm air shoots mostly out the sides of the GPU and gets pulled up by the 2x140mm fans on top. I can feel a bunch of warm air flowing out the top - it can float a piece of printer paper above it. I just put this together tonight, and after you mentioned it, I realized I had my CPU cooler fan on the wrong side, since it pulls instead of pushes; I've moved it to the bottom now. I think all I'm missing is a 140mm fan on the lower tray to pull more cool air in through the lower vents and pass it up to the 2x140mm fans at the top.

Like this?
[photo of the updated fan layout]
 

dgrdsv

Member
Found the issue, it seems: the Nvidia Control Panel was on "default settings." I had to manually set 10-bit, RGB, and full dynamic range.

Wtf Nvidia…
Default settings are set by Windows, not Nvidia. The control panel just provides an override for that, which is what you used.
 

Akuji

Member
Thanks.

Admittedly, I'm not really a pro when it comes to this stuff. The case wall is open mesh and I can feel cool air flowing in on that side; the warm air shoots mostly out the sides of the GPU and gets pulled up by the 2x140mm fans on top. I can feel a bunch of warm air flowing out the top - it can float a piece of printer paper above it. I just put this together tonight, and after you mentioned it, I realized I had my CPU cooler fan on the wrong side, since it pulls instead of pushes; I've moved it to the bottom now. I think all I'm missing is a 140mm fan on the lower tray to pull more cool air in through the lower vents and pass it up to the 2x140mm fans at the top.

Like this?
[photo of the updated fan layout]
Ah, the side panel was open in the picture I first commented on - it's actually open mesh, so the GPU gets fed fresh, cool air.
Nah, this seems near perfect to me now! Probably great temps!
 

Soodanim

Member
There are news reports now that the original MSRP was subsidized and likely isn't coming back... Basically it was just a scam to get good press. At ~700€ and ~850€ these cards are no better a deal than what has been around for a while with the 4070 Ti and 7900 XT.


All it seems to be about for the last two generations is taking cheaper models off the market and replacing them with equally performing cards at higher prices. I repeat myself, but I think Nvidia and AMD are killing off PC gaming and will be left wondering where their market has gone.

Concerning. We will have to see what the restocks look like in 1-2 weeks.
 

MikeM

Member
Man, I can't stand the Nvidia software. I actually hate it. Why does it interfere so much with game settings and try to decide settings for you?

I have to be doing something wrong.
 

Bojji

Member
Man, I can't stand the Nvidia software. I actually hate it. Why does it interfere so much with game settings and try to decide settings for you?

I have to be doing something wrong.

Just turn automatic optimizations off? I never used this shit in GFE or the Nvidia app.

For me the most essential settings are in the Nvidia Control Panel: 16x AF in the global profile, then sometimes V-Sync on and Ultra Low Latency on for selected games.
 

MikeM

Member
Just turn automatic optimizations off? I never used this shit in GFE or the Nvidia app.

For me the most essential settings are in the Nvidia Control Panel: 16x AF in the global profile, then sometimes V-Sync on and Ultra Low Latency on for selected games.
I googled how, and I couldn't get it to work. Apparently turning off optimizations messes with HDR too, per Nvidia.
 

Bojji

Member
I googled how, and I couldn't get it to work. Apparently turning off optimizations messes with HDR too, per Nvidia.

It's here:

[screenshot of the optimization toggle in the Nvidia app]


As for RTX HDR, you set it individually per game. "Game filters and photo mode" must be active for that - press Alt+F3 and the RTX HDR option (and many more) is there.

For RTX HDR to work you also need to disable Auto HDR in Windows 11 (if you are on that version).
 

MikeM

Member
It's here:

[screenshot of the optimization toggle in the Nvidia app]


As for RTX HDR, you set it individually per game. "Game filters and photo mode" must be active for that - press Alt+F3 and the RTX HDR option (and many more) is there.

For RTX HDR to work you also need to disable Auto HDR in Windows 11 (if you are on that version).
It was still doing it, so I gave up. I returned the 5070 Ti and now have my previous 9070 XT back. I don't have patience for issues like this.

Thanks for the assistance anyway.
 
It was still doing it, so I gave up. I returned the 5070 Ti and now have my previous 9070 XT back. I don't have patience for issues like this.

Thanks for the assistance anyway.
Never had this problem. BTW, the Nvidia app is very basic and made mainly for casual users. Nvidia Profile Inspector offers way more options - you can even force the latest DLSS globally, without copying DLL files into each game with DLSS Swapper or tweaking per-game profiles in the Nvidia app just to enable DLSS 4.
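(For anyone wondering what the DLL-swap route actually involves: it comes down to replacing each game's nvngx_dlss.dll with a newer build. A rough sketch of the manual version of what tools like DLSS Swapper automate - the paths below are hypothetical examples, and real tools also track versions and keep proper backups:)

```python
# Hedged sketch of a manual DLSS DLL swap (what DLSS Swapper automates):
# find each game's nvngx_dlss.dll, back it up, drop in a newer build.
import shutil
from pathlib import Path

NEW_DLL = Path(r"C:\Downloads\nvngx_dlss.dll")   # newer DLSS DLL (example path)
GAMES_ROOT = Path(r"C:\Games")                   # game library root (example path)

for dll in GAMES_ROOT.rglob("nvngx_dlss.dll"):
    backup = dll.with_name(dll.name + ".bak")
    if not backup.exists():
        shutil.copy2(dll, backup)                # keep the original for rollback
    shutil.copy2(NEW_DLL, dll)
    print(f"swapped {dll}")
```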

Just turn automatic optimizations off? I never used this shit in GFE or the Nvidia app.

For me the most essential settings are in the Nvidia Control Panel: 16x AF in the global profile, then sometimes V-Sync on and Ultra Low Latency on for selected games.
Agreed.
New 9070 XT Steel Legend came today. Really happy there's no coil whine, that shit drives me nuts. Performance and thermals are good. Now I need a white motherboard and maybe white AIO CPU cooler to complete the look. Gonna test it another day then sell my 4060 Ti 16GB to offset the purchase cost.
[photos of the Steel Legend 9070 XT build]
Beautiful case, and the card's RGB lighting is awesome too.
 

Xyphie

Member
Ok guys, hear me out: does Nvidia have worse colors than AMD? Games looked far more vibrant on the 9070 XT than the 5070 Ti.

This has been claimed on various forums since the ATI Radeon days. I've yet to see anything remotely scientific to back up the claim. It should be extremely easy to show by doing output captures and comparing them to ground truth, but nothing of the kind has ever emerged from anyone claiming it over the years.
 

Akuji

Member
The biggest win for AMD is that FSR 4 + frame gen is as good as it is.
Perfectly usable. I feel some input lag at around 80-100 fps native, so around 180-200 with frame gen, but the smoothness is still better overall than not using it. With a controller I don't feel it at all, even at lower fps. Witcher 3 at max settings runs around 100 fps and is still fine. FF7 Rebirth works and looks great as well. (Make sure you know how to inject FSR 4 and FG with OptiScaler.)
 
This has been claimed on various forums since the ATI Radeon days. I've yet to see anything remotely scientific to back up the claim. It should be extremely easy to show by doing output captures and comparing them to ground truth, but nothing of the kind has ever emerged from anyone claiming it over the years.
Agreed.




Frogboy, the biggest AMD fan on YT, claimed AMD cards had better colors, but he has now apologized because he found out that wasn't really the case. He was simply using incorrect settings without realizing it.



ATI had slightly better picture quality 20 years ago (analog VGA times), but now we have DisplayPort (a digital signal) and RAMDAC quality no longer makes a difference. Both AMD and Nvidia do use delta color compression to save memory bandwidth, though, and there is also DSC compression on top of that, so who knows, maybe there are some scenarios where small imperfections can be seen.
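(Worth noting that GPU delta color compression is lossless: the idea is to store small differences between neighboring values rather than discard information, so it shouldn't alter colors at all. DSC, by contrast, is only "visually lossless," which is presumably the scenario being hedged on. A toy sketch of the delta-encoding concept - the real hardware schemes are block-based and proprietary, so this is just the idea:)

```python
# Toy sketch of delta encoding, the concept behind GPU delta color compression.
# It round-trips exactly (lossless), which is why DCC shouldn't alter colors.
def delta_encode(values):
    prev = 0
    deltas = []
    for v in values:
        deltas.append(v - prev)   # neighbors are similar, so deltas stay small
        prev = v
    return deltas

def delta_decode(deltas):
    prev = 0
    values = []
    for d in deltas:
        prev += d
        values.append(prev)
    return values

row = [200, 201, 201, 202, 180, 180]   # one scanline of 8-bit values
assert delta_decode(delta_encode(row)) == row
print(delta_encode(row))               # [200, 1, 0, 1, -22, 0]
```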
 
The biggest win for AMD is that FSR 4 + frame gen is as good as it is.
Perfectly usable. I feel some input lag at around 80-100 fps native, so around 180-200 with frame gen, but the smoothness is still better overall than not using it. With a controller I don't feel it at all, even at lower fps. Witcher 3 at max settings runs around 100 fps and is still fine. FF7 Rebirth works and looks great as well. (Make sure you know how to inject FSR 4 and FG with OptiScaler.)
I wonder if FSR FG works better on AMD cards. On my 4080S, FSR FG is absolutely unusable: FSR FG has judder (motion isn't perfectly smooth), and the input lag difference is quite big. I can aim more easily using DLSS FG at a 30 fps base (60 fps with FG) than with FSR FG at a 100 fps base (200 fps with FG).
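(That last comparison is surprising under a naive latency model: interpolation-based FG has to hold the newest real frame back roughly one base frame time, so a higher base fps should mean less added lag. A back-of-the-envelope sketch, with the single-held-frame assumption made explicit:)

```python
# Very rough frame-gen latency model. Assumption: interpolation FG holds the
# newest real frame for ~one base frame time so it can insert a generated
# frame between the two most recent real ones. Real pipelines add other costs.
def fg_added_latency_ms(base_fps: float, held_frames: int = 1) -> float:
    return held_frames * 1000.0 / base_fps

for fps in (30, 60, 100):
    print(f"{fps:3d} fps base -> ~{fg_added_latency_ms(fps):.0f} ms extra latency from FG")
# 30 fps base -> ~33 ms, 100 fps base -> ~10 ms: by this model FSR FG at a
# 100 fps base should feel snappier, so the difference described above likely
# comes from Reflex/anti-lag integration rather than the frame gen itself.
```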
 

Akuji

Member
I wonder if FSR FG works better on AMD cards. On my 4080S FSR FG is absolutely unusable. FSR FG has judder (motion isnt perfectly smooth), and input lag difference is quite big. I can aim more easiy using DLSS FG at 30fps base (60fps with FG) compared to FSR FG at 100fps base (200fps with FG).
FG needs other features like Nvidia Reflex, which gets turned on automatically when you activate FG.

Haven't tried FSR 3 FG, only FSR 4, and the picture quality and smoothness are very high.
Don't have experience with Nvidia FG though. I would think it's better, but I can't really imagine it being better to the point where one would call FSR FG unusable in comparison.
 

hinch7

Member
Joined the hype train. Preordered a Sapphire Nitro+ on Amazon UK but won't be getting it until some time in May, or hopefully sooner if in stock. A bit over MSRP but meh.

Will be my first AMD card since the HD 4850! I'll mostly be playing CoD and other competitive games, so it should be a nice little upgrade over my 4070 Ti.
 

Bry0

Member
Ok guys, hear me out: does Nvidia have worse colors than AMD? Games looked far more vibrant on the 9070 XT than the 5070 Ti.
When I switched from a 4080 to a 7900 XTX I had the same thought. I'm sure it probably has to do with some default driver output settings, though, if there is a difference.
 
FG needs other features like Nvidia Reflex, which gets turned on automatically when you activate FG.

Haven't tried FSR 3 FG, only FSR 4, and the picture quality and smoothness are very high.
Don't have experience with Nvidia FG though. I would think it's better, but I can't really imagine it being better to the point where one would call FSR FG unusable in comparison.
It's hard to capture on camera what I'm seeing with my own eyes, but here's my comparison.

Edit: I decided to delete my video because I found a solution to the judder problem.
 

CrustyBritches

Gold Member
When I was using my 4060 Ti 16GB I tested DLSS FG vs FSR FG a decent amount in R&C Rift Apart and Spider-Man Remastered. Insomniac's games are some of the only ones that can mix DLSS with FSR FG. My impression was that DLSS FG had the edge in clarity, while FSR FG had a 10-15% performance advantage. I never experienced any stutter using either at a decent base frame rate.

I just tried both of those games on 9070 XT using FSR4+FSR FG and they run silky smooth. Like, mind blowingly smooth.

Additionally, I was messing around with it in Cyberpunk. There aren't any issues with stutter, but there is kind of a minor dark blur trail on some models at times - though IIRC DLSS did that too. I'm using in-game FSR FG along with Optiscaler for FSR 4 upscaling. Sometimes that can exhibit minor, but weird, artifacting. Maybe I'll try it with regular FSR and see.
 
When I was using my 4060 Ti 16GB I tested DLSS FG vs FSR FG a decent amount in R&C Rift Apart and Spider-Man Remastered. Insomniac's games are some of the only ones that can mix DLSS with FSR FG. My impression was that DLSS FG had the edge in clarity, while FSR FG had a 10-15% performance advantage. I never experienced any stutter using either at a decent base frame rate.

I just tried both of those games on 9070 XT using FSR4+FSR FG and they run silky smooth. Like, mind blowingly smooth.

Additionally, I was messing around with it in Cyberpunk. There aren't any issues with stutter, but there is kind of a minor dark blur trail on some models at times - though IIRC DLSS did that too. I'm using in-game FSR FG along with Optiscaler for FSR 4 upscaling. Sometimes that can exhibit minor, but weird, artifacting. Maybe I'll try it with regular FSR and see.
CrustyBritches, I trust your opinions given your post history, so after your comment I spent three hours trying different games with different FSR settings to find out why I see judder with FSR FG on my PC, and I think I finally found a solution. I tested the Horizon remaster because I assumed it would have a newer FSR FG DLL than Robocop. I still saw judder, so I thought I'd turn off the RivaTuner framerate cap and see how the game ran without it. I still saw judder, but I also noticed that my framerate was way above my monitor's refresh rate. I decided to limit my framerate with IN-GAME vsync instead of RivaTuner, and then I finally saw smooth motion without judder. With similar settings, Robocop has smooth motion with FSR FG as well. After further testing, I realized that RivaTuner was using the "reflex" frame limiter mode, and when I switched it to "async", RivaTuner no longer caused any judder.

So at the moment FSR FG is working without judder on my RTX 4080S, but the FSR FG input lag is still not great. However, I noticed that the Anti-Lag 2 option in the Horizon remaster is grayed out, and maybe on a Radeon card it would improve the input lag with FSR FG.
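(A likely reason the limiter mode mattered: a sleep-based "async" cap releases frames on a fixed schedule, so pacing stays even as long as the cap sits below the refresh rate. A toy sketch of that idea - the 116 fps cap and 4 ms render time are arbitrary example numbers:)

```python
# Toy sketch of a sleep-based ("async"-style) frame limiter: frames are
# released at a fixed interval instead of whenever the GPU finishes,
# which keeps frame pacing even and avoids judder.
import time

def run_capped(render_frame, fps_cap: float, frames: int) -> None:
    interval = 1.0 / fps_cap              # target frame time
    next_deadline = time.perf_counter()
    for _ in range(frames):
        render_frame()
        next_deadline += interval
        # Sleep off most of the wait, then spin briefly for precision.
        while (remaining := next_deadline - time.perf_counter()) > 0:
            time.sleep(remaining * 0.8 if remaining > 0.002 else 0)

# Example: a dummy ~4 ms "render", capped just below a 120 Hz refresh.
run_capped(lambda: time.sleep(0.004), fps_cap=116, frames=60)
```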
 

CrustyBritches

Gold Member
CrustyBritches, I trust your opinions given your post history, so after your comment I spent three hours trying different games with different FSR settings to find out why I see judder with FSR FG on my PC, and I think I finally found a solution. I tested the Horizon remaster because I assumed it would have a newer FSR FG DLL than Robocop. I still saw judder, so I thought I'd turn off the RivaTuner framerate cap and see how the game ran without it. I still saw judder, but I also noticed that my framerate was way above my monitor's refresh rate. I decided to limit my framerate with IN-GAME vsync instead of RivaTuner, and then I finally saw smooth motion without judder. With similar settings, Robocop has smooth motion with FSR FG as well. After further testing, I realized that RivaTuner was using the "reflex" frame limiter mode, and when I switched it to "async", RivaTuner no longer caused any judder.

So at the moment FSR FG is working without judder on my RTX 4080S, but the FSR FG input lag is still not great. However, I noticed that the Anti-Lag 2 option in the Horizon remaster is grayed out, and maybe on a Radeon card it would improve the input lag with FSR FG.
Happy to hear you were able to make some progress towards resolving that. I would agree that the latency is better with DLSS FG than FSR FG, especially when using KB/M. With a controller and a solid 60fps+ base frame rate I don't notice much difference. Something to note as well: in the Adrenalin menu AMD says that AFMF may introduce additional latency and might not be optimal for competitive MP games. They also recommend enabling Radeon Anti-Lag when using FG - it seems to be AMD's version of Reflex. I wonder if Reflex can be used in conjunction with FSR FG? If not, that could impact input latency when using an Nvidia GPU with FSR FG. Looks like they also have Anti-Lag+ and are doing a tech preview for Anti-Lag 2. I haven't been on AMD since the RX 480, so I'm only now researching Anti-Lag. This seems like an area AMD can really improve on compared to Nvidia.

I'm going to test Robocop tonight and see if there are any issues specific to the game.
 

CrustyBritches

Gold Member
Corporal.Hicks

I just tried out Robocop. I'm playing at 1440p using max settings, Lumen enabled, Optiscaler FSR 4 at 'Quality', in-game FSR FG, and in the Adrenalin menu FSR 4, Anti-Lag, and Enhanced Sync enabled, along with FreeSync forced 'On'. Getting 100fps+ without FG and 200fps+ with FG. This runs extremely smooth, even with a mouse. Pure butter.

At first it was running OK, but it looked a little weird - blurry, and not perfectly smooth. I figured out that Optiscaler had automatically enabled OptiFG, so it was running two instances of frame generation at the same time. Now I'm just using in-game FSR FG; it looks way better and has much better input latency. Something like that happened with R&C Rift Apart when I first ran it, too. The frame rate was insanely high, like 300-400fps, but it was stuttering. Then I realized it was running AFMF and FSR FG on top of each other. That's a slightly annoying quirk of dealing with AMD. I'm learning the ropes now, so everything is going smoothly. Hoping for more natively supported FSR 3.1 or FSR 4 games in the near future.

Also learned that some games, like Cyberpunk, have crashing issues with Optiscaler on Win10. There's a workaround for it using the more recent nightly builds. I used it on Robocop as well:

DirectX 12 Agility SDK

Note: with the latest nightly, PotatoOfDoom added experimental support for updating the DirectX 12 Agility SDK, which should help games that crash on Windows 10. To make it work, copy the D3D12_Optiscaler folder from the OptiScaler archive into the game's exe folder and set FsrAgilitySDKUpgrade=true in OptiScaler.ini.
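(Spelled out as a script, under the assumption that the archive layout and ini key are exactly as the note above describes; both paths are hypothetical examples:)

```python
# Hedged sketch of the Win10 workaround quoted above: copy D3D12_Optiscaler
# next to the game exe, then enable the documented FsrAgilitySDKUpgrade flag.
import shutil
from pathlib import Path

archive = Path(r"C:\Downloads\OptiScaler")            # extracted OptiScaler archive (example)
game_dir = Path(r"C:\Games\RoboCop\Binaries\Win64")   # folder with the game exe (example)

shutil.copytree(archive / "D3D12_Optiscaler",
                game_dir / "D3D12_Optiscaler", dirs_exist_ok=True)

ini = game_dir / "OptiScaler.ini"
text = ini.read_text()
if "FsrAgilitySDKUpgrade" in text:
    text = text.replace("FsrAgilitySDKUpgrade=false", "FsrAgilitySDKUpgrade=true")
else:
    text += "\nFsrAgilitySDKUpgrade=true\n"           # assumes the key is absent by default
ini.write_text(text)
```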

As a side note, even unofficial Optiscaler FSR 4 looks leagues better than old FSR 3. This is the 'Performance' preset.
Check out the text on the liquor sign...
[comparison image: Robocop, FSR 3 vs FSR 4]
Bigger area...
[comparison image: wider crop of the same scene]
[slider comparison]

*Sorry for the info dump - it's not specifically in regard to our FSR FG testing. I just figured anybody buying a 9070-series card should brush up on this, since it's how we get FSR 4 until there are more official implementations.
 
Corporal.Hicks

I just tried out Robocop. I'm playing at 1440p using max settings, Lumen enabled, Optiscaler FSR 4 at 'Quality', in-game FSR FG, and in the Adrenalin menu FSR 4, Anti-Lag, and Enhanced Sync enabled, along with FreeSync forced 'On'. Getting 100fps+ without FG and 200fps+ with FG. This runs extremely smooth, even with a mouse. Pure butter.

At first it was running OK, but it looked a little weird - blurry, and not perfectly smooth. I figured out that Optiscaler had automatically enabled OptiFG, so it was running two instances of frame generation at the same time. Now I'm just using in-game FSR FG; it looks way better and has much better input latency. Something like that happened with R&C Rift Apart when I first ran it, too. The frame rate was insanely high, like 300-400fps, but it was stuttering. Then I realized it was running AFMF and FSR FG on top of each other. That's a slightly annoying quirk of dealing with AMD. I'm learning the ropes now, so everything is going smoothly. Hoping for more natively supported FSR 3.1 or FSR 4 games in the near future.

Also learned that some games, like Cyberpunk, have crashing issues with Optiscaler on Win10. There's a workaround for it using the more recent nightly builds. I used it on Robocop as well:


As a side note, even unofficial Optiscaler FSR 4 looks leagues better than old FSR 3. This is the 'Performance' preset.

[slider comparison]

*Sorry for the info dump - it's not specifically in regard to our FSR FG testing. I just figured anybody buying a 9070-series card should brush up on this, since it's how we get FSR 4 until there are more official implementations.
I have tried something similar in the past. I ran Alan Wake 2 with DLSS FG x2 and Lossless Scaling FG x4 on top of that (just for fun and curiosity), and the game still worked :). I had over 400fps, but the input lag was not to my liking :D. I could only aim at enemies when they were moving very slowly, or not moving at all.

I know FSR 3.1 was terrible. The image quality on a static image was pretty good, but when something moved the shimmering was excessive. Thankfully, AMD addressed this with FSR 4. Modern games often require "upscaling" (image reconstruction) to run at a high framerate, so FSR 4 will help 9070 XT owners a lot.
 

Boo Who?

Member
Anyone else grab the latest Optiscaler? In the setup .bat there is now a third question asking if you are going to use DLSS inputs. Not sure what it does exactly, but in Outlaws I said yes. Outlaws saw my 9070 XT as a 4090, I used DLSS, and it worked fantastically.
 