
12VHPWR on RTX 5090 is Extremely Concerning

Ulysses 31

Member
The secret to amazing PC gaming performance is a 1080p/240hz monitor. No single hardware purchase gave me a better FPS boost. Dumping 4k for a decade is a wise move. We can circle back in 2030 and see if the mass market hardware is there yet.
A 4K display is more universal than 1080p, because it allows you to scale 240p / 480p / 720p / 1080p content with perfect pixel ratios, and what's more, the 4K pixel density also lets you recreate a CRT phosphor mask, so 240p / 480p content can look comparable to a real CRT. When 1080p is upscaled correctly to 4K, sharpness and clarity will look amazing (like 1080p displayed on a native 1920x1080 monitor).

Also, 1440p on a 32-inch 4K panel, even without upscaling, is as big as a typical 1440p monitor :p, so IMHO even 1440p is usable if you just adjust the viewing distance.

There's just one problem to address. Default upscaling (bilinear filtering) looks like crap because it blurs the image. Nearest neighbour can scale the image with perfect sharpness, but the result looks pixelated, because scaled-up pixels turn into visible squares (on a display at its native resolution, glowing pixels appear round to the human eye, so we don't see pixelation). A phosphor mask in ReShade can address this problem, though.
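
To make the comparison concrete, here's a minimal Python sketch of the two filters, assuming Pillow is installed; the filenames and sizes are just placeholders, and the phosphor-mask pass itself lives in ReShade and isn't reproduced here.

```python
# Minimal upscaling comparison sketch using Pillow (pip install pillow).
# Filenames and sizes are placeholders for illustration only.
from PIL import Image

src = Image.open("game_480p.png")   # e.g. a 640x480 capture
target = (1440, 1080)               # 4:3 content fitted into 1080 lines

# Bilinear: interpolates between neighbouring pixels -> soft, blurry result
blurry = src.resize(target, resample=Image.BILINEAR)
blurry.save("upscaled_bilinear.png")

# Nearest neighbour: each source pixel becomes a hard block -> sharp but "square" pixels
sharp = src.resize(target, resample=Image.NEAREST)
sharp.save("upscaled_nearest.png")

# A CRT phosphor-mask shader (e.g. in ReShade) is then layered on top of the
# nearest-neighbour image to break up the square pixels; that step is not shown here.
```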

You have a 1080p display, so I took screenshots at that resolution. Please view these screenshots at full screen (1:1), otherwise the mask will not be aligned correctly in the second image. The first screenshot shows 480p upscaled to 1080p with bilinear filtering (standard upscaling); the second uses nearest neighbour plus the ReShade phosphor mask. Both were rendered at the same internal resolution of 480p, yet the first looks blurry while the second looks sharp.

bilinear filtering

bilinear-filtering.jpg



Nearest neighbour + reshade filter

nearest-neighbour-with-reshade-filter.jpg


I can use the same method to upscale 720p / 1080p to 4K and get a sharp image, so a 4K display works for both old games and new games.

Also, you don't need an RTX 5090 to play games at 4K. Even an RTX 4070 Super already gets an average of 60fps at 4K native. Add DLSS and FG on top of that, and as long as you don't hit the VRAM limit, even 4K games run amazingly well. I think the 4070S might even be faster than the PS5 Pro.

average-fps-3840-2160.png
 

simpatico

Member
A 4K display is more universal than 1080p, because it allows you to scale 240p / 480p / 720p / 1080p content with perfect pixel ratios, and what's more, the 4K pixel density also lets you recreate a CRT phosphor mask, so 240p / 480p content can look comparable to a real CRT. When 1080p is upscaled correctly to 4K, sharpness and clarity will look amazing (like 1080p displayed on a native 1920x1080 monitor).

Also, 1440p on a 32-inch 4K panel, even without upscaling, is as big as a typical 1440p monitor :p, so IMHO even 1440p is usable if you just adjust the viewing distance.

There's just one problem to address. Default upscaling (bilinear filtering) looks like crap because it blurs the image. Nearest neighbour can scale the image with perfect sharpness, but the result looks pixelated, because scaled-up pixels turn into visible squares (on a display at its native resolution, glowing pixels appear round to the human eye, so we don't see pixelation). A phosphor mask in ReShade can address this problem, though.

You have a 1080p display, so I took screenshots at that resolution. Please view these screenshots at full screen (1:1), otherwise the mask will not be aligned correctly in the second image. The first screenshot shows 480p upscaled to 1080p with bilinear filtering (standard upscaling); the second uses nearest neighbour plus the ReShade phosphor mask. Both were rendered at the same internal resolution of 480p, yet the first looks blurry while the second looks sharp.

bilinear filtering




Nearest neighbour + reshade filter



I can use the same method to upscale 720p / 1080p to 4K and get a sharp image, so a 4K display works for both old games and new games.

Also, you don't need an RTX 5090 to play games at 4K. Even an RTX 4070 Super already gets an average of 60fps at 4K native. Add DLSS and FG on top of that, and as long as you don't hit the VRAM limit, even 4K games run amazingly well. I think the 4070S might even be faster than the PS5 Pro.
Here's the thing, guy: when you play at 1080p, you never have to worry about scaling because it's deliciously, crispily native. You know the little, tiny hard fries you get at the bottom of the paper cup from McDonald's? That's what my graphics look like. Crispy. Unblurred. Defined. Sharp. Resolved. Salty.

My retro gaming is entirely done on a CRT TV and OG hardware (I do get rep games whenever I can on the costly stuff). $3.5k will get you a nice 7800 XT, every retro console on the planet, and a nice 32" CRT with good inputs, with enough left over for a steak dinner and a nice bottle of bourbon.

A 5090 can't even pull off 60fps at 4K native on games that are over a year old. I don't know who lets themselves buy a $3k GPU that can't play one-year-old games at native/60, but they're out there. My guess: PC gamers of yore who are nearing retirement age, have some extra cash pooling up, and just want to enjoy the spoils of their work. (I'm spending that money on a schizo prepper compound, personally.)

Who is confidently buying a 4070S with a 4K primary gaming monitor? Why didn't their friends step in and stop them? I get that it can be hard to wait. If I hadn't been at this for so long I might be tempted too. Rarely in the history of PC gaming can I think of a time when a halo-tier card came out and could not run old stuff at basic native/60. It's even more bonkers when you consider the price. I've hit the 12GB cap on my card in numerous games at 1080p. Lord protect the 4K 4070S gamer.

My gently used 6750 XT doesn't even cover the tax on a 5090, but the $3,300 difference in price basically only unlocks the ability for me to turn on a couple of graphics options in Wukong and Cyberpunk. Truly not much else. This is coming from a person who bought every Nvidia top-end card between the GTX 280 and the GTX 1080, including the Titans in the 5xx and 6xx series. Maybe it's people who haven't historically bought halo cards, so they don't have a baseline for what to expect.
 
Dunno what you think of Falcon Northwest or how deep your pockets are but their pre-built can include a 5080/5090 FE. :lollipop_winking:

I may have to check them out if I end up really wanting a 5090 machine before Best Buy can get stock going.

I usually stick with Best Buy because we have worked together for 20+ years and they give me a lot of store credit every year
 

Celcius

°Temp. member
Glad to still be on my RTX 3090, aka the most powerful video card Nvidia ever made without the new power connector.
I don't think there's any way Nvidia backs away from the new power connector, given the money involved and needing it to keep their FE PCB compact, but I really think they need to walk this back and go back to what worked before.
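
For context on why the connector worries people, here's a rough back-of-envelope in Python; the wattages are the commonly quoted board-power figures, and the per-pin split assumes perfectly even current sharing, which is exactly what the failure reports call into question.

```python
# Back-of-envelope: current through a 12VHPWR / 12V-2x6 connector,
# assuming a 12 V rail and commonly quoted board-power figures.
PINS_12V = 6  # six 12 V contacts (plus six ground returns)

def per_pin_amps(watts, volts=12.0, pins=PINS_12V):
    """Per-pin current if the load shares perfectly evenly across all pins."""
    return watts / volts / pins

for card, watts in [("RTX 4090 (~450 W)", 450),
                    ("RTX 5090 (~575 W)", 575),
                    ("Connector limit (600 W)", 600)]:
    total_amps = watts / 12.0
    print(f"{card}: {total_amps:.1f} A total, {per_pin_amps(watts):.1f} A per pin (ideal sharing)")

# The reported failures involve uneven sharing: if contact resistance pushes most
# of the current through one or two pins, those pins run far above these ideal figures.
```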
 
Dunno what you think of Falcon Northwest or how deep your pockets are but their pre-built can include a 5080/5090 FE. :lollipop_winking:

So I went down this rabbit hole: the price wasn't horrible for a boutique build, about $6,500, but when I got to checkout the ETA on the build was 82 days.

Bryan Cranston Reaction GIF
 
Here's the thing, guy: when you play at 1080p, you never have to worry about scaling because it's deliciously, crispily native. You know the little, tiny hard fries you get at the bottom of the paper cup from McDonald's? That's what my graphics look like. Crispy. Unblurred. Defined. Sharp. Resolved. Salty.

My retro gaming is entirely done on a CRT TV and OG hardware (I do get rep games whenever I can on the costly stuff). $3.5k will get you a nice 7800 XT, every retro console on the planet, and a nice 32" CRT with good inputs, with enough left over for a steak dinner and a nice bottle of bourbon.

A 5090 can't even pull off 60fps at 4K native on games that are over a year old. I don't know who lets themselves buy a $3k GPU that can't play one-year-old games at native/60, but they're out there. My guess: PC gamers of yore who are nearing retirement age, have some extra cash pooling up, and just want to enjoy the spoils of their work. (I'm spending that money on a schizo prepper compound, personally.)

Who is confidently buying a 4070S with a 4K primary gaming monitor? Why didn't their friends step in and stop them? I get that it can be hard to wait. If I hadn't been at this for so long I might be tempted too. Rarely in the history of PC gaming can I think of a time when a halo-tier card came out and could not run old stuff at basic native/60. It's even more bonkers when you consider the price. I've hit the 12GB cap on my card in numerous games at 1080p. Lord protect the 4K 4070S gamer.

My gently used 6750 XT doesn't even cover the tax on a 5090, but the $3,300 difference in price basically only unlocks the ability for me to turn on a couple of graphics options in Wukong and Cyberpunk. Truly not much else. This is coming from a person who bought every Nvidia top-end card between the GTX 280 and the GTX 1080, including the Titans in the 5xx and 6xx series. Maybe it's people who haven't historically bought halo cards, so they don't have a baseline for what to expect.
1080p can look good from a normal viewing distance, but a 4K monitor can display much better detail because of its much higher pixel density (4K has 8 million pixels, whereas 1080p only has 2 million). This difference is very noticeable on a monitor, because people sit much closer to it than to a TV.
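
Here's the quick arithmetic behind those pixel counts as a small Python sketch; the monitor sizes are just typical examples, not anyone's exact setup.

```python
# Quick arithmetic behind the pixel-count / pixel-density point.
# Monitor sizes are typical examples, not measurements of a specific setup.
import math

def stats(name, w, h, diag_inches):
    pixels = w * h
    # width of a 16:9 panel from its diagonal: diag * 16 / sqrt(16^2 + 9^2)
    width_in = diag_inches * 16 / math.sqrt(16**2 + 9**2)
    ppi = w / width_in
    print(f"{name}: {pixels / 1e6:.1f} MPix, ~{ppi:.0f} PPI")

stats('32" 4K',    3840, 2160, 32)   # ~8.3 MPix, ~138 PPI
stats('27" 1440p', 2560, 1440, 27)   # ~3.7 MPix, ~109 PPI
stats('24" 1080p', 1920, 1080, 24)   # ~2.1 MPix, ~92 PPI
```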

A 4K display requires some effort to upscale a lower-resolution picture with good results (a sharp image instead of a blurry one), but it can be done. Also, 1440p is very usable on a 32-inch 4K panel even without upscaling, because the image will still be fairly big (comparable to a typical 1440p monitor), so it's almost like having two monitors, and you don't need to worry about sharpness when you don't upscale the image.

The 5090 can run every single game at 4K native at 60fps, just not with maxed-out settings. Path tracing is extremely demanding, and any enthusiast will know it's unreasonable to expect 60fps at 4K native with it, even on a high-end card. With DLSS Quality + FG x2 even Black Myth: Wukong will run at 120fps, the image quality will still look 4K-like (better than native TAA), and input lag in this particular game is even lower with DLSS FG. Even 4K DLSS Performance looks vastly superior to 1080p on a 1920x1080 display.
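
For reference, a quick Python sketch of what those DLSS modes render internally at a 4K output, using the commonly documented per-axis scale factors (assumed defaults; games can override them), plus the frame-generation arithmetic.

```python
# Internal render resolution of the DLSS modes at a 4K (3840x2160) output,
# using the commonly documented per-axis scale factors (assumed defaults).
OUT_W, OUT_H = 3840, 2160
DLSS_MODES = {
    "Quality":           2 / 3,   # ~0.667 per axis
    "Balanced":          0.58,
    "Performance":       0.50,    # i.e. 1920x1080 internal at a 4K output
    "Ultra Performance": 1 / 3,
}

for mode, s in DLSS_MODES.items():
    print(f"{mode}: renders ~{round(OUT_W * s)}x{round(OUT_H * s)}, upscaled to {OUT_W}x{OUT_H}")

# Frame generation (FG x2) roughly doubles the presented frame rate, so a
# 120 fps target only needs about 60 "real" rendered fps.
print(f"120 fps with FG x2 needs ~{120 / 2:.0f} rendered fps")
```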

My RTX 4080S can run the vast majority of games from my Steam library at 4K 120fps native, except for games with heavy RT. In the most demanding RT games I need to use DLSS Quality + FG x2 to get a 4K-like image at 120fps, so I can't complain. An RTX 5090 would get 200-240fps with similar settings.

I collected many TVs over the years because I used to think upscaling couldn't look good, but now I even think upscaling can improve the image quality, so I prefer gaming on modern displays.

RGB input vs S-Video vs upscale to 1080p.

20250215-172306.jpg


20250215-172354.jpg



nearest-neighbour-with-reshade-filter.jpg


I prefer the upscaled picture, because on my SD CRT the blurriness is too strong, and 480i also means a much lower resolution during motion, so the image is way more pixelated compared to upscaled 480p. 480p almost looks like HD compared to 480i on my SD CRT.
 

Gonzito

Gold Member
Papis, I have a 4090 and I'm starting to have anxiety symptoms because of this situation. Am I screwed? Should I return it?
 

analog_future

Resident Crybaby
Papis, I have a 4090 and I'm starting to have anxiety symptoms because of this situation. Am I screwed? Should I return it?

It's not a great situation, but there's also a lot of FUD. It's not nearly as dire as the community might lead you to believe, and a very tiny percentage of customers have actually been affected.
 

Gonzito

Gold Member
It's not a great situation, but there's also a lot of FUD. It's not nearly as dire as the community might lead you to believe, and a very tiny percentage of customers have actually been affected.

I understand. I have read that it's actually dangerous to unplug the cable even to check on it, and that it's better to avoid doing that. Would you agree with that?
 

RoboFu

One of the green rats
It's not a great situation, but there's also a lot of FUD. It's not nearly as dire as the community might lead you to believe, and a very tiny percentage of customers have actually been affected.

There's a very tiny percentage of these cards made. 😂
 

analog_future

Resident Crybaby
I understand. I have read that it's actually dangerous to unplug the cable even to check on it, and that it's better to avoid doing that. Would you agree with that?

I wouldn't agree that it's "dangerous", but I would say that if you haven't had any issues and you're confident your cable is plugged in securely, there's no need to check. And yes, you could potentially do more harm than good.

These cables are rated for a total of 30 connects/disconnects, which is why I think we're seeing a disproportionate number of tech reviewers/YouTubers having problems compared to the average customer. These reviewers use the same cable and are changing out video cards practically every day. Their cables have likely been connected and disconnected hundreds if not thousands of times, far beyond what the cables are actually rated for, whereas normal consumers probably unplug them maybe 10 times over several years.
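
A rough illustration of that scale difference, as a small Python sketch; the swap counts are made-up assumptions purely for illustration, not measured numbers.

```python
# Rough illustration of the 30-cycle point above. The usage numbers are
# made-up assumptions to show the scale difference, not measurements.
RATED_CYCLES = 30  # mating-cycle rating cited for the connector

scenarios = {
    "Typical owner (a few rebuilds/cleanings per year, 4 years)": 3 * 4,
    "Reviewer swapping cards ~3x per week for 2 years":           3 * 52 * 2,
}

for who, cycles in scenarios.items():
    print(f"{who}: ~{cycles} cycles -> {cycles / RATED_CYCLES:.1f}x the rating")
```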

I will probably personally check my cable once a year or so. And I do have a spare cable on hand that I will swap out if I find any issues.

There's a very tiny percentage of these cards made. 😂

True, but this same issue would also affect 4090 users (albeit less dramatically, since the 4090 has lower power draw). 4090s have been in the wild for years and actual real-world issues have been extremely minimal.
 
Papis, I have a 4090 and I'm starting to have anxiety symptoms because of this situation. Am I screwed? Should I return it?
I've had a 4090 going strong for over a year. I followed the advice of making sure the end of the cable isn't bent close to the connector, and of getting a good PSU (I got a 1000W ATX 3.0 PSU).
 

chakadave

Member
GPUs should be getting smaller and cooler, not bigger and hotter. Tech fail 101.
You'd think. Laptops should be all we need, but no. At least Apple has some nice battery life and performance, just not a very good compatibility layer. Metal seems to be getting better though.
 