Sounds like an issue people with CPU heatsinks will have to deal with; people with CPU AIO liquid cooling might not be so affected.
So 30% more performance using 30% more power and a 25% price increase. So it's a 4090 Ti.
Yup. I was so down to buy one, and now I'm so out of it. Just gonna find a used 4090 again and be done with it till the 6000 series. I'm not liking the 5080 at that price with 16 GB of VRAM; that's the minimum these days for 4K. So to me it's either a 4090 or a 5090, and when I can get a used 4090 for almost half the price of a 5090 here in Canada... 2k CAD vs. at least 3.5k CAD... yeah, I'll stick with a 4090. Not worth it.
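Putting rough numbers on that trade-off (using the figures quoted above as assumptions: a ~2,000 CAD used 4090 vs. a ~3,500 CAD 5090 that's ~30% faster; not benchmark data), a quick sketch of performance per dollar looks like this:

```python
# Rough perf-per-dollar comparison; prices and the ~30% uplift are assumptions
# taken from the post above, not measured data.
price_cad = {"used 4090": 2000, "5090": 3500}
rel_perf  = {"used 4090": 1.00, "5090": 1.30}

for card in price_cad:
    value = rel_perf[card] / price_cad[card] * 1000
    print(f"{card}: {value:.2f} perf units per 1,000 CAD")
# used 4090 ~0.50 vs 5090 ~0.37, i.e. roughly 35% more performance per dollar
# from the used card under these assumptions.
```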
5090 day 1. This AI stuff is going to pan out better than the curmudgeonly old-school PC folk think it will.
Soo, does this affect nvidia chips?
Around 20,000 TSMC Wafers Reported Damaged by Earthquake
Earlier this week, Taiwan experienced a magnitude 6.4 earthquake; the seismic event interrupted manufacturing activities at several TSMC chip-making facilities. As a precaution, foundry employees in both Central and Southern Taiwan were evacuated. Production resumed fairly quickly following... (www.techpowerup.com)
I'm pretty set for mine next Thursday. Got a good amount for my 4090.
I'm going to probably net pay 500-700 after selling mine and getting one of the 2200 models.
It's not for everyone, but it is for me.
People aren't wrong when it comes to wanting more perf considering the bump.
But this is the 0.01% card. We who can get one, enjoy it.
That's cute. You actually think you're gonna be able to get one.
Good luck. Used 4090s are not cheap, at least not in my country; they are going for 1,500-2,000 Euro and you have no warranty.
Damn. It's wild thinking that GPU developers can't extract raw performance anymore. Like they hit a wall.
So they'll have to become more innovative or efficient in the generations to come, at least until GPU cards can draw more than 600 W.
What puts me off the most is the high power consumption. Electricity isn't cheap around here, and seeing that the increase in power is roughly linear with the rise in computational power over the 4090, there was basically no evident gain in efficiency. So not that compelling of a product.
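For what it's worth, a quick perf-per-watt sketch backs that up (assuming ~30% more performance at the official 450 W vs. 575 W board powers; rough figures, not benchmarks):

```python
# Back-of-the-envelope efficiency check: relative performance divided by board power.
# The ~1.30x performance figure is an assumption based on the discussion above.
cards = {
    "RTX 4090": {"rel_perf": 1.00, "power_w": 450},
    "RTX 5090": {"rel_perf": 1.30, "power_w": 575},
}

baseline = cards["RTX 4090"]["rel_perf"] / cards["RTX 4090"]["power_w"]
for name, c in cards.items():
    eff = c["rel_perf"] / c["power_w"]
    print(f"{name}: {eff / baseline:.2f}x perf/W vs 4090")
# -> 1.00x vs ~1.02x: essentially flat efficiency generation over generation.
```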
So it appears the new MFG doesn't actually add any latency? Nor does it actually decrease base performance?
From what I understand (someone correct me if I'm wrong), if you're getting 60 FPS normally, it will display 200+ FPS with it on. While it will look like 200 FPS on screen, it will still feel like you're playing at 60 in terms of input lag.
Apparently, yes. But otherwise it's very similar to the "old" frame gen, so you can expect visible artifacting unless your game is already running at a high frame rate.
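A simplified way to see why it can look like 200+ FPS but still feel like 60 (assuming a 60 FPS base frame rate and 4x multi frame generation, and ignoring Reflex and frame-pacing overhead):

```python
# Simplified model: frame generation multiplies displayed frames, but input is
# only sampled on the rendered (base) frames. Numbers below are assumptions.
base_fps = 60        # rendered frame rate
mfg_factor = 4       # 4x MFG: 1 rendered + 3 generated frames

displayed_fps = base_fps * mfg_factor
render_interval_ms = 1000 / base_fps        # how often input actually affects a frame
display_interval_ms = 1000 / displayed_fps

print(f"Displayed: {displayed_fps} FPS ({display_interval_ms:.1f} ms between shown frames)")
print(f"Input still sampled every {render_interval_ms:.1f} ms, so it feels like {base_fps} FPS")
```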
It's really not the GPU makers but the foundries that have "hit a wall", unfortunately.
So, besides the fact that it seems you need liquid cooling to save your CPU, what lifespan can you expect from a card of this caliber? If the computer is sized appropriately, that is.
Do you have a source for this blurb? A couple of things stand out that make me question it.
Of course these leechers would show up...
It's always amusing to see people selling what they don't possess.
No. Plenty of blame falls on developers. They suck ass at optimization and are over-relying on DLSS and frame gen.
It's only going to get worse.
Uhm, we're talking about hardware performance here, so...
Most of my dev friends are pissed at the current state of the market because what they're adding is basically crutches.
So one issue I'm not seeing talked about enough: while the 5090 has an amazing cooler, where is all that extra heat from the wattage being dumped?
It's being dumped right back into the case, leading to stuff like this even in good airflow cases.
This is why they really need to focus on getting that wattage back down in future gens.
I think this will be a headline issue in the coming months.
That depends. Think about where AIO CPU coolers normally sit and the rotation of the fans: they are normally on the top of the case (or, in rare instances, the back) and are exhaust fans.
Meaning they take air from inside the case and then blow it out; that's why you can see the fans at the bottom of the heatsink. For the overwhelming majority of designs you are still expected to take air from inside the case, which is exactly where the 5090 is dumping its hot air, so you get the same issue with 90% of the AIOs on the market. Some can be fitted so they're set as an intake at the front, which could potentially avoid the issue, but then the 5090 will be starting with much hotter air coming into the case, and you potentially have a much worse issue given how hot the memory on the 5090 already runs.
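To get a feel for the scale of the problem, here's a rough sketch of how much ~575 W of GPU heat warms the air inside a case (the wattage is the card's rated board power; the 100 CFM exhaust figure is just an assumption, and real airflow mixes far less cleanly than this):

```python
# Rough steady-state estimate of case air temperature rise from GPU heat.
# Assumes all 575 W ends up in the case air and a fixed exhaust airflow.
gpu_heat_w = 575            # board power dumped into the case
case_airflow_cfm = 100      # assumed total case exhaust airflow

air_density = 1.2           # kg/m^3 at ~20 C
air_cp = 1005               # J/(kg*K), specific heat of air
mass_flow = case_airflow_cfm * 0.000472 * air_density   # kg/s of air moved

delta_t = gpu_heat_w / (mass_flow * air_cp)
print(f"Case air leaves roughly {delta_t:.0f} C warmer than it came in")
# ~10 C at 100 CFM; a cooler fed from inside the case starts from that warmer air.
```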
This has been an issue for a while, but yeah, it's going to be much worse with these 575 W cards. The heat blows upward, where it gets sucked right into the CPU tower cooler, or into the AIO radiator if you have it mounted on top.
IMO the ideal setup is to have the AIO side mounted as intake. Or if you mount it on top, at least make sure you have lots of cool air coming in through the front to displace as much of the GPU exhaust as possible before it gets sucked into the radiator.
Looks like Moore's law is in full effect. It's affecting everyone from Sony to AMD and Nvidia.
Is that what we call greed now?
At the rate this is going, I'm kind of surprised that case manufacturers haven't tried putting in a divider between the CPU and GPU sections and then setting up sets of fans to blow air between the two sections.
Only issue with that is, outside of a shared custom loop (which I'm expecting to see some videos of in the coming months), the 5090 runs its memory hot compared to the 4090. So while the cooler is very efficient, it seems to have issues with that. This could potentially mean that raising the intake temperature, by having the CPU cooler first before hitting the GPU, could push that into critical temps and throttling.
But I will leave that up to the experts to test; I just don't really like these temps.
I might just get a 5080, because while it's not much of an upgrade over a 4080 (I'm actually excited about the frame generation and how Reflex 2 interacts with it), it is an upgrade over my 10GB RTX 3080.
Should be interesting to see. But in my experience, unless you're running a very power-hungry CPU, the temperature differential between air going into the radiator vs. coming out of it is only a few degrees. And ideally you'll have intake fans on the bottom of your case blowing cool air directly into the GPU, so barely any radiator exhaust will get sucked into it anyway.
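For what it's worth, the "only a few degrees" point holds up under the same kind of rough math (assuming ~150 W of CPU heat and ~100 CFM through the radiator; both figures are assumptions for illustration):

```python
# Rough air temperature rise across a CPU radiator.
cpu_heat_w = 150              # assumed CPU heat load
radiator_airflow_cfm = 100    # assumed fan airflow through the radiator

air_density = 1.2             # kg/m^3
air_cp = 1005                 # J/(kg*K)
mass_flow = radiator_airflow_cfm * 0.000472 * air_density

delta_t = cpu_heat_w / (mass_flow * air_cp)
print(f"Air exits the radiator roughly {delta_t:.1f} C warmer")   # about 2-3 C
```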
I've always wondered the same thing! Like with the Torrent, imagine having a divider where one of those 180mm front fans blows straight into the CPU cooler and the other feeds the GPU (although this would be a little tricky now because newer GPUs have that flow-through design).