
Intel still has not fixed the crashes with 13th and 14th gen CPUs

kruis

Exposing the sinister cartel of retailers who allow companies to pay for advertising space.
I'm thinking of getting a 12400 because it's the last i5 CPU without e-cores. It also seems like it has a great performance/price ratio. And I think it won't be any more power-hungry than my current 4670.

It's not a new CPU, but compared to what I have it will be a big difference either way for a very low price. The 7800X3D is way too expensive for me.

Intel's 12th gen CPUs are unaffected by these problems. Versions with e-cores are fine.
 
Are you sure that something running in the background isn't using resources and waking up your processor cores? I remember that some monitoring software causes issues.
Positive. I monitor power consumption at the wall using my battery backup. With my old Intel rig (and identical other components minus CPU, mobo and RAM), my total system power draw at idle was around 161w. Now, after upgrading to the 7950x3D, idle power draw is at minimum 198w, but when doing even light loads like watching Twitch it shoots up to 225w. Meanwhile, the 7700k under that same scenario would only go up to 170w. This chip sucks down power like crazy at idle, and it's all because of the chiplet and system-on-chip design. It's not the cores that use so much power, it's the system agent that handles all the memory and IO transfers on the chip itself. With EXPO off, it's still higher than Intel, but the gap shrinks significantly. Problem is you sacrifice a ton of performance on AMD with RAM at JEDEC speeds. Sucks.
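For scale, here's a quick back-of-envelope on those wall readings (a rough sketch; the daily usage hours and the electricity price are assumptions, not measurements):

```python
# Rough back-of-envelope using the wall readings quoted above.
# Daily usage hours and electricity price are assumptions, not measurements.
intel_idle_w, amd_idle_w = 161, 198     # wall draw at idle: old 7700K rig vs 7950X3D rig
intel_light_w, amd_light_w = 170, 225   # wall draw while watching a Twitch stream

idle_gap_w = amd_idle_w - intel_idle_w      # 37w
light_gap_w = amd_light_w - intel_light_w   # 55w

hours_per_day = 8        # assumed light-use hours per day
price_per_kwh = 0.15     # assumed electricity price in $/kWh

extra_kwh_per_year = light_gap_w * hours_per_day * 365 / 1000
print(f"Idle gap: {idle_gap_w}w, light-load gap: {light_gap_w}w")
print(f"Extra energy: {extra_kwh_per_year:.0f} kWh/year "
      f"(~${extra_kwh_per_year * price_per_kwh:.0f}/year at ${price_per_kwh}/kWh)")
```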
 

winjer

Member
Positive. I monitor power consumption at the wall using my battery backup. With my old Intel rig (and identical other components minus CPU, mobo and RAM), my total system power draw at idle was around 161w. Now, after upgrading to the 7950x3D, idle power draw is at minimum 198w, but when doing even light loads like watching Twitch it shoots up to 225w. Meanwhile, the 7700k under that same scenario would only go up to 170w. This chip sucks down power like crazy at idle, and it's all because of the chiplet and system-on-chip design. It's not the cores that use so much power, it's the system agent that handles all the memory and IO transfers on the chip itself. With EXPO off, it's still higher than Intel, but the gap shrinks significantly. Problem is you sacrifice a ton of performance on AMD with RAM at JEDEC speeds. Sucks.

You might not have the power saving mode enabled. Or you might have some background process causing high cpu usage.

Professional reviewers show a much lower idle power usage.
[chart: idle power usage comparison]
 

There is something seriously wrong here and Intel doesn't seem to have a handle on it
 
Last edited:

kruis

Exposing the sinister cartel of retailers who allow companies to pay for advertising space.


Is there an AI service that can condense the content of a 16-minute YouTube video into a few paragraphs? This guy keeps rambling on without getting to the point. It's excruciating.
 

winjer

Member
Is there an AI service that can condense the content of a 16-minute YouTube video into a few paragraphs? This guy keeps rambling on without getting to the point. It's excruciating.

He is reiterating what Level1Techs' investigation found, but also adding the caveat that he had already encountered this issue with some 13900Ks back when they launched.
He feels vindicated, because few people believed him at the time, but now he has been proven right.
 

kruis

Exposing the sinister cartel of retailers who allow companies to pay for advertising space.
He is reiterating what Level1Techs' investigation found, but also adding the caveat that he had already encountered this issue with some 13900Ks back when they launched.
He feels vindicated, because few people believed him at the time, but now he has been proven right.

I made it to the 7 minute mark. He was going on about his video editing PC not being snappy and blamed it on changes to the input-output hub (Northbridge) on the 13900K CPU. But his performance problems are not at all similar to the crashes people are complaining about. Besides that, from what I've read here, the Northbridge change already happened with the 12900K, which is unaffected. The people on that Reddit thread think the Tech Yes City Youtuber is a complete idiot.
 
Last edited:

winjer

Member
I made it to the 7 minute mark. He was going on about his video editing PC not being snappy and blamed it on changes to the input-output hub (Northbridge) on the 13900K CPU. But his problems are not what people are experiencing. Besides that, from what I've read here, the Northbridge change already happened with the 12900K, which is unaffected. The people on that Reddit thread think the Tech Yes City Youtuber is a complete idiot.

The IO errors are probably just one symptom of the real issues inside the CPU.
 

kruis

Exposing the sinister cartel of retailers who allow companies to pay for advertising space.
The IO errors are probably just one symptom of the real issues inside the CPU.

He wasn't experiencing errors; his complaint was latency. Completely different story.



 
You might not have the power saving mode enabled. Or you might have some background process causing high cpu usage.

Professional reviewers show a much lower idle power usage.
[chart: idle power usage comparison]
Guaranteed those results are at bone-dry stock with EXPO off. Turning EXPO on boosts VCCSA from 1.05v to 1.25v and massively increases the SoC power consumption. His figures put the 7950x3D at 3w higher power draw than the i7 7700, not even the K model, which is just BS and contradicted by other reviewers. The X670 chipset he used alone would account for 14w more power draw here, and I'm using a B650, so that's a 7w saving I have over him. Add it all up: he's not using EXPO in those power tests. I'd guarantee that if someone has a 7950x3D with EXPO enabled, they cannot produce a package power draw of less than 30w on the CPU, something the 7700k beats by a factor of six (5w idle vs 30w lowest on the 7950x3D).
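A very rough sanity check of that "add it all up" reasoning (the voltage-squared scaling rule and the 15w baseline SoC power are assumptions for illustration, not measurements):

```python
# Very rough sanity check of the "add it all up" reasoning above.
# Assumes dynamic SoC power scales roughly with VCCSA squared at a fixed
# workload (a rule of thumb only) and guesses a 15w baseline SoC power.
vccsa_stock, vccsa_expo = 1.05, 1.25
soc_baseline_w = 15.0                                          # assumed, not measured
soc_expo_w = soc_baseline_w * (vccsa_expo / vccsa_stock) ** 2  # ~21w

chipset_delta_w = 14.0   # extra draw attributed to X670 in the post above

print(f"EXPO SoC penalty: ~{soc_expo_w - soc_baseline_w:.0f}w, "
      f"chipset difference: ~{chipset_delta_w:.0f}w, "
      f"combined: ~{soc_expo_w - soc_baseline_w + chipset_delta_w:.0f}w")
```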

---------------

Edit, here's a link to a thread talking about this very issue:

-----

Final edit, try measuring your system power draw while watching streams in Chrome on an Intel PC vs an AMD PC. The difference is staggering. Intel rigs will be fully idle: 800MHz core clocks, super low package voltage. The AMD system, with its aggressive auto-boost algorithms, keeps the chip pegged in the 1.1-1.4v range all the fucking time. Just watching a basic 720p30 stream in the browser causes a massive spike over the soft-idle power draw. AMD chips suck at this and it's an objective fact.
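If you want to verify the idle-clock part of that claim yourself, here's a minimal Linux sketch that samples the reported core clocks (it assumes the cpufreq sysfs interface is present; on Windows, monitoring tools like HWiNFO show the same readings):

```python
# Minimal Linux sketch: sample the reported core clocks a few times at idle.
# Assumes the cpufreq sysfs interface is available (it is on most desktop kernels).
import glob
import time

paths = sorted(glob.glob("/sys/devices/system/cpu/cpu*/cpufreq/scaling_cur_freq"))

for _ in range(5):
    freqs_mhz = [int(open(p).read()) / 1000 for p in paths]  # values are reported in kHz
    print(f"core clocks: min {min(freqs_mhz):.0f} MHz, max {max(freqs_mhz):.0f} MHz")
    time.sleep(1)
```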
 
Last edited:

winjer

Member
Guaranteed those results are at bone-dry stock with EXPO off. Turning EXPO on boosts VCCSA from 1.05v to 1.25v and massively increases the SoC power consumption. His figures put the 7950x3D at 3w higher power draw than the i7 7700, not even the K model, which is just BS and contradicted by other reviewers. The X670 chipset he used alone would account for 14w more power draw here, and I'm using a B650, so that's a 7w saving I have over him. Add it all up: he's not using EXPO in those power tests. I'd guarantee that if someone has a 7950x3D with EXPO enabled, they cannot produce a package power draw of less than 30w on the CPU, something the 7700k beats by a factor of six (5w idle vs 30w lowest on the 7950x3D).

---------------

Edit, here's a link to a thread talking about this very issue:

-----

Final edit, try measuring your system power draw while watching streams in Chrome on an Intel PC vs an AMD PC. The difference is staggering. Intel rigs will be fully idle: 800MHz core clocks, super low package voltage. The AMD system, with its aggressive auto-boost algorithms, keeps the chip pegged in the 1.1-1.4v range all the fucking time. Just watching a basic 720p30 stream in the browser causes a massive spike over the soft-idle power draw. AMD chips suck at this and it's an objective fact.


Nope. Guru3d is using DDR5-6000 memory. You can see the review here.
And as you can see, the idle power usage of Zen 4 and 13th gen is not that dissimilar.
Seems to me these people either have issues with their PCs, or are measuring power usage in different ways. Probably because they look at PPT for AMD, but only at core power for Intel.
But the reality is that Intel's PPT never idles at 7W. It's closer to 20-30W.
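If you want an apples-to-apples number, one option on Linux is to read the package energy counter that recent kernels expose through the powercap (RAPL) interface for both Intel and AMD Zen parts. A minimal sketch, assuming that interface exists at the usual path and is readable (you may need root):

```python
# Minimal sketch: average package power over a short window via Linux powercap/RAPL.
# Assumes /sys/class/powercap/intel-rapl:0 exists (recent kernels expose this for
# both Intel and AMD Zen packages) and is readable; run as root if needed.
import time

ENERGY = "/sys/class/powercap/intel-rapl:0/energy_uj"
MAX_RANGE = "/sys/class/powercap/intel-rapl:0/max_energy_range_uj"

def read_uj(path):
    with open(path) as f:
        return int(f.read())

start, t0 = read_uj(ENERGY), time.time()
time.sleep(10)
end, t1 = read_uj(ENERGY), time.time()

delta_uj = end - start
if delta_uj < 0:                    # energy counter wrapped around
    delta_uj += read_uj(MAX_RANGE)

print(f"Average package power: {delta_uj / 1e6 / (t1 - t0):.1f} W")
```

Reading the same counter on both platforms at least removes the PPT-vs-core-power mismatch from the comparison.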


And PCworld reports similar results to Guru3d. Look at their Blender power test. Before and after the test, these CPUs all go to idle, and the system power usage is very similar between Intel and AMD.

 

PnCIa

Member
Both CPU makers suck in their own ways. My 7950x3D idles at 40-50w with EXPO enabled. Since my PC is used for streaming and browsing a lot, this was a nasty surprise coming from Intel. I also burned up two chips from the EXPO SoC bug last year. It hasn't been sunshine and rainbows over here. At the same time, I'm disgusted with what Intel is doing, so they're no better an option today. The choices are not as easy to make as they were a decade ago.
The choice is actually easy. One company sells clearly broken chips atm, the other one does not. This thread is about Intel's current business practices, not AMD's idle power consumption - referencing your other posts in this thread.
 

cebri.one

Member
Looks more and more like a very significant issue overlooked during quality testing.

Either Intel is still figuring out the problem, or they know what the problem is and it cannot be solved via a microcode or a chipset driver update. If it's hardware based, they are totally fucked: they will be forced to do a major recall for all i9 RPL and RPL-R CPUs. Not only a huge financial impact but also a reputational one: Intel has always been the brand to go to when you wanted stability, and that was a major selling point for them.
 
Last edited:

cebri.one

Member
Intel and Boeing sharing the same QA department.
Desperation is a common cause. Boeing was caught on the wrong foot when Airbus unveiled the neo, and Intel realized they were in deep trouble when AMD caught up in IPC with Zen 4.

Sapphire Rapids also had a lot of quality issues and was delayed almost 9 months.

The hope for Intel is that Gelsinger is a much better CEO than anything Boeing can promote right now; he has been, for the most part, very transparent about the degrading culture within Intel, has fired a lot of incompetent people, and has hired real engineering talent. Imo Intel is about to turn a corner, but this is a PR mess he needs to solve.
 
Last edited:
Nope. Guru3d is using DDR5-6000 memory. You can see the review here.
And as you can see, the idle power usage of Zen 4 and 13th gen is not that dissimilar.
Seems to me these people either have issues with their PCs, or are measuring power usage in different ways. Probably because they look at PPT for AMD, but only at core power for Intel.
But the reality is that Intel's PPT never idles at 7W. It's closer to 20-30W.


And PCworld reports similar results to Guru3d. Look at their Blender power test. Before and after the test, these CPUs all go to idle, and the system power usage is very similar between Intel and AMD.


My guy, you have an entire thread of people proving the problem is the SoC voltage that is required to run EXPO at high speeds and capacities, and that the chip (and power-hungry chipsets) are largely to blame. We measure the power at the wall, which is the end-all, be-all final tally. I'm telling you right now: taking the exact same PC components and just swapping out the CPU, motherboard and RAM, EVEN AT BONE-DRY STOCK, the AMD rig is pulling 20-50w more than the Intel one.

You also didn't touch on what I said about light loads like watching streams in a web browser: the Intel chip handles this at super low clock speeds and voltages, while the AMD chip wants to boost full tilt, pushing the gap up to a staggering 50w under like-for-like conditions. Even IF the AMD system could sort of semi-match Intel at full idle, who the fuck boots their PC to stare at the desktop? That's pointless. It's using the PC in a low-power state where the gap is most measurable, and your faulty guru3d charts don't prove a damn thing in this arena. Again, I challenge you (assuming you're also an AMD system owner) to prove right now, with Twitch open playing a stream, that your total system power isn't over 80w. Guaranteed you can't, because it's impossible to operate the chip at such low power draw while doing an extremely simple task that my smartphone can do without boosting to insane power draw levels.
 
Last edited:

kruis

Exposing the sinister cartel of retailers who allow companies to pay for advertising space.
Looks more and more like a very significant issue overlooked during quality testing.

Either Intel is still figuring out the problem, or they know what the problem is and it cannot be solved via a microcode or a chipset driver update. If it's hardware based, they are totally fucked: they will be forced to do a major recall for all i9 RPL and RPL-R CPUs. Not only a huge financial impact but also a reputational one: Intel has always been the brand to go to when you wanted stability, and that was a major selling point for them.

I agree. I think Intel are in the same situation MS was in when enormous numbers of Xbox 360s were dying because of the Red Ring of Death. When 50% of all 13900Ks/14900Ks are experiencing these failures in server environments, then it's not a question of if but when regular people experience the same failures. The right thing to do is to replace each and every one of them; it's going to cost a fortune, but it would save Intel's reputation as a CPU producer. This debacle could kill Intel's hope of becoming a leading chip manufacturer for third parties, though.

Ouch.
 

cebri.one

Member
I agree. I think Intel are in the same situation MS was in when enormous numbers of Xbox 360s were dying because of the Red Ring of Death. When 50% of all 13900Ks/14900Ks are experiencing these failures in server environments, then it's not a question of if but when regular people experience the same failures. The right thing to do is to replace each and every one of them; it's going to cost a fortune, but it would save Intel's reputation as a CPU producer. This debacle could kill Intel's hope of becoming a leading chip manufacturer for third parties, though.

Ouch.
IFS (chip making) is independent of this. This is either a design issue or a configuration flaw. Intel's chip manufacturing customers bring their own designs and do their own quality testing.
 
Last edited:

winjer

Member
My guy, you have an entire thread of people proving the problem is the SoC voltage that is required to run EXPO at high speeds and capacities, and that the chip (and power-hungry chipsets) are largely to blame. We measure the power at the wall, which is the end-all, be-all final tally. I'm telling you right now: taking the exact same PC components and just swapping out the CPU, motherboard and RAM, EVEN AT BONE-DRY STOCK, the AMD rig is pulling 20-50w more than the Intel one.

You also didn't touch on what I said about light loads like watching streams in a web browser: the Intel chip handles this at super low clock speeds and voltages, while the AMD chip wants to boost full tilt, pushing the gap up to a staggering 50w under like-for-like conditions. Even IF the AMD system could sort of semi-match Intel at full idle, who the fuck boots their PC to stare at the desktop? That's pointless. It's using the PC in a low-power state where the gap is most measurable, and your faulty guru3d charts don't prove a damn thing in this arena. Again, I challenge you (assuming you're also an AMD system owner) to prove right now, with Twitch open playing a stream, that your total system power isn't over 80w. Guaranteed you can't, because it's impossible to operate the chip at such low power draw while doing an extremely simple task that my smartphone can do without boosting to insane power draw levels.

I have a 5800x3d with the IF at 1900.
At idle, I have a PPT of 23w. Watching video raises it to 38w.
From what I see from people with a 13900K, their idle PPT is around 20w.

But enough with the off-topic.
Create a thread if you think it's worth it.
 

cebri.one

Member
Warframe devs do say the BIOS update helped.

After updating his BIOS to the latest he hasn’t crashed in nvgpucomp64.dll since and we’re optimistic that the weird crashes that only he was getting won’t be back either. We’re not positive that it was the issue described by the report linked above but we’re happy that updating the BIOS helped.
 
Intel really needs to be transparent about these degradation issues. Hardware Unboxed's video regarding the mobo manufacturers' frustrations with Intel was quite a while ago. So far, it looks like Intel is hoping that this will just blow over and people will forget when Arrow Lake launches. However, that sort of strategy does little to assuage concerns over the CPUs' reliability.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
I would be fucking furious if I spent $1000 on a CPU just to see it degrade…fuck Intel

A 14900K is like 600 dollars......nothing to sneeze at but a far cry from 1000 dollars.



Who the fuck is buying 600 dollar CPUs?
[chart: Cyberpunk 2077 CPU benchmark, 2560x1440]
 

winjer

Member

The developers of Warframe compiled data from various Intel CPUs showing the "nvgpucomp64.dll" error code, an error state of NVIDIA's drivers that likely occurs when the processor is in a "power hungry" state and being pushed hard. The data showed that Intel's Core i9-13900K accounted for 29.5% of game crashes with that error code, while the Core i9-14900K and Core i9-14900KF accounted for up to 34% combined. The rest of the share was spread across multiple models, such as the Core i7-14700K, but the common trend was that 13th Gen and 14th Gen CPUs were vulnerable to the problem.

[chart: share of nvgpucomp64.dll crashes by CPU model]
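Figures like those come from grouping crash reports by CPU model and computing each model's share. A minimal sketch of that kind of tally, using made-up sample records (only the method is the point here):

```python
# Hypothetical illustration of how crash-share figures like the ones above are
# derived: group crash reports by CPU model and print each model's share.
# The sample data is made up; a real dataset would have thousands of entries.
from collections import Counter

crash_reports = [
    {"cpu": "i9-13900K", "module": "nvgpucomp64.dll"},
    {"cpu": "i9-14900K", "module": "nvgpucomp64.dll"},
    {"cpu": "i7-14700K", "module": "nvgpucomp64.dll"},
    {"cpu": "i9-13900K", "module": "nvgpucomp64.dll"},
    # ... many more reports in the real dataset
]

counts = Counter(r["cpu"] for r in crash_reports if r["module"] == "nvgpucomp64.dll")
total = sum(counts.values())
for cpu, n in counts.most_common():
    print(f"{cpu}: {n / total:.1%} of nvgpucomp64.dll crashes")
```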
 
They need to either fix TB 3.0 or drop it. Maybe take a page from SSDs (wear leveling) and shift the load evenly between the cores?
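As a toy illustration of that wear-leveling idea (purely hypothetical, not something Intel or the OS scheduler actually does; assumes a Linux box where sched_setaffinity is available):

```python
# Toy illustration of the "wear leveling" idea: periodically rotate which cores
# the current process may run on, so no single core carries the sustained load.
# Purely illustrative; this is not how boost algorithms or schedulers work today.
import os
import time

all_cores = sorted(os.sched_getaffinity(0))  # cores this process may currently use
group_size = 2                               # run on two cores at a time

for step in range(8):                        # a few rotations for demonstration
    start = (step * group_size) % len(all_cores)
    group = {all_cores[(start + j) % len(all_cores)] for j in range(group_size)}
    os.sched_setaffinity(0, group)           # pin this process to the current group
    print(f"now allowed on cores {sorted(group)}")
    time.sleep(5)
```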
 

JRW

Member
13600K / 32GB DDR4 3600 / MSI Z690 Pro-A hasn't been powered down since October 2022 (other than BIOS updates / Windows update reboots). Zero issues to report.

I'm curious to see what Steve reveals about the root issue after getting enough failing CPU samples from viewers.
 
Last edited:

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
We are not all American hahaha

Well, when you use the $ we assume you mean US Dollars......which is why when people say games now cost 70 dollars we don't question whether they mean Surinamese Dollars, cuz that would be an absolute bargain.
The US Dollar is an easy metric for all other currencies to jump off of.
I haven't needed to use US Dollars since like 2009, but I can still translate USD to whatever currency I'm currently using easily, because everyone "should" know the exchange rate to their preferred currency.
 
Last edited:
You also didn't touch on what I said about light loads like watching streams in a web browser: the Intel chip handles this at super low clock speeds and voltages, while the AMD chip wants to boost full tilt, pushing the gap up to a staggering 50w under like-for-like conditions. Even IF the AMD system could sort of semi-match Intel at full idle, who the fuck boots their PC to stare at the desktop? That's pointless. It's using the PC in a low-power state where the gap is most measurable, and your faulty guru3d charts don't prove a damn thing in this arena. Again, I challenge you (assuming you're also an AMD system owner) to prove right now, with Twitch open playing a stream, that your total system power isn't over 80w. Guaranteed you can't, because it's impossible to operate the chip at such low power draw while doing an extremely simple task that my smartphone can do without boosting to insane power draw levels.
According to this data, modern Intel CPUs are even worse at handling light loads than AMD CPUs, so what exactly are you trying to prove?


[chart: CPU power consumption in applications]

[chart: CPU power consumption in applications vs the 13900K]
 

Celcius

°Temp. member
Do you guys think Meteor Lake this fall will have similar issues down the road, or is it just this platform?
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Do you guys think Meteor Lake this fall will have similar issues down the road, or is it just this platform?

MeteorLake already came out, named Core Ultra 3/5/7 matching Core i3/i5/i7, and it doesn't seem to exhibit any of these issues.



I take it you are talking about ArrowLake?


MeteorLake is the 1st generation Core Ultra; it only came out on mobile. The desktop SKU was scrapped, but its leftovers could still be used to make low-end ArrowLake chips because of Intel's tile-based designs.
ArrowLake/LunarLake are the 2nd generation Core Ultra.
 