
Intel's stock takes a big, runny dump

Dream-Knife

Banned
To keep their CPUs competitive, Intel had to push clocks higher and higher. And the result is that power efficiency got thrown out the window.

power-multithread.png
I don't really get why people care so much about efficiency in a desktop. It's plugged in.

Those high clocks are great for gaming.
 

Chiggs

Gold Member
Pretty interesting (and also ironic that Apple is not following this advice IMO).

Intel is in deep shit, but I think this industry works in cycles. Hunger will motivate them to innovate and continue. They are too big to fail, and the American government is not gonna allow a company like Intel to just go broke or get sold. I dare say they are even a matter of strategic importance.

This is definitely true, but once their Ohio fab is built, they're going to have to:

A) build trust with their competitors
B) offer heavy discounts initially

It's doable for sure. I just get the sinking feeling we've seen the last of Intel's truly pioneering chip designs, and shifting into fabrication gives them an easy out.

Basically, the GM of chipmakers. :(

I don't really get why people care so much about efficiency in a desktop. It's plugged in.

Those high clocks are great for gaming.

Lower energy bills, lower temperatures? And then you have the fact that GPUs are going to start using ridiculous amounts of power, meaning that you'll quite possibly want a CPU that doesn't require a brand-new PSU to work with an RTX 4000 series card.
 
Last edited:

Dream-Knife

Banned
This is definitely true, but once their Ohio fab is built, they're going to have to:

A) build trust with their competitors
B) offer heavy discounts initially

It's doable for sure. I just get the sinking feeling we've seen the last of Intel's truly pioneering chip designs, and shifting into fabrication gives them an easy out.

Basically, the GM of chipmakers. :(



Lower energy bills, lower temperatures? And then you have the fact that GPUs are going to start using ridiculous amounts of power, meaning that you'll quite possibly want a CPU that doesn't require a brand-new PSU to work with an RTX 4000 series card.
PSUs are relatively inexpensive. If you are buying a new GPU, CPU, and motherboard, you might as well just build a new system.
 

Chiggs

Gold Member
PSUs are relatively inexpensive. If you are buying a new GPU, CPU, and motherboard, you might as well just build a new system.

Every dollar counts in a world where inflation runs rampant. And then you never know when the supply chain magically stops "supplying" because of hand-waving infectious diseases and military tension.

Also, let me just say I particularly despise the logic you just touted. I make a fuck-ton of money every year and I still want to breathe fire when I see people like Caleb on Digital Trends say "Oh, who cares if the Sony A95K doesn't have 4 HDMI 2.1 ports, you should just buy this $1500.00 Denon receiver to handle your HDMI switching. After all, you're buying a $4000 TV."
 
Last edited:
Every dollar counts in a world where inflation runs rampant. And then you never know when the supply chain magically stops "supplying" because of hand-waving infectious diseases and military tension.

Also, let me just say I particularly despise the logic you just touted. I make a fuck-ton of money every year and I still want to breathe fire when I see people like Caleb on Digital Trends say "Oh, who cares if the Sony A95K doesn't have 4 HDMI 2.1 ports, you should just buy this $1500.00 Denon receiver to handle your HDMI switching. After all, you're buying a $4000 TV."
Are you talking about power consumption, and having to buy a new PSU? That's really not a big deal, and it's mostly the GPU manufacturers' fault anyway.

You do realize AMD CPU power consumption will go up this gen as well? This is not an Intel-only thing.

Edit: the max TDP is 170 watts for Ryzen 7000, vs 105 for Zen 3.

Gamers are really going to have to break from the AMD bias, just like they used to be biased in Intel's favor.
 
Last edited:

TransTrender

Gold Member
I don't really get why people care so much about efficiency in a desktop. It's plugged in.

Those high clocks are great for gaming.
OEMs and data centers hate that, because it means BOM cost for cooling and power delivery, as well as electricity for running and cooling these machines at hyperscalers.
 

AJUMP23

Parody of actual AJUMP23
Their stock is down 8%, just like everyone else's. It will go down more, just like everyone's but oil.
 

JohnnyFootball

GerAlt-Right. Ciriously.
I don't really get why people care so much about efficiency in a desktop. It's plugged in.

Those high clocks are great for gaming.
Are you serious? Power draw matters. More power means more heat being generated, which is much more difficult to cool. This affects energy costs, which, if you haven't noticed, are going up. As someone who already lives in a warm apartment that is very difficult to cool in the 100-degree summer, that is a problem.

Although it can be helpful in the winter.
 

JohnnyFootball

GerAlt-Right. Ciriously.
PSUs are relatively inexpensive. If you are buying a new GPU, CPU, and motherboard, you might as well just build a new system.
1000-watt power supplies are in excess of $100, and a PSU is not something you cheap out on. I buy PSUs to last several generations, and most people do too. PSUs also increase in price drastically once they exceed 1000W.
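The PSU-sizing argument in this exchange can be put into rough numbers. A minimal Python sketch, where every wattage and the 50% transient-headroom figure are illustrative assumptions rather than measured values:

```python
# Rough PSU sizing: peak component draw plus headroom for transient
# spikes. All numbers are illustrative assumptions, not measurements.

def recommended_psu_watts(cpu_w: float, gpu_w: float,
                          other_w: float = 75.0,
                          headroom: float = 0.5) -> float:
    """Estimated PSU rating: peak system draw plus transient headroom."""
    peak_draw = cpu_w + gpu_w + other_w
    return peak_draw * (1.0 + headroom)

# A hypothetical ~250 W CPU paired with a ~450 W GPU:
print(recommended_psu_watts(250, 450))  # 1162.5 -> shop for a ~1200 W unit
```

Under these assumptions a high-clocked CPU next to a power-hungry GPU does push the recommendation past the common 1000W tier, which is where prices climb.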
 

JohnnyFootball

GerAlt-Right. Ciriously.
Every dollar counts in a world where inflation runs rampant. And then you never know when the supply chain magically stops "supplying" because of hand-waving infectious diseases and military tension.

Also, let me just say I particularly despise the logic you just touted. I make a fuck-ton of money every year and I still want to breathe fire when I see people like Caleb on Digital Trends say "Oh, who cares if the Sony A95K doesn't have 4 HDMI 2.1 ports, you should just buy this $1500.00 Denon receiver to handle your HDMI switching. After all, you're buying a $4000 TV."
No shit. I don't invest in new PSUs the same way I upgrade to a new CPU or GPU. Whenever I buy a PSU, I want it to last for several generations. I love how he says "they're relatively inexpensive" when 1200W PSUs cost over $200 and often close to $400. Relatively inexpensive, right?!
 
Last edited:
1000-watt power supplies are in excess of $100, and a PSU is not something you cheap out on. I buy PSUs to last several generations, and most people do too. PSUs also increase in price drastically once they exceed 1000W.
I tend to avoid parts that require that much power... but yeah don't cheap out on the PSU, a bad one will kill everything else in your system.
 

LordCBH

Member
Well apple is dropping them completely soon.

They're really close to being fully weaned off of Intel. The new M2 Air has a custom-made Thunderbolt controller as opposed to an Intel one. The biggest outstanding product left to convert is the Mac Pro, and I'm hoping we see something about it later this year.
 

kikkis

Member
They're really close to being fully weaned off of Intel. The new M2 Air has a custom-made Thunderbolt controller as opposed to an Intel one. The biggest outstanding product left to convert is the Mac Pro, and I'm hoping we see something about it later this year.
Isn't the Mac Studio kind of like a Mac Pro?
 

LordCBH

Member
Isn't the Mac Studio kind of like a Mac Pro?

I wouldn't really say so, to be honest. Though for everyone who's not a business or isn't filthy stinking rich, it might as well be. There's still a lot of headroom for them to push toward a true Mac Pro replacement. I know a lot of people don't want to hear it, but the Studio is more of a replacement for the discontinued 27" iMac IMO. (Even though it isn't an AIO like the iMac)
 

BabyYoda

Banned
Intel CPUs are nice, but I see no reason to buy one over AMD. Their GPU segment completely missed the boat. Now we have a glut of Nvidia and AMD GPUs in the marketplace, who the hell will want an Intel GPU? Are they going to sell them at a loss?
The i5 11400 has no competition on the AMD side; I got it for £113. Yes, Intel are the best for budget CPUs now, who woulda thunk it!
 

jigglet

Banned
So take heart, Intel-faithful...

I mean, that statement assumes you're an AMD fanboy? I'm switching to AMD for my next rig. It's just better. But I will be happy to switch to whatever is best at the time. I get being fanboys of sports clubs, developers, consoles, but CPUs? I don't get it.
 

winjer

Member
I don't really get why people care so much about efficiency in a desktop. It's plugged in.

Those high clocks are great for gaming.

Energy cost is going up in most countries. Inflation is sky high.
It heats up all other components in the case, causing thermal throttling.
And it heats up the house, which is bad, unless you live in Siberia.
 

Dream-Knife

Banned
Are you serious? Power draw matters. More power means more heat being generated, which is much more difficult to cool. This affects energy costs, which, if you haven't noticed, are going up. As someone who already lives in a warm apartment that is very difficult to cool in the 100-degree summer, that is a problem.

Although it can be helpful in the winter.
It's called air conditioning. The energy cost differences are minimal.
1000-watt power supplies are in excess of $100, and a PSU is not something you cheap out on. I buy PSUs to last several generations, and most people do too. PSUs also increase in price drastically once they exceed 1000W.
A good 1000W PSU is over $200.

If you're upgrading all the time then you have to factor that into your expenses, or don't buy such high-end stuff. Or just build an entirely new PC, since you're changing everything anyway.
Energy cost is going up in most countries. Inflation is sky high.
It heats up all other components in the case, causing thermal throttling.
And it heats up the house, which is bad, unless you live in Siberia.
If inflation is so bad, then they shouldn't be upgrading their CPU or GPU if they can't afford a PSU. Things cost money.
 

Elog

Member
Not sure why there is so much discussion about their desktop chips in this thread. Intel's problem is that the chiplet design template is generally superior for multi-core business applications. That has been Intel's profit machine in the past, and since it is a slow-moving market it has taken time for AMD to eat real market share. We are now there.
 
I mean, that statement assumes you're an AMD fanboy? I'm switching to AMD for my next rig. It's just better. But I will be happy to switch to whatever is best at the time. I get being fanboys of sports clubs, developers, consoles, but CPUs? I don't get it.
What do you mean, it's just better? Which two CPUs are you comparing?
 
Not sure why there is so much discussion about their desktop chips in this thread. Intel's problem is that the chiplet design template is generally superior for multi-core business applications. That has been Intel's profit machine in the past, and since it is a slow-moving market it has taken time for AMD to eat real market share. We are now there.
Sapphire Rapids is chiplet-based. They're just getting killed by delays, which I honestly don't understand; I thought they had worked out their 10nm production woes.

I guess it's because this is their first design like this. They really, really need to move past these production issues when they move to the Intel 4 node.
 
Last edited:
The mid range. Yeah I see what you mean and at other levels Intel could come out ahead, but at the mid range AMD reigns.
But that's wrong. And unspecific.

The i5 12th gen outperforms the 5600/X, the i7 12th gen has much better multicore and single-core performance than the R7, and the i3 12100 on the low end is far better than AMD's current offerings.

The only win AMD currently has is the 5800X3D specifically, being as good as a 12900K with DDR5, for cheaper, in gaming. But it is also more expensive than the i7, so definitely not mid-range.
 
Last edited:

jigglet

Banned
But that's wrong. And unspecific.

The i5 12th gen outperforms the 5600/X, the i7 12th gen has much better multicore and single-core performance than the R7, and the i3 12100 on the low end is far better than AMD's current offerings.

The only win AMD currently has is the 5800X3D specifically, being as good as a 12900K with DDR5, for cheaper, in gaming. But it is also more expensive than the i7, so definitely not mid-range.

Well fuck, like I give a shit. I'm basing things off what I knew 12-18 months ago. In 2-3 months when I buy my new PC, if Intel is the best then sure I'll go Intel. I buy what's best, fuck fanboyism.
 
Last edited:

Elog

Member
Sapphire rapids is chiplet. They're just getting killed with delays, which I honestly do not understand why, I had thought they worked out their 10nm production woes.

I guess because it is their first design like this. They really really need to move past these production issues when they move to Intel 4 node.
They are absolutely working hard to catch up. My point is that AMD has had superior products in that segment for 2 or more years, but due to the segment's slow-moving nature it has taken time for this to show up in the financials (though that is happening now).
 
Well fuck, like I give a shit. I'm basing things off what I knew 12-18 months ago. In 2-3 months when I buy my new PC, if Intel is the best then sure I'll go Intel. I buy what's best, fuck fanboyism.
Yeah, AMD were all-around better in performance vs. 10th and 11th gen Intel. It's different now.

As for your 2-3 months: Intel 13th gen vs AMD 7000, we will see, but Intel will generally win by a lot in multicore performance with its i5/i7. The i5 13600K will have 14 cores vs the 7600X's 6.

Pure gaming performance, we will have to see.
 

Elog

Member
But that's wrong. And unspecific.

The i5 12th gen outperforms the 5600/X, the i7 12th gen has much better multicore and single-core performance than the R7, and the i3 12100 on the low end is far better than AMD's current offerings.

The only win AMD currently has is the 5800X3D specifically, being as good as a 12900K with DDR5, for cheaper, in gaming. But it is also more expensive than the i7, so definitely not mid-range.
Generally I agree, except on the price differential between the 5800X3D and the i7: taking motherboards etc. into account, there is no price difference, meaning the 5800X3D is really a fantastic package in terms of price/performance.
 

jigglet

Banned
Yeah, AMD were all-around better in performance vs. 10th and 11th gen Intel. It's different now.

As for your 2-3 months: Intel 13th gen vs AMD 7000, we will see, but Intel will generally win by a lot in multicore performance with its i5/i7. The i5 13600K will have 14 cores vs the 7600X's 6.

Pure gaming performance, we will have to see.

Ok sweet. Can't wait to upgrade. I need to play Sons of the Forest @ 144hz. PUMPED!
 
They are absolutely working hard to catch up. My point is that AMD has had superior products in that segment for 2 or more years, but due to the segment's slow-moving nature it has taken time for this to show up in the financials (though that is happening now).
Yes, they are finally starting to lose ground in the server market. 10nm has been a huge setback for Intel's production capacity, no doubt about it.

If they had gotten Sapphire Rapids out on time, this wouldn't be happening.
 
Generally I agree, except on the price differential between the 5800X3D and the i7: taking motherboards etc. into account, there is no price difference, meaning the 5800X3D is really a fantastic package in terms of price/performance.
Which is why I bought one! But this doesn't apply to people not already on AM4.

The X3D is a magnificent chip, no doubt about it.

Just as Intel needed to up its core count (which they're doing), they need to look into squeezing a lot more cache into certain CPU models, like AMD has done.
 
Last edited:

winjer

Member
Chiplets have been a great win for AMD, especially for workstations and servers.
In workloads that can use a lot of threads, they deliver amazing performance.

MW2dCAw.png




 

Chiggs

Gold Member
I mean, that statement assumes you're an AMD fanboy? I'm switching to AMD for my next rig. It's just better. But I will be happy to switch to whatever is best at the time. I get being fanboys of sports clubs, developers, consoles, but CPUs? I don't get it.

I have no idea what you’re talking about, and I own a ton of Intel stock, hence this thread.
 

DenchDeckard

Moderated wildly
The thing I don't get is that my CPU is never over 75 watts gaming, according to MSI Afterburner. I guess it flies up once I start doing stuff like Cinebench?
 
The thing I don't get is that my CPU is never over 75 watts gaming, according to MSI Afterburner. I guess it flies up once I start doing stuff like Cinebench?
What CPU?

That's the thing: these CPUs are much more efficient doing normal tasks/gaming... Once you push all the clocks to the max on all cores, well, DUH, it's going to suck power. Which is what all the charts show.
 

winjer

Member
The thing I don't get is that my CPU is never over 75 watts gaming, according to MSI Afterburner. I guess it flies up once I start doing stuff like Cinebench?

No comparison there. Cinebench can fill all execution pipelines on all cores.
Gaming has much lower CPU utilization.
 

DenchDeckard

Moderated wildly
What CPU?

That's the thing: these CPUs are much more efficient doing normal tasks/gaming... Once you push all the clocks to the max on all cores, well, DUH, it's going to suck power. Which is what all the charts show.

12900K. It never goes above 75/85 watts doing anything I do.

Tell a lie, I'm hitting 115 watts in Apex, then it settles to like 105.
 
Last edited:
Chiplets have been a great win for AMD, especially for workstations and servers.
In workloads that can use a lot of threads, they deliver amazing performance.

MW2dCAw.png





I know what you intended with that slide, but my eye is drawn to the 12900K beating the 5950X. I'm really excited to see the evolution of e-cores, and can't see Intel losing in multicore ever again in the consumer lineup.

Chiplets, yeah, that was a revolution, and I pray that Intel does not delay Sapphire Rapids again... it was originally extremely well positioned, but now the Epyc Genoa server chips are looming.
 
12900K. It never goes above 75/85 watts doing anything I do.
That is really surprising to hear. Goes to show how overblown all this CPU wattage talk is, and that GPUs are the problem.

Edit: it was a gaming stress test, but the most I've hit on my 5800X3D so far was 99 watts. And if I were just playing the game, it wouldn't be that high.

Edit 2: saw your edit; that did seem extremely low, I expected over 100 watts for the 12900K.
 
Last edited:

DenchDeckard

Moderated wildly
Why did you go with the 12900K, if you don't use the extra cores?
The 12700K only has 4 fewer e-cores. But it's also unlocked, so you could clock it as high as the 12900K.

I do sometimes do some streaming and want to get back into it even more. Plus, I work in the PC gaming business, so I get a lot of discounts and free stuff.

I was very lucky that I didn't have to really pay for much of my latest build, which I understand makes me very fortunate.
 
Why did you go with the 12900K, if you don't use the extra cores?
The 12700K only has 4 fewer e-cores. But it's also unlocked, so you could clock it as high as the 12900K.
I also recommend the i7 over the i9, but if money is no object, you do get more cache on the i9, so it's not just clocks.

Still, it's diminishing returns to spend that much more.
 

winjer

Member
I know what you intended with that slide, but my view is drawn to the 12900k beating the 5950x. I'm really excited to see the evolution of e cores, and can't see Intel losing in multicore ever again on the consumer lineup.

Chiplets, yeah that was a revolution, and I pray that Intel does not delay sapphire rapids again... which was originally extremely well positioned, but now the epyc Genoa server chips are looming.

You are giving too much credit to those e-cores.
Look at the 12700K. It only has 4 fewer e-cores, and lower clocks, than the 12900K. With OC, it gets closer to the 12900K.
That's a difference of 3500 points. So the e-cores in total, on the 12900K, are worth 7000 points, or 880 points per e-core.
So the 8 P-cores are worth 20780 points, or 2600 points per P-core.
Those e-cores are pathetic. Intel should just add 2 or 4 more P-cores.

The real reason Alder Lake beats Zen3 is that it released a year later, and because the Zen3 CPU has a 5-wide execution pipeline while Alder Lake has a 6-wide one.
And since Cinebench scales really well with wider pipelines, it gets a nice edge.

cinebench-multi.png
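The per-core arithmetic above checks out in a few lines. A quick sketch using the post's own Cinebench estimates (the totals are the post's figures, not fresh benchmark data):

```python
# Split the 12900K's Cinebench multi score between P-cores and e-cores,
# using the estimates from the post above (not fresh benchmark data).

ecore_total = 7000    # estimated total contribution of the 8 e-cores
pcore_total = 20780   # estimated total contribution of the 8 P-cores

per_ecore = ecore_total / 8
per_pcore = pcore_total / 8

print(per_ecore)  # 875.0  (~880 in the post)
print(per_pcore)  # 2597.5 (~2600 in the post)
```

Under these assumptions one P-core is worth roughly three e-cores in this workload, which is the ratio the argument hinges on.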
 
Last edited:
You are giving too much credit to those e-cores.
Look at the 12700K. It only has 4 fewer e-cores, and lower clocks, than the 12900K. With OC, it gets closer to the 12900K.
That's a difference of 3500 points. So the e-cores in total, on the 12900K, are worth 7000 points, or 880 points per e-core.
So the 8 P-cores are worth 20780 points, or 2600 points per P-core.
Those e-cores are pathetic. Intel should just add 2 or 4 more P-cores.

The real reason Alder Lake beats Zen3 is that it released a year later, and because the Zen3 CPU has a 5-wide execution pipeline while Alder Lake has a 6-wide one.
And since Cinebench scales really well with wider pipelines, it gets a nice edge.

cinebench-multi.png
It's still a significant difference with 8 vs. 4.

And you have to think that they're going to continue adding e-cores with new architectures. 8 e-cores equal a Skylake i7. 16 e-cores on Raptor Lake are like a 9900K tacked on. You call that pathetic?

They only have 8 P-cores to focus on gaming. What do you think might happen if Intel can focus on 8 big cores while AMD has to scale to 32 on Zen 5? If games aren't using that many cores, why add more?

I don't know why you want to root for AMD anymore; it's not 2017/18.
 

winjer

Member
It's still a significant difference with 8 vs. 4.

And you have to think that they're going to continue adding e-cores with new architectures. 8 e-cores equal a Skylake i7. 16 e-cores on Raptor Lake are like a 9900K tacked on. You call that pathetic?

They only have 8 P-cores to focus on gaming. What do you think might happen if Intel can focus on 8 big cores while AMD has to scale to 32 on Zen 5? If games aren't using that many cores, why add more?

I don't know why you want to root for AMD anymore; it's not 2017/18.

A 6700K can do just under 1100 points in a single-thread test, so its cores are significantly more powerful than these e-cores. A closer CPU would be the 3770K.

The only advantage of those e-cores is power saving, as they can save a few watts. In a laptop that's quite useful, because of the battery.
But on a desktop it's not that important. Intel should just add more P-cores, not e-cores.
At best, the 12900K should have 10 P-cores and just 4 e-cores.
 