
DF - Intel Core Ultra 9 285K Gaming Performance: There Are Serious Problems



00:00 Introduction: Why We Need More CPU Performance
01:29 Core Ultra 9 285K: Productivity is Great, Gaming Performance Isn't
02:40 DF's Take On Automated Benchmarking
04:12 Cyberpunk 2077 Bugged? A Performance Collapse
05:27 Starfield Benchmarks
06:28 Dragon's Dogma 2 Benchmarks
07:06 Baldur's Gate 3 Benchmarks
07:33 Microsoft Flight Simulator 2020 Benchmarks
08:42 F1 2024 Benchmarks
09:48 Forza Horizon 5
10:09 Conclusions + More Testing
11:31 Goodbye For Now

Leonidas

Oh No Fire GIF
 

MikeM

Member
Wtf was Intel thinking releasing this? Some serious leadership changes need to happen there.
 

Magic Carpet

Gold Member
Every review I've seen said it's best for productivity and worse for games. Is Intel marketing this processor as a Gaming Processor?
 

rm082e

Member
Wtf was Intel thinking releasing this? Some serious leadership changes need to happen there.

They can't afford not to. It's not realistic to make a whole new architecture and then chuck it in the trash just because it's behind the competition on one aspect of performance. They'll improve on this over time and we'll see how it goes.
 

Bojji

Member
Wtf was Intel thinking releasing this? Some serious leadership changes need to happen there.

Performance of the CPUs themselves is problematic, but Windows 11 problems are another layer on top of this. 24H2, the current version of Windows 11, performs like shit; 23H2 is better, but on that one (without the patch) AMD performance was worse...

Windows 10 is the best platform to test on, but it's also old and not "current".
 

Sethbacca

Member
Performance of the CPUs themselves is problematic, but Windows 11 problems are another layer on top of this. 24H2, the current version of Windows 11, performs like shit; 23H2 is better, but on that one (without the patch) AMD performance was worse...

Windows 10 is the best platform to test on, but it's also old and not "current".
Windows 11 is fucking awful in general. This would be the absolute best time for Valve/Steam to release an official desktop-focused distribution of its OS.
 

Bojji

Member
Windows 11 is fucking awful in general. This would be the absolute best time for Valve/Steam to release an official desktop-focused distribution of its OS.

I tried to return to Windows 10 two weeks ago but couldn't (I'm too used to the UI after two years), and I also had some stupid issues. Windows 11 24H2 is not bad for me (5800X3D), but Intel owners should stay away from the newest version for sure.
 

zeroluck

Member
All these CPUs are god damn boring: no real improvement for 2+ years now, and an actual regression in $/perf when older CPUs are much cheaper.
 

LiquidMetal14

hide your water-based mammals
It just needs to be a mere few % faster than the 7800X3D and everyone will love it, thanks to the failed launches of Z5 and AL.

We had massive gains with Z2, Z3, Z4 and even on Intel CPUs in recent years. The CPU world is becoming boring again.
Hey now, let's just shit on Intel's incompetence. Regardless of what happens, this is a piece of shite with a lower power envelope than the last parts. You gotta give them the fart in church at least.

The 7800X3D is already head and shoulders (knees and toes) ahead of this malarkey.

Let the sons of bitches laugh.
 

Nvzman

Member
Hey now, let's just shit on Intel's incompetence. Regardless of what happens, this is a piece of shite with a lower power envelope than the last parts. You gotta give them the fart in church at least.

The 7800X3D is already head and shoulders (knees and toes) ahead of this malarkey.

Let the sons of bitches laugh.
Only if you play games though; for any sort of workload stuff the 7800X3D is far from the best. Additionally, as I said in a previous thread, the results are far, far closer and the performance difference for gaming becomes irrelevant if you game at 1440p or 4K. Once you go to higher resolutions and settings, most new CPUs are going to perform basically equally, which is obviously why they aren't benchmarked that way.

[Charts: relative-performance-games-2560-1440.png, alan-wake-2-rt-2560-1440.png, spiderman-rt-2560-1440.png]



Arrow Lake is definitely a disappointment for gaming performance, but the power efficiency gains are notable, and I think this paves the way for future generations to be far more competitive. IIRC the reason Arrow Lake's performance is so disappointing is last-minute cuts to the ring bus because of technical problems. I think eliminating hyperthreading also hurts gaming performance a lot in some titles.

What really matters with CPUs at this point is value: you just have to pick whatever works best within your budget. This benchmark-warrioring bullshit people do in threads/discussions to humiliate Intel or AMD is ridiculous because, for gaming, it never actually reflects reality; nobody is going to play at 1080p with a 4090 (which is what TPU used btw) and a 7800X3D, and nobody is going to play games at low settings at 720p either. For real-world use, it barely matters.
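
A rough way to see the resolution point (purely an illustrative sketch with made-up per-frame costs, not measured data): frame time is roughly bounded by whichever of the CPU or GPU takes longer, so once the GPU cost at 4K dwarfs the CPU cost, the gap between a faster and a slower CPU mostly vanishes from the average.

```python
# Illustrative sketch only: hypothetical per-frame costs in milliseconds, not measurements.
# Frame time is approximated as the slower of the CPU and GPU stages.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Approximate FPS when the CPU and GPU work largely in parallel."""
    return 1000.0 / max(cpu_ms, gpu_ms)

fast_cpu_ms, slow_cpu_ms = 5.0, 7.0                    # hypothetical CPU frame costs
for label, gpu_ms in [("1080p", 4.0), ("4K", 12.0)]:   # hypothetical GPU frame costs
    print(f"{label}: fast CPU {fps(fast_cpu_ms, gpu_ms):.0f} fps, "
          f"slow CPU {fps(slow_cpu_ms, gpu_ms):.0f} fps")

# At 1080p the GPU is cheap, so the CPU gap shows (200 vs ~143 fps here);
# at 4K both land at ~83 fps because the GPU is the bottleneck either way.
```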
 

Panajev2001a

GAF's Pleasant Genius
Every review I've seen said it's best for productivity and worse for games. Is Intel marketing this processor as a Gaming Processor?
It is better for productivity, but it's not the best at that either… plus it launches with stability issues on the latest Windows update (24H2)… :/
 

LiquidMetal14

hide your water-based mammals
Only if you play games though; for any sort of workload stuff the 7800X3D is far from the best. Additionally, as I said in a previous thread, the results are far, far closer and the performance difference for gaming becomes irrelevant if you game at 1440p or 4K. Once you go to higher resolutions and settings, most new CPUs are going to perform basically equally, which is obviously why they aren't benchmarked that way.

[Charts: relative-performance-games-2560-1440.png, alan-wake-2-rt-2560-1440.png, spiderman-rt-2560-1440.png]



Arrow Lake is definitely a disappointment for gaming performance, but the power efficiency gains are notable, and I think this paves the way for future generations to be far more competitive. IIRC the reason Arrow Lake's performance is so disappointing is last-minute cuts to the ring bus because of technical problems. I think eliminating hyperthreading also hurts gaming performance a lot in some titles.

What really matters with CPUs at this point is value: you just have to pick whatever works best within your budget. This benchmark-warrioring bullshit people do in threads/discussions to humiliate Intel or AMD is ridiculous because, for gaming, it never actually reflects reality; nobody is going to play at 1080p with a 4090 (which is what TPU used btw) and a 7800X3D, and nobody is going to play games at low settings at 720p either. For real-world use, it barely matters.
U.N.B.E.L.I.V.A.B.U.R.G.E.R.
 

winjer

Gold Member
Only if you play games though; for any sort of workload stuff the 7800X3D is far from the best. Additionally, as I said in a previous thread, the results are far, far closer and the performance difference for gaming becomes irrelevant if you game at 1440p or 4K. Once you go to higher resolutions and settings, most new CPUs are going to perform basically equally, which is obviously why they aren't benchmarked that way.

Arrow Lake is definitely a disappointment for gaming performance, but the power efficiency gains are notable, and I think this paves the way for future generations to be far more competitive. IIRC the reason Arrow Lake's performance is so disappointing is last-minute cuts to the ring bus because of technical problems. I think eliminating hyperthreading also hurts gaming performance a lot in some titles.

What really matters with CPUs at this point is value: you just have to pick whatever works best within your budget. This benchmark-warrioring bullshit people do in threads/discussions to humiliate Intel or AMD is ridiculous because, for gaming, it never actually reflects reality; nobody is going to play at 1080p with a 4090 (which is what TPU used btw) and a 7800X3D, and nobody is going to play games at low settings at 720p either. For real-world use, it barely matters.

The power efficiency gains are only notable compared to 13th and 14th Gen.
But compared to AMD, it's very bad. Not only is the 285K significantly slower than a 7800X3D, it also uses almost double the power.
And it gets much worse when we consider that the 7800X3D is made on TSMC's N5 node, while the 285K is made on N3B.
It's quite remarkable how Intel can make a CPU on a much more advanced node and yet end up with much worse power efficiency.

Average frame rate differences might be less noticeable at higher resolutions, but they can still happen.
What still shows up a lot is that 1% lows are still very noticeable, even at higher resolutions. So having the best CPU will mean a smoother experience, with drops that are not as big.
And in this case, both Intel's prior generations and AMD's Zen 4 and Zen 5 are better.
 
Finally time to upgrade my 12400?


Nope.
This little fucker seriously doesn't want to die.
Time to wait for another generation.

At this rate I'm gonna have run this X470 board for an entire console generation, and won't need to upgrade from this X470 & 5800X3D until next gen, when consoles will likely be running Zen 6 or something :messenger_grinning_sweat:

Even a decent CPU a few generations old looks like it will have legs for a while, with these new CPUs not exactly knocking it out of the park (outside of the newer X3D).
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
You can always upgrade to... AMD

amd-intel.gif

I recently upgraded to a 4K panel, so there really isn't any CPU worth the price of admission and a new motherboard.

If anything I'd probably get a 14700K so I can stick to LGA-1700 till the next generation of consoles.
 

YOU PC BRO?!

Gold Member
It’s going to go from bad to worse for Intel when the 9800X3D launches in a few weeks… On top of that, the new Intel chips are a bad deal from a value perspective, considering the new socket as well. The only silver lining is that Intel are finally taking steps to reduce the insane power draw of their chips.
 

SonGoku

Member
Additionally, as I said in a previous thread, the results are far, far closer and the performance difference for gaming becomes irrelevant if you game at 1440p or 4K. Once you go to higher resolutions and settings, most new CPUs are going to perform basically equally, which is obviously why they aren't benchmarked that way.
The thing that makes this average fps comparison very misleading is the lack of low-fps data, which is what determines how fluid the gameplay experience feels; the main appeal of high-end CPUs in gaming is to deal with these 1% and 0.1% low fps bottlenecks.

These types of average fps compilations also miss outlier games that are very CPU-limited. It's like those average RT comparisons that show RDNA3 not that far behind Nvidia because they include games where RT isn't used heavily.
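
For anyone unfamiliar with the metric: 1% and 0.1% lows are typically derived from the slowest frames in a capture, e.g. the average FPS of the worst 1% of frame times. Below is a minimal sketch of that calculation; the frametime list is hypothetical and the percentile convention is an assumption, since tools like CapFrameX or PresentMon may define it slightly differently.

```python
# Minimal sketch: average FPS and "1% low" / "0.1% low" FPS from per-frame
# times in milliseconds. Here the X% low is the average FPS of the slowest
# X% of frames; real tools may use a slightly different convention.
def lows(frametimes_ms: list[float]) -> dict[str, float]:
    n = len(frametimes_ms)
    worst_first = sorted(frametimes_ms, reverse=True)  # slowest frames first

    def avg_fps(times: list[float]) -> float:
        return 1000.0 * len(times) / sum(times)

    return {
        "avg_fps": avg_fps(frametimes_ms),
        "1%_low_fps": avg_fps(worst_first[:max(1, n // 100)]),
        "0.1%_low_fps": avg_fps(worst_first[:max(1, n // 1000)]),
    }

# Hypothetical capture: mostly ~8 ms frames with occasional 30 ms stutters.
capture = [8.0] * 990 + [30.0] * 10
print(lows(capture))  # the average stays high while the lows expose the stutter
```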
 