CPU Intel Core 9000 Series Processors Discussion

SashaLag

SFF Lingo Aficionado
Jun 10, 2018
127
111
it's bad for SFF
It isn't... It's obvious SFF would be hit by this "upper limit TDP war"... But no one is forcing anybody to run this CPU at full throttle (i.e. at turbo frequencies)! If you want, you can still run these at a lower TDP; you simply have the choice to do so now. With proper cooling, you can have more performance. Otherwise, more efficiency-focused CPUs (or TDP settings) are available...
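If you want to try that on Linux, here's a minimal sketch using the intel-rapl powercap sysfs interface. The intel-rapl:0 package path and the 65 W / 95 W example caps are assumptions, so check your own system's layout first; needs root:

```python
#!/usr/bin/env python3
"""Cap the CPU package power via Linux's intel-rapl powercap interface.

Minimal sketch, assuming a single package at intel-rapl:0 where
constraint_0 is the long-term limit (PL1) and constraint_1 the
short-term limit (PL2). Paths may differ per system.
"""
from pathlib import Path

RAPL = Path("/sys/class/powercap/intel-rapl:0")  # package domain (assumed)

def read_watts(constraint: int) -> float:
    # Limits are exposed in microwatts.
    uw = int((RAPL / f"constraint_{constraint}_power_limit_uw").read_text())
    return uw / 1_000_000

def write_watts(constraint: int, watts: float) -> None:
    uw = int(watts * 1_000_000)
    (RAPL / f"constraint_{constraint}_power_limit_uw").write_text(str(uw))

if __name__ == "__main__":
    print(f"PL1 = {read_watts(0):.0f} W, PL2 = {read_watts(1):.0f} W")
    write_watts(0, 65.0)  # example: run a 95 W part as a 65 W part
    write_watts(1, 95.0)  # and keep short bursts at the rated TDP
    print(f"new PL1 = {read_watts(0):.0f} W, PL2 = {read_watts(1):.0f} W")
```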
 

Choidebu

"Banned"
Aug 16, 2017
1,199
1,205
AMD is also cheating with TDP for the first time this generation, as explained by The Stilt (emphasis mine)

..snip..

I guess this is all a by-product of the Core Wars, and it's bad for SFF. Again, I'll try to run some power-limited tests and see the honest numbers when I have time.


The first thing I learned when going SFF is that TDP != power consumption, but it never strays too far off, e.g. 10 W more for lower tiers and 30 W more for higher tiers. So it was easy enough to put a ballpark figure on it.

I rely on tech reviewers for upper limits, but now their role seems indispensable for this.
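In code form, that rule of thumb amounts to something like this (the 10 W / 30 W offsets and the 95 W tier cutoff are my rough assumptions, not measurements):

```python
# Rough ballpark of real draw from rated TDP, per the rule of thumb above.
# The 10 W / 30 W offsets and the 95 W tier cutoff are assumptions.
def estimate_peak_draw(tdp_watts: float) -> float:
    overhead = 30.0 if tdp_watts >= 95 else 10.0  # higher tiers overshoot more
    return tdp_watts + overhead

print(estimate_peak_draw(65))   # 75.0  (e.g. a 65 W i5)
print(estimate_peak_draw(95))   # 125.0 (e.g. a 95 W i7/i9)
```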
 

Allhopeforhumanity

Master of Cramming
May 1, 2017
545
534
ComputerBase have done some tests with the 9900K capped at 95 W: in terms of gaming there's a 1% difference in FPS (8% in other tasks).

https://translate.google.com/translate?hl=en&sl=de&u=https://www.computerbase.de/&prev=search

This can be a little misleading, though, because it doesn't tell you which titles were used in the comparison or at what resolution it was made. I suspect the titles were mostly GPU-bound if "high load applications" saw a 100 W higher power draw; there is simply no way a system otherwise runs at half the power with only a 1% difference in output FPS.

What this does go to show, though, is that for the majority of gaming situations the 9900K is complete overkill, and under SFF conditions you're probably better off with a cooler-running 9700K or 2600X.
 

HottestVapes

SFF Lingo Aficionado
Oct 13, 2018
135
131
This can be a little misleading, though, because it doesn't tell you which titles were used in the comparison or at what resolution it was made. I suspect the titles were mostly GPU-bound if "high load applications" saw a 100 W higher power draw; there is simply no way a system otherwise runs at half the power with only a 1% difference in output FPS.

What this does go to show, though, is that for the majority of gaming situations the 9900K is complete overkill, and under SFF conditions you're probably better off with a cooler-running 9700K or 2600X.

True, and more information on what games they tested at 95 W is needed. However, I don't think it's unlikely to see only a small percentage drop in performance when capping the CPU to 95 W, as it doesn't use much more than that while gaming at stock settings with MCE disabled.

GamersNexus plotted the power usage while playing and streaming Fortnite, for instance, and if you haven't seen it, the results may surprise you.



 

Hifihedgehog

Editor-in-chief of SFFPC.review
Original poster
May 3, 2016
459
408
www.sffpc.review
True, and more information on what games they tested at 95 W is needed. However, I don't think it's unlikely to see only a small percentage drop in performance when capping the CPU to 95 W, as it doesn't use much more than that while gaming at stock settings with MCE disabled.

GamersNexus plotted the power usage while playing and streaming Fortnite, for instance, and if you haven't seen it, the results may surprise you.



Careful... I know I will get some nodding heads from this, but hear me out. I appreciate their tireless efforts in journalistic integrity and brand neutrality. Commendable as they are, GamersNexus is not—bold statement, but repeat—not the ultimate authority in technical know-how and experimental accuracy.

Take this example from recent memory. With one of Adobe's major spring 2018 updates, Premiere brought Intel hardware video encoding acceleration support to the table. Here, GamersNexus made a big mistake in concluding this was an unqualified step forward. They benchmarked Intel 8th Gen with this enabled against AMD Ryzen 2000 using the default software encoder. Fair? Hardly. That hardware acceleration happens to be Quick Sync Video (QSV), whose fidelity is still hotly debated in streaming and podcasting circles. If they were on top of things, they would have gone back and also used Voukoder in Premiere to enable NVENC on the GTX 10-series graphics card in their test setup, providing another option in their results. Arguably, QSV, even in its most recent revision, is still a far cry in video quality from a software encoder like the videophile standby x264.

This example also bears some relevance here. I wonder if QSV is being used in this streaming test as well. If so, that would account for the major difference we are seeing in the encoding-workload power draw between AMD and Intel in the GamersNexus result set.

Source(s):
https://www.gamersnexus.net/guides/3310-adobe-premiere-benchmarks-rendering-8700k-gpu-vs-ryzen
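For anyone who wants to test the fidelity claim themselves, a minimal sketch of producing comparable encodes with ffmpeg's libx264, h264_qsv, and h264_nvenc encoders is below; the input file, bitrate, and preset are placeholder assumptions:

```python
import subprocess

# Encode the same clip three ways so the outputs can be compared for quality
# (visually, or with an SSIM/VMAF pass). Input name and bitrate are
# placeholders; adjust for your own footage.
ENCODERS = {
    "x264":  ["-c:v", "libx264", "-preset", "medium"],  # software reference
    "qsv":   ["-c:v", "h264_qsv"],                      # Intel Quick Sync
    "nvenc": ["-c:v", "h264_nvenc"],                    # NVIDIA NVENC
}

for name, args in ENCODERS.items():
    subprocess.run(
        ["ffmpeg", "-y", "-i", "input.mp4", *args,
         "-b:v", "6M", f"out_{name}.mp4"],
        check=True,
    )
```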
 

Allhopeforhumanity

Master of Cramming
May 1, 2017
545
534
True, and more information on what games they tested at 95 W is needed. However, I don't think it's unlikely to see only a small percentage drop in performance when capping the CPU to 95 W, as it doesn't use much more than that while gaming at stock settings with MCE disabled.

GamersNexus plotted the power usage while playing and streaming Fortnite, for instance, and if you haven't seen it, the results may surprise you.

Yeah, I saw those results from GN. 95 W is probably all that's needed for a title like Fortnite, which isn't the most demanding on the CPU. I imagine it will vary quite a bit from title to title, though.

I'd say that my previous comment still stands, though: if all you care about is gaming and want an SFF build, the 9900K probably isn't the optimal choice. But like you, I would like to see further testing on how the chip scales with lower power, clock stability, and across workloads and game engines.
 

Allhopeforhumanity

Master of Cramming
May 1, 2017
545
534
Translation: Intel recommended that motherboard manufacturers run 9th-gen processors out of spec (PL2 = 210 W as a default), in order to prevent downclocking and to look better in benchmarks at the expense of power consumption and temperature.

This is pretty shady on Intel's part, but not a single reviewer bothered to read the datasheet.

That's pretty big news, both in the changes to the spec and in the fact that it wasn't communicated to reviewers.

I don't necessarily expect most reviewers to read through the 140-page document when they are often given such limited time for testing, though. I am glad that it's coming to light, however, since it is an important change from the previous gen.
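To make the spec-vs-board-default gap concrete, here's a toy sketch of how the PL1/PL2/tau turbo budget plays out. The 1.25 × PL1 figure matches Intel's commonly cited guidance; the 8-second tau and the plain rolling average are simplifying assumptions (real silicon uses an exponentially weighted average):

```python
# Toy model of Intel's PL1/PL2/tau turbo budget. tau = 8 s and the plain
# rolling average are assumptions for illustration only.
from collections import deque

PL1 = 95.0         # long-term limit = TDP
PL2 = 1.25 * PL1   # ~119 W, the in-spec short-term limit
BOARD_PL2 = 210.0  # what many Z390 boards shipped as the default

def sustained_draw(demand: float, pl2: float, tau: int) -> list:
    """Per-second package power for a constant all-core demand."""
    window = deque(maxlen=tau)
    trace = []
    for _ in range(16):
        avg = sum(window) / tau            # rolling average over the tau window
        limit = pl2 if avg < PL1 else PL1  # budget spent -> fall back to PL1
        power = min(demand, limit)
        window.append(power)
        trace.append(power)
    return trace

print(sustained_draw(160.0, PL2, tau=8))       # bursts to ~119 W, settles at 95 W
print(sustained_draw(160.0, BOARD_PL2, 9999))  # holds 160 W (effectively no tau)
```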
 

Hifihedgehog

Editor-in-chief of SFFPC.review
Original poster
May 3, 2016
459
408
www.sffpc.review
insider Zen 2 performance
That is literally the only thing I can surmise as well. Plus, the recent pending sell-off of IM Flash to Micron leads me to believe they are already hurting and are doing everything humanly possible to perform balance-sheet origami. As they say, desperate times call for desperate measures, and such has been par for the course for Intel for the last two years. Intel already pulled some crazy stunts with X299 when they knew Threadripper was coming, and I imagine this is yet another desperate reaction, this time to Zen 2.

By the way, Ian Cutress confirmed as much about the default PL2 setting for 9th Gen.
 

tinyitx

Shrink Ray Wielder
Jan 25, 2018
2,279
2,338
how does one translate or match a CPU's deltaT at load to a cooler's Thermal Design rating? o_o
What CPU delta T are you referring to?

As far as matching a CPU with a cooler goes, I think we can still use the TDP values. I bet big cooler manufacturers like Noctua or Cooler Master understand Intel's jargon much better than we do, and won't design coolers that can't handle their target CPUs.
 

VegetableStu

Shrink Ray Wielder
Aug 18, 2016
1,949
2,619
What CPU delta T are you referring to?
Nonspecific, in general? o_o Like, imagine Intel pulls all TDP info from ark.intel.com.

I mean, Noctua has a TDP chart with fuzzy ratings for CPUs (and overclocking), but beyond that, how does one roughly extrapolate whether a heatsink can cool a (say) 9700K with just the TDP rating from the manufacturer and no reviews?
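A rough first pass, for what it's worth: treat the heatsink as a single thermal resistance and check the resulting delta T against your temperature headroom. A minimal sketch, where every number is an assumption for illustration:

```python
# Back-of-the-envelope heatsink check: model the cooler as a thermal
# resistance theta (degrees C per watt), so delta T ~= theta * power.
# All numbers below are illustrative assumptions, not measured data.

def cooler_fits(theta_c_per_w: float, power_w: float,
                t_ambient_c: float = 25.0, t_max_c: float = 90.0) -> bool:
    """True if the cooler keeps the CPU under t_max_c at the given draw."""
    return t_ambient_c + theta_c_per_w * power_w <= t_max_c

# A "95 W TDP" 9700K can pull well past its sticker rating under all-core
# turbo, so size the cooler for the realistic draw, not the TDP.
print(cooler_fits(0.15, 150))  # True:  big tower, ~22.5 C rise
print(cooler_fits(0.50, 150))  # False: tiny low-profile cooler, ~75 C rise
```

The catch, of course, is that cooler makers rarely publish theta directly, which is why those fuzzy TDP charts and third-party reviews remain the practical fallback.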
 

HottestVapes

SFF Lingo Aficionado
Oct 13, 2018
135
131
Yeah, I saw those results from GN. 95 W is probably all that's needed for a title like Fortnite, which isn't the most demanding on the CPU. I imagine it will vary quite a bit from title to title, though.

I'd say that my previous comment still stands, though: if all you care about is gaming and want an SFF build, the 9900K probably isn't the optimal choice. But like you, I would like to see further testing on how the chip scales with lower power, clock stability, and across workloads and game engines.

Keep in mind it was while streaming and playing at the same time, which is a tad more demanding than just playing the game on its own. Regardless, I agree completely that the 9900K is overkill for just gaming. It's retailing for over £600 here, which alone is an absurd amount to spend on one component of a gaming rig that isn't a GPU.

Careful... I know I will get some nodding heads from this, but hear me out. I appreciate their tireless efforts in journalistic integrity and brand neutrality. Commendable as they are, GamersNexus is not—bold statement, but repeat—not the ultimate authority in technical know-how and experimental accuracy.

Take this example from recent memory. With one of Adobe's major spring 2018 updates, Premiere brought Intel hardware video encoding acceleration support to the table. Here, GamersNexus made a big mistake in concluding this was an unqualified step forward. They benchmarked Intel 8th Gen with this enabled against AMD Ryzen 2000 using the default software encoder. Fair? Hardly. That hardware acceleration happens to be Quick Sync Video (QSV), whose fidelity is still hotly debated in streaming and podcasting circles. If they were on top of things, they would have gone back and also used Voukoder in Premiere to enable NVENC on the GTX 10-series graphics card in their test setup, providing another option in their results. Arguably, QSV, even in its most recent revision, is still a far cry in video quality from a software encoder like the videophile standby x264.

This example also bears some relevance here. I wonder if QSV is being used in this streaming test as well. If so, that would account for the major difference we are seeing in the encoding-workload power draw between AMD and Intel in the GamersNexus result set.

Source(s):
https://www.gamersnexus.net/guides/3310-adobe-premiere-benchmarks-rendering-8700k-gpu-vs-ryzen

I understand completely; I don't take anything as gospel from any reviewer or benchmark. Since Friday I've been trying to gather as much information from as many different sources as I can, to get a better idea of everything out there regarding the power draw and thermals of the 9900K.
 

Hifihedgehog

Editor-in-chief of SFFPC.review
Original poster
May 3, 2016
459
408
www.sffpc.review
This is starting to look really bad. Golem.de tested the 9900K with in-spec power limits and couldn't get the processor to maintain advertised all-core frequencies.


When they bumped the limit up to 200 W, suddenly things changed:

[chart]

Not an insignificant difference in performance:

[chart]

Look at the power consumption:

[chart]

There's a bunch more if you click through the image table.
Interesting. Compared to the 2700X, they already have a definite lead at the base TDP, no doubt about it. For some reason, though, they seem to be pushing performance out much further than appears necessary. It could very well be that they are working to close a gap or deficiency they may have when Zen 2 launches six months from now.
Nonspecific, in general? o_o Like, imagine Intel pulls all TDP info from ark.intel.com.

I mean, Noctua has a TDP chart with fuzzy ratings for CPUs (and overclocking), but beyond that, how does one roughly extrapolate whether a heatsink can cool a (say) 9700K with just the TDP rating from the manufacturer and no reviews?
About the closest things to that would be FrostyTech's or HardwareSecrets' outdated charts, and our own @IntoxicatedPuma's charts, which he is even now amassing in a Herculean effort.
 

MarcParis

Spatial Philosopher
Apr 1, 2016
3,669
2,784
Following the inconsistent reviews of the Core i9-9900K's power consumption, I guess the "ninja" settings (I mean defaults) from motherboards were freeing this CPU of all power restrictions, letting it run pretty hot.. :)

I was thinking of something similar to Asus's "Multicore Enhancement", which forces all cores to operate at the same speed by default.

By the way, I quite like this generation of Intel CPUs, even if they will struggle in pure SFF cases where CPU cooling is pretty light.. :D

I'll wait until mid-2019 to decide on my future CPU platform (Zen 2 vs. 10 nm Intel?)... 2019 will be so interesting in the CPU market... ;)