Why does Intel run so hot?
I wouldn't worry about it. I heard from AMD 110°C is a perfectly fine temp for silicon.
1.5V at idle is normal
110C is perfectly okay
Power consumption doesn't matter
how else am I gonna cook my food
because they keep trying to squeeze more performance out of the same manufacturing process they've had for 5 years
What do you think K in 9900K stands for?
Neither does temperature, apparently.
>posts GPU in a CPU thread
Are you retarded? It's not as though the material is any different. Silicon is silicon, my friend.
delid dis
>Based Steve exposed AyyyMD once again
>gets dislikebombed by blind AMDrones
Imagine if Nvidia said that, there will be a field day.
At least we'll know the housefire temps on the Radeon cards will be nothing to be concerned about.
T JUNCTION, YOU SPAMMERS
Intel CPUs draw Tcase from the center of the heatspreader, unlike GPU edge guesswork.
It's not getting hotter than there.
>what is junction temperature?
retard
Guess what, that result was from Steve picking hotspots on the card. This isn't the package temp or a software-reported die temp.
Nice, Zen2 + Navi is God tier combo, nvidia is fine efficiency-wise but their prices are retarded.
What's that, 70° idle?
PFFFFFFHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHA
>that result was from dumbfuck Steve picking hotspots on the card
Where do you think the Intel CPUs get Tcase from? The very center.
You sound upset. There's no need for that.
>intel is the king of housefires again
>this spot that can handle 120C in the worst cases, and usually 150C, is somehow very problematic at 110C even though it doesn't affect card performance at all!
in spec tjunction temps are fucking MEANINGLESS when they don't cause throttling or any performance changes. Stop regurgitating what Steve shills in his videos.
At this point AMD should cut this con-artist hysteria spewer off.
>110C is okay - AMD
AMDrones are braindead idiots
>Don't worry about the 100000° temp guys. It's in spec. Heat doesn't matter!
3950X is going to pulverize Intel out of the CPU sector for years to come.
>$750 16-core beats $480 8-core
Ok
>KEK amd housefire! intel CPUs are fine with running at 100C+!
>KEK intel housefire! amd cards are fine with running at 100C+!
shills tripping over each other
Wait for 10 core Intel next year bro.
AMD shills immediately deflect to Intel because they're an easy target but completely ignore Nvidia because they're going to get floored.
10 cores for only $600, without cooler of course lol
And now back to reality
>110c tjunction is around 80c "actual" temp
>Sapphire got Tjunc temp down to 84c with a proper cooler
I mean, I get it, you are shilling on an imageboard, but FFS no need to be so uninformed. Blower cards have shit temps, that's all folks.
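For what it's worth, the "in spec until it throttles" argument boils down to a control rule like this. A toy sketch, not AMD's actual firmware; the limit, base clock, and step values are made-up placeholders:

```python
# Toy thermal-throttle rule: clocks only drop once junction temp
# crosses the limit. All numbers are illustrative placeholders.
TJ_LIMIT_C = 110          # hypothetical throttle point
BASE_CLOCK_MHZ = 1600     # hypothetical boost clock
STEP_MHZ = 50             # clock cut per degree over the limit

def effective_clock(tjunction_c: float) -> int:
    """Return the clock after throttling for a given junction temp."""
    over = max(0.0, tjunction_c - TJ_LIMIT_C)
    return max(300, BASE_CLOCK_MHZ - int(over) * STEP_MHZ)

print(effective_clock(84))    # well-cooled card: full clocks
print(effective_clock(110))   # at the limit: still full clocks
print(effective_clock(115))   # over the limit: clocks drop
```

Under this rule, 110C and 84C both run identical clocks, which is the whole point being argued: in-spec junction temps don't change performance.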
>Compares junction temps of one gpu against die temp of another gpu
These luddite shills never give up, do they?
They're legitimately mentally ill.
>in spec tjunction temps are fucking MEANINGLESS
And Intel certifies the 9900K as "in spec" at 100°C. What's the issue?
But 101 is not in spec
Whoa. 1°C over.
You got me.
>110 c on parts designed for at minimum 120c matters
BECAUSE that is fucking die temp and package temp and causes performance throttling and degradation.
If you don't understand the difference between junction temp and that then please stop posting and lurk more.
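The junction-vs-die distinction the thread keeps fighting over can be sketched in a few lines. Assuming a hypothetical grid of on-die thermal sensors (the readings here are randomly generated, not real telemetry): junction temp is the hottest sensor anywhere on the die, while the legacy "GPU temp" is a single sensor, so junction always reads higher on the same silicon.

```python
# Illustrative sketch: junction temp = max over many on-die sensors,
# legacy die/"edge" temp = one sensor. Readings are made up.
import random

random.seed(1)  # deterministic fake data

# Pretend 64 on-die thermal sensors under load (values in C)
sensors = [random.uniform(75, 110) for _ in range(64)]

edge_temp = sensors[0]        # the single sensor older tools report
junction_temp = max(sensors)  # hotspot across the whole die

print(f"edge: {edge_temp:.0f}C, junction: {junction_temp:.0f}C")
# junction >= edge by construction: same chip, different measurement
```

That gap is why a card can report 110C junction while the "actual" die temp people are used to seeing sits around 80C.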
>and causes performance throttling and degradation.
As does Navi at 110°C, which starts dropping clocks.
Not on junction temps it does not
Are you retarded?
What's to read?
Intel has Tjunction, right in the core, at 100°C, as PH reached. "In spec", as AMD would say.
Yeah, it does.
youtu.be
>linking a cpu
On amd GPU junction temps in spec do not matter
>On amd GPU junction temps in spec do not matter
Another "doesn't matter" for the collection.
Sorry mate I can't make you fix your stupidity.
>Heat doesn't matter!
>I'm not dumb! You're dumb!
AMDrones are the gift that keeps on giving.
They're pushing their CPUs balls to the wall in order to maintain a slight sliver of a performance advantage in the workloads where they can still manage that.
this is why intel keeps information hidden from you idiots
Does Ryzen 3 work with Windows 7?
Granted amd is doing the same but amd isnt stuck on the same node as intel has been for the past years.
junction temp has NOTHING to do with actual temp
it's not like you're gonna see this for more than a few seconds
Yeah but why
this is bullshit
only true if you have no idea how to tune your own computer components
>stop being dumbasses you fucking dumbasses
>i9-9900k
>avg temp 80.5
Been using one for the past few months but I have yet to see it breach 40c even after stress tests and such. Just use a CPU cooler like a normal person, user.
>he was dumb enough to buy a 9900k
yikes
Why shouldn't it be okay?
OH NONONONO
>$750 16-core beats $480 8-core AND $2000 18-core
Yes. One CPU destroys not one but two Intel's lineups.
At least AMD show you the T junction temperature which Nvidia do not, for obvious reasons.
>tfw bought zen2 just out of spite
It's all the side-channel data sampling.
It's perfect for the right purposes. Main reason I got it was to do extensive VM work.
>Main reason I got it was to do extensive VM work
>intel
user...
Wait for 10nm+++
Yeah but AMD can't beat Intel in number of vulnerabilities.
Is higher better or lower better?
Wheres the (higher/lower is better)
this is confusing
But what if I only like AMD's CPUs?
:^)
I have an i5 8400 and an rtx 2060. Cool and comfy
it stands for the temp it runs at
9900 kelvins