DELIDDING the new SOLDERED Intel CPU is worth it

Lapping too!
All testing done with Noctua NH-D15
youtu.be/r5Doo-zgyQs

Attached: Screenshot_YouTube_20181020-115557.png (2880x1440, 207K)

Numbers for 9900K

Attached: Screenshot_YouTube_20181020-115117.png (2880x1440, 152K)

What the fuck

grind this

Attached: brian JUST.jpg (921x865, 354K)

intel is a joke at this point. if there is no competition there is no need to innovate and now they can't get out of that loop

>there is no competition

You got that part right at least.

Their competition is themselves. Give me a good reason to move on from 3770k at 1080p.

well yeah, the last amd cpu that was such a blast furnace was the fx 9590, and the 9900k literally managed to smoke it

no reason unless you're a (((content creator)))

DELAP DIS!

finally intel burn test techspot.com/downloads/4965-intelburntest.html
has literally found something to burn
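
(For reference, IntelBurnTest is essentially a frontend for Intel's Linpack binary. If you just want to peg every core while watching package temps, a throwaway sketch along these lines also works; it's purely illustrative, nowhere near as brutal as Linpack, and the 60-second duration is an arbitrary placeholder.)

```python
# Minimal all-core load generator -- illustrative only, not IntelBurnTest.
# IntelBurnTest itself runs Linpack, which hits the FPU/AVX units far harder
# than this simple busy loop ever will.
import multiprocessing as mp
import time

def burn(seconds: float) -> None:
    """Spin on floating-point work until the time budget runs out."""
    end = time.time() + seconds
    x = 1.0001
    while time.time() < end:
        x = (x * x) % 1e9 + 1.0001  # keep the FPU busy, stay bounded

if __name__ == "__main__":
    duration = 60.0  # seconds of load; placeholder value
    workers = [mp.Process(target=burn, args=(duration,)) for _ in range(mp.cpu_count())]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
    print(f"Loaded {mp.cpu_count()} cores for {duration:.0f}s -- check your package temps.")
```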

>intel is a joke at this point. if there is no competition
Have you been under a rock for the past two years? Ryzen is giving them a run for their money. The i9 is an absurd knee-jerk reaction to AMD's current offerings.

And to make things worse, they fucked up their 10nm process, so they can't even put out a real new line of CPUs until like 2020.

Are there similar numbers for Ryzen anywhere?

>reading comprehension

Yes, from the same dude; liquid metal will always be better than solder, so not sure what the point of this thread is

Attached: Capture.png (1327x629, 79K)

Not really, it gives marginal returns at best versus the risk that you can easily kill your chip with a de-lidding attempt

De-lidding soldered CPUs without killing them is difficult. The irony is that extreme overclockers actually prefer the previous TIM setup since it made de-lidding relatively safe and gave you complete control over what TIM or liquid metal to use.

>liquid metal will always be better than solder
That being said, though, the temperature difference on Intel is apparently twice that on AMD.

>De-lidding soldered CPUs without killing them is difficult.
Watch his video. The indium layer is apparently soft (and/or thick) enough that he can delid them the exact same way as with paste TIM. He claimed to have done that procedure with 8 different CPU samples.

>the temperature difference on Intel is apparently twice that on AMD
Welcome to physics and basic math

Please tell me what law of physics dictates that one CPU brand's TIM conducts heat better than that of another brand.

>Implying that most DIYers have the proper tools to "safely" delid a soldered CPU.

There's a reason why de-lidding prior to the whole TIM/IHS setup was a niche only for extreme overclockers who were trying to get world records with suicide runs.

>Within the next couple of weeks, you will see an influx of kiddies killing their 9900K and 9700 with a botched de-lidding attempt.

>delid guy trying his hardest to salvage his delid and liquid metal business
lol

If you're paying $530 for a processor, paying for a 280mm or 360mm AIO is a drop in the bucket. A ton of people out there buy Intel K parts and never overclock them. Most of them don't run stressful applications either, and the majority will only be running games.

There are a few videos out there with the 9900k paired with all the RTX cards, and it hits 50C-60C on average while gaming, and that's like normal.

>There's a reason why de-lidding prior to the whole TIM/IHS setup was a niche only for extreme overclockers who were trying to get world records with suicide runs.
Of course. What I'm saying is that there is something different about these CPUs that allows them to be delidded without melting the solder.

They've been putting snot in their CPUs for so long that they forgot how to solder the heatspreader on properly.

That may very well be, but that's not "physics and basic math".

true, Intel goes for rich kids that can spend 1k on a CPU
AMD always aims for a lower price point, those other kids are all from India and can't afford Intel

Nope, you kiddies have been memed by idiots who didn't understand why Intel switched to TIM in the first place and how the whole Ivy Bridge / first-batch Haswell debacle happened.


TL;DR version: Intel was paranoid about dealing with their own version of "bumpgate" due to concerns over long-term thermal cycling/material fatigue with solder on post-32nm chips. They switched over to TIM, but the packaging equipment at the fab was tuned for soldered chips, which caused QC issues. Intel rectified this with second-batch Haswell, a.k.a. Devil's Canyon, and onward.

delid and grind dis

nu-uh

You just can't win with kiddies

>Intel stopped soldering
WAAAH INTEL BRING BACK SOLDERING, I HATE TOOTHPASTE
>Intel brings back soldering
WAAAH INTEL YOUR SOLDERING SUCKS, I WANT MY TOOTHPASTE BACK FOR EASY DELID
>Intel shits out a 5GHz mainstream processor out of the box
WAAAH INTEL FUCK YOU I CAN'T OVERCLOCK TO 5.1 AND 5.2GHZ WITHOUT 90C

Intel is just retarded in general.

YOU NOW HAVE TO BOTH CIRCUMCISE AND CHOP PARTS OF YOUR CPU

BRAVO INTEL

Well it doesn't change the fact that their soldering objectively sucks. I wonder if it's some sort of patent issue that makes them use an inferior solution.

It is what actually happened if you read between the lines and saw what was happening at the time.

Intel had hard data on the impact of bumpgate since most of the affected Nvidia GPUs were in Intel systems via OEMs. Intel had to indirectly replace systems, but Nvidia ate the cost.

The executives took the concerns of their material/chip packaging engineers seriously. The bean counters were on board since it reduced production costs.

Notice how Intel switched to TIM right after the Bumpgate debacle was settled? Not a coincidence.

If reducing production costs was such a pressing concern, Intel would have moved away from soldering during the NetBurst versus K8 days.

How would Nvidia GPUs be analogous, when they've never been soldered or even had heatspreaders?

96.5 to 88.5 = 8% difference
60 to 64 = 6% difference

That's not twice the temp difference, and with the way thermodynamics work the higher the temp the more difficult it is to conduct it.

>That's not twice the temp difference
But heat conduction scales with the absolute temperature difference, not the ratio. See Fourier's law
>with the way thermodynamics work the higher the temp the more difficult it is to conduct it
Wat. Generally speaking in metals, thermal conductivity increases with higher temperatures. Also black-body radiation (though probably not very applicable here) increases with the fourth power of the absolute temperature. Not sure what phenomena you could possibly be referring to that display the inverse relation.
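
(To put rough numbers on the Fourier's law point, here's a minimal sketch treating the TIM/solder layer as simple 1-D steady-state conduction; q, k, A and d are generic stand-ins, and the deltas are the ones quoted upthread.)

```latex
% 1-D steady-state conduction through the interface layer (Fourier's law):
\[
  q = \frac{k A}{d}\,\bigl(T_{\mathrm{die}} - T_{\mathrm{IHS}}\bigr)
  \quad\Longrightarrow\quad
  \Delta T = \frac{q\,d}{k A}
\]
% For a fixed heat load q, contact area A and layer thickness d, the drop
% across the layer scales with the absolute difference, and a better-conducting
% interface (larger k) shrinks the drop proportionally.
% Reusing the deltas quoted upthread:
\[
  \Delta T_{\mathrm{Intel}} \approx 96.5 - 88.5 = 8\ \mathrm{K},
  \qquad
  \Delta T_{\mathrm{AMD}} \approx 64 - 60 = 4\ \mathrm{K}
\]
% The meaningful comparison is 8 K vs 4 K, not a percentage of the Celsius reading.
```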

Doesn't the rate of temperature change slow down the smaller the delta is?
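
(Short answer: yes for transient cooling, but the load temps being compared here are steady-state, where only the thermal resistance path matters. A minimal sketch, assuming a simple lumped model:)

```latex
% Transient cooling (Newton's law, lumped model): the rate is proportional to
% the remaining delta, so it does slow down as the delta shrinks:
\[
  \frac{dT}{dt} = -\frac{T - T_{\mathrm{amb}}}{\tau}
  \quad\Longrightarrow\quad
  T(t) = T_{\mathrm{amb}} + \Delta T_0\, e^{-t/\tau}
\]
% Steady state under a constant heat load P: the die settles where conduction
% balances the input, so the offset above ambient is set by the total thermal
% resistance R (die -> TIM/solder -> IHS -> cooler -> air):
\[
  T_{\mathrm{die}} = T_{\mathrm{amb}} + P\,R
\]
% A better interface lowers R, which lowers the steady-state delta directly.
```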

>96.5 to 88.5 = 8% difference
>60 to 64 = 6% difference
>Ratio of deltas above arbitrarily chosen point of zero

Attached: a.jpg (500x282, 48K)

But their solder does suck. AMD's chips are voltage-limited rather than temperature-limited, so they don't have the same problem. They also don't hit 90 C out of the box because they have half-decent coolers and aren't clocked to NetBurst levels.

Bumpgate happened because RoHS standards were being enforced around the mid to late 2000s. The lead-free solder substitutes they were using on the bumps were prone to warping and "oozing" out from thermal cycling.

Coincidentally, Nvidia's mobile GPUs at the time suffered the worst of it, because they ran very hot at load and cooled down rapidly at idle. The problem also affected ATI mobile GPUs at the time, but to a lesser degree. Nvidia only got in trouble because they tried to bury the problem instead of being upfront about it. They had to rework the lead-free substrate to find something that wasn't as prone to the "oozing/cracking" issue.

Material engineers at Intel feared that similar issues could arise with the solder/IHS setup on smaller and denser chips under long-term usage patterns. The solder could start cracking/warping the substrate on the silicon, which would end up killing it.

Intel executives saw the fiscal (a couple billion USD) and PR damage that bumpgate inflicted on Nvidia and decided to heed the concerns of their material engineers, and that's how the TIM/IHS setup came into being.

how is intel so incompetent? this beggars belief

...

THE HIGHER THE BETTER
THE MORE YOU BUY THE MORE YOU SAVE

I don't think this had as much to do with incompetence as it did fear.

They are scared of AMD and released an entirely unnecessary generation of consumer processors to provide the impression of progress and distract from the 10nm dumpster fire in progress. The ring bus architecture and 14nm+++++ can only be pushed so far... the power consumption and heat generated by the 9000 series is just Intel trying to push the limits of an architecture that has already been squeezed dry. Of course, they also want to fleece the consumers who are still willing to purchase this garbage at full price, which is why they rebranded the i7 into an i9.

The real incompetence at play is how badly Intel mismanaged their 10nm development and how fucked their CPU roadmap is.

DIE LAP DIS

see
if you want to do percentages, use kelvin.
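
(Quick sketch of that, redoing the percentages from upthread in kelvin; the temperatures are the ones quoted above, nothing else is measured.)

```python
# Redo the "percentage difference" comparison from upthread in kelvin.
# The absolute deltas (8 K vs 4 K) are what conduction cares about; a
# percentage of a Celsius reading depends on where zero happens to sit.

def pct_drop(hot: float, cool: float) -> float:
    """Percentage drop from hot to cool, relative to hot."""
    return (hot - cool) / hot * 100.0

readings_c = {"Intel": (96.5, 88.5), "AMD": (64.0, 60.0)}  # numbers quoted upthread

for name, (hot, cool) in readings_c.items():
    delta = hot - cool
    print(f"{name}: delta = {delta:.1f} K, "
          f"{pct_drop(hot, cool):.1f}% of the Celsius reading, "
          f"{pct_drop(hot + 273.15, cool + 273.15):.2f}% of the kelvin reading")

# Output: Intel ~2.2% vs AMD ~1.2% in kelvin -- roughly the same 2:1 ratio as
# the raw 8 K vs 4 K deltas, unlike the Celsius percentages.
```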

Intel has no roadmap after this; the lower-end processors won't be out until Q1 next year

When Ryzen 3 comes out next year, what are they going to release in October 2019? A cooler refresh of the 9900k rebranded as the 10900k?

A brand new architecture, built from a project they previously ignored in some shitty country.

Attached: retail_box_1.jpg (300x249, 17K)

1. You want a better performing processor for real-world engineering applications.
2. You are not a gaymerfag
3.... do I need to go on?

>When Ryzen 3 comes out next year, what are they going to release in October 2019?
By that time Intel's 10nm will be functioning and far beyond anything AMD will be able to match.

>if there is no competition there is no need to innovate
Intel has competition, and lots of it. And it's not AMD, never was. It's TSMC, Samsung and GF. Intel designs chips, just like AMD, but their business is making chips. TSMC and Samsung are on 7nm and Intel can't into 10nm; they are stuck on 14nm++++. Thus, we now have mid-range Intel CPUs that run at 97C and only perform well on high-end boards.

Not really that relevant; Ryzen CPUs just don't run that hot. And they probably can't: the Ryzen 2000 series CPUs tend to start throttling at 70C.

>AIO is a drop in the bucket
or eventually a drop or a dozen on your GPU and perhaps even your motherboard. It happens, I've seen it, and that's when I decided to never ever use anything with liquid inside my computer case.

Intel's 10nm will only start "real" mass production in 2019, and that timeline can slip indefinitely again.

It will also take them a year or two to design a CPU around it, and if they rush it, their first 10nm CPU will be a disaster. At that point AMD will have 7nm+ Ryzen 4 on AM5.

A desperate Intel can do things they normally wouldn't, such as contracting TSMC to make their chips.

Attached: 1458875887214.gif (640x360, 1.35M)

>new i7s are on TSMC 7nm
Kek

AyyyMD will be finished

It will be a nice fight until AMD has some stupid idea and falls behind, and Intel sits on their butt and does nothing for 5 more years.