When did this meme become about Intel? I fully recall it being AMD processors heating up like all shit

Attached: 1493923646354[1].jpg (1032x774, 273K)

Other urls found in this thread:

ark.intel.com/content/www/us/en/ark/products/186605/intel-core-i9-9900k-processor-16m-cache-up-to-5-00-ghz.html
technologyreview.com/s/612989/chips-may-be-inherently-vulnerable-to-spectre-and-meltdown-attacks/
newegg.com/Product/Product.aspx?Item=N82E16814930006
cnet.com/news/intel-to-pay-amd-1-25-billion-in-antitrust-settlement/
en.wikichip.org/wiki/amd/cores/picasso
twitter.com/SFWRedditImages

I dont even feel like explaining this
you should know how this meme works
straightforward stuff

I understand how the meme works, I don't understand when it stopped being about AMD and started being about Intel instead.

Intel has been a reliable housefire supplier since at least NetBurst and Prescott.

when AMD introduced the Zen architecture and Intel started taking the approach AMD used to take with FX CPUs.
lurk moar

Basically somewhere along the line Intel disregarded the TDP metric, which was supposed to mean the maximum heat generated under any workload. For example, their stock i9-9900K consumes over 200W on a Blender 3D rendering load (i.e. not a power virus) despite its Intel-approved "95W" TDP. OC'd to 5GHz it can consume 250-300 watts. This has duped not only consumers but even Z-series motherboard manufacturers, which have made motherboards that can't safely feed more than ~150W to the CPU. That has caused i9-9900K processors, even ones under industrial refrigerant water chillers, to throttle heavily just to keep the VRMs on the motherboard from blowing up.

In addition to that, Intel has been using low-quality paste TIM between their CPU dies and IHSes, which increases temperatures by 10-20C. The only way to "fix" this is a risky delidding operation, replacing the stock TIM under the IHS with electrically conductive liquid metal, which has been known to kill processors if even a tiny component contact near the CPU die is left unprotected. Intel tried to fix this by introducing a soldered IHS on i9 processors, but they cheaped out and temps are still around 5C+ higher than they should be.

People go through all this malarkey just so their bibeo gays can run 5-10% faster than on an AMD Ryzen system with fast low-latency RAM.

ark.intel.com/content/www/us/en/ark/products/186605/intel-core-i9-9900k-processor-16m-cache-up-to-5-00-ghz.html

tl;dr avoid inhell

Attached: aHR0cDovL21lZGlhLmJlc3RvZm1pY3JvLmNvbS9DLzkvODA1MjU3L29yaWdpbmFsL2ltYWdlMDA3LnBuZw==(1).jpg (755x564, 110K)

I believe it's because Intel is now diving into the MOAR COARS meme just to one-up AMD, who are comfortable with 8 cores since they can only do 4GHz anyway. Intel is still stuck on the same node; combine that with the 5GHz boost clock and you get your own personalised nuclear reactor, all for under 900 dollaroos.

in a nutshell, the 7000 series from Intel gained an impressive ability to hit very high clocks with almost no effort. The silicon was not really designed with those clocks in mind, so it consumes a disproportionately high amount of power when reaching them.

Intel tried to hide this by posting bogus TDP ratings that were easily bypassed, with mainboard manufacturers encouraged to do exactly that. So despite the low rating on the box, most Intel processors 'running not as intended' actually consume WAY more power than they are specced for.
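The "disproportionate" power growth at high clocks is roughly dynamic-power scaling: P ≈ C·V²·f, and hitting 5GHz needs a big voltage bump too. A toy sketch of that scaling; the baseline and boost voltage/frequency numbers here are illustrative assumptions, not measured Intel specs:

```python
# Hedged sketch: dynamic CPU power scales roughly as P = C * V^2 * f,
# so power grows with frequency AND with the square of the voltage
# needed to hold that frequency. All numbers below are assumptions.

def scaled_power(base_power_w, base_volts, base_ghz, new_volts, new_ghz):
    """Scale a baseline package power to a new voltage/frequency point."""
    return base_power_w * (new_volts / base_volts) ** 2 * (new_ghz / base_ghz)

# Assume ~95 W at a 3.6 GHz / 1.00 V base point (the rated TDP condition),
# then an all-core 5.0 GHz overclock at an assumed ~1.30 V:
boosted = scaled_power(95, 1.00, 3.6, 1.30, 5.0)
print(round(boosted))  # -> 223, same ballpark as the 200-250 W figures above
```

Under these assumed numbers the chip lands well past 200W, which is why a "95W" rating at base clocks says nothing about boost-clock draw.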

*temps are around ~10C higher than they should be

Attached: 19102414455l.jpg (1918x1033, 601K)

amd faggots kept on getting btfo every single day on this board and elsewhere. they don't like being reminded that even if an amd had 100x cores on a die, not one of them could match the performance of a single core in an intel chip.

This, it's not like using faster RAM will make a poozen processor 95% as fast as a superior intel©®™ processor with 500MHz higher clock frequency advantage.

Attached: Ryzen-gaming-memory-opt_1.png (641x483, 53K)

Pentium 4 423 socket and Prescott.

>pay the Intel tax
>can use regular RAM
>pay AMD
>yeah you bought a cheaper processor but you have to pay the Samsung/SK Hynix tax
>your AMD processor is still slower than Intel, even in AMD-sponsored games
>throw moar corez to compensate
>Intel releases a similar product with fewer cores than the newest AMD processor and it's faster than both their previous-generation CPU and the newest AMD CPU

AMJUST

What is Netburst
What is Fermi

Don't mention games, you'll anger the """PRODUCTIVITY""" losers who LARP about doing work-related tasks while sitting on Jow Forums all day.

>pay the intel tax
>have to use $200+ Z motherboard that supports intel's """""95W TDP"""""
>have to use $200+ triple fan AIO to keep it from having a meltdown ;^)
>SSD performance literally cut in HALF after all the security vulnerability mitigations are applied
But hey you got that 5% bibeo gay advantage, amirite?

Attached: meltdown_crystal_disk_mark_6_4k_read_q32_t1_broadwell_xps13_corei5-100747467-large.jpg (700x421, 33K)

Don't worry AMD gpus are still housefires.

Weird, I never had to do any of that.

that's just you sweetie

Because AMD CPUs weren't hot. They had huge TDPs, but you never had any trouble removing the heat with a decent cooler. You slap your DH14 on it and enjoy your 65C CPU while your room also starts heating up to that very same 65C. The thing about the latest Intel CPUs is that past a certain point you simply cannot remove the heat, even with an ice-cold cooler surface, without grinding silicon off the CPU to make the gap between die and IHS thinner.
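The "can't remove the heat past a certain point" complaint is about the thermal resistance stack between die and cooler: die temp = ambient + power × total resistance, and a paste TIM under the IHS dominates that sum no matter how cold the cooler is. A toy model; every resistance value here is an assumption for illustration, not vendor data:

```python
# Toy thermal stack model: T_die = T_ambient + P * sum(R_theta).
# Resistance values (C/W) are illustrative assumptions, not vendor data.

def die_temp(power_w, ambient_c, resistances_c_per_w):
    """Steady-state die temperature for a given power and resistance stack."""
    return ambient_c + power_w * sum(resistances_c_per_w)

# Assumed stack: die-to-IHS TIM, IHS-to-cooler paste, cooler-to-air.
# Paste TIM under the IHS (assumed 0.25 C/W) vs. solder (assumed 0.05 C/W):
paste  = die_temp(200, 25, [0.25, 0.10, 0.15])  # -> about 125 C: throttles
solder = die_temp(200, 25, [0.05, 0.10, 0.15])  # -> about 85 C: manageable
print(round(paste), round(solder))
```

Same 200W, same cooler: only the die-to-IHS joint changed, which is why delidding/liquid metal (or a proper solder job) moves temps so much while a bigger cooler barely helps.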

It keeps switching.
Whichever company starts to lose ramps up the clocks and voltages to keep up.

>I overclocked my AMD 8350 from X to X and it never broke
>low temp
Well maybe because the fucking temp readings were bogus for the whole Bulldozer lineup, no matter what monitoring software you used

Because FUCK SOLDER WE NEED MOAR NIGGAHURTZ

Attached: 1506977173618.jpg (882x758, 324K)

When Intel started to market their CPUs as 95/135W yet get to 230W+ while AMD is at less than 100W. Which is hilarious because the only way AMD loses in benchmarks is when Intel uses twice as much power and heat, or when the benchmarks are faked due to (((genuine intel))) garbage. And all that to give Intel 5-10% performance above AMD. This is the same shit AMD did before Ryzen, except they never lied about TDP.

Will AMD ever top Intel's chiller stunt?

If the flip-flop keeps going, I imagine Ryzen 4/5 will be your burn-to-the-ground baby.

1771w dude
how can amd even top that shit? unless you OC all 64 cores to 5GHz

When they glued 8 cores on 14nm++++++

One day it will be normal to walk into a server room and be greeted with the sweet smell of burning plastics and silicone. "I love the smell of burning silicone in the morning" says the Sysadmin.

I
LOVE
THIS COMPANY

I'm talking shit with 256 cores/8 layers here.

>I fully recall it being AMD processors heating up like all shit.
Hello newfriend.
It seems you don't remember Prescott. Reminder that you must be 18 to post here.

Paying $10 more a year on my electricity bill for better performance in games is worth it though.

Used Prescott for 15 years, can confirm, it was a piece of shit housefire and summer was absolute suffering.

(((Intel))) CPUs cost twice as much, so you're paying $100/year (assuming you keep the same CPU for 5 years) for 0-20fps in games that already run far above 60fps.

>When did this meme become about Intel? I fully recall it being AMD processors heating up like all shit.
Are you too young to remember the P4 or something?

Nah, I don't run my Intel CPU at 100% 12 hours a day. AMD cpus can't hit 240fps whereas Intel can

>needing more than 60fps in games
Lel

>defending a bad experience because you're poor
Lel

>games

So uh
This is why my computers have been combusting?
Just had one spit flame through the rear fan

Okay now this is epic

hey guys

Attached: me.jpg (960x576, 50K)

>bad experience
Required water cooling, high bills/temperatures, dozens of CPU bugs and literal hardware backdoors are a worse experience. If I wanted a shitty gaming device I'd buy a Stadia. Intel lost half their performance with the Meltdown and Spectre fixes alone.
You can always overclock a fucking Ryzen if you want your shitty 10fps.

my 9900k can run comfortably for daily internet browsing and youtube use when underclocked to 800MHz and 0.5 volts

>Required water cooling
Don't have or need it
> high bills/temperature
Like I said before, $10 more annually is fine for a better experience imo
>dozens of CPU bugs and literal hardware backdoors are worse experience
Never had a problem with any AMD or Intel CPU in the past 20 years.

You're clearly indoctrinated on a shitty experience, you might be the target audience for Stadia.

ayymd makes housefires gpus these days, their cpus are actually good

This, even their locked CPUs have exceedingly high real world power consumption. And OFC there's jew cum in the IHS so you'll get 80-90C under load even with a water cooler.

((((((((((65w tdp)))))))))) MY ASS

Attached: aHR0cDovL21lZGlhLmJlc3RvZm1pY3JvLmNvbS9RLzgvNzIxNTIwL29yaWdpbmFsLzA1LVBvd2VyLUNvbnN1bXB0aW9uLVRvcnR1 (708x532, 117K)

>Never had a problem with any AMD or Intel CPU in the past 20 years.
Hahahahahahaha, kys.

>believing in Intel propaganda
Intel has always been the fucking buggy, overheating, overpriced, spyware processor.

Imagine getting so upset that someone isn't incompetent enough to experience unusual issues like yourself that you would wish death upon them lmao can't relate

And by buggy did you mean pozzed? Because that's definitely true.

When did this meme become about AMD? I fully recall it being Nvidia graphic processors heating up like all shit.

Attached: 1418913900221.jpg (1034x689, 574K)

Spectre, meltdown and VISA affect all Intel CPUs you fucking idiot.

that's a lie. vega 64 oc to 1712mhz, 65º rock solid stable. Sapphire Nitro. Just ramp the fans and don't be a faggot. It goes up to 70ºC on long gaming hours. I'm pretty sure that's on par or even below nvidia anything in terms of heat generation. Yeah, the fans are loud, but as I said, don't be a faggot or downclock if your sensitive ears make your anus tingle.

Tell me about it
Going as well as tencent

Attached: 20190416_121049~2.jpg (3117x2369, 1.6M)

I actually should retest that OC. It's the same profile from two Windows installs ago and upwards of 10 driver versions ago...
On a related note, I've never had a single problem with Radeon express update installs with this card. Before, I had to make sure to disable Crossfire and the OC because they always fucked up driver updates. Now it's a one-click update, and maybe I'm just being lucky, but the only necessary actions, other than that one click, are one reboot and reapplying the OC profile after it's done.

>he doesn't know
technologyreview.com/s/612989/chips-may-be-inherently-vulnerable-to-spectre-and-meltdown-attacks/
"Fucking idiot" indeed lmao

To be fair most UV pretty well, yeah AMD fucked up by over-volting them in the first place but isn't near-RTX 2080 (~$800) performance for $300 in some games a pretty good deal?

newegg.com/Product/Product.aspx?Item=N82E16814930006

Attached: Screenshot_2019-04-04-14-49-51.png (1280x720, 321K)

Pentium 4 was the original housefire.

>all articles paid for by intel.
desperation has set in.

I hope Intel goes the way of Nokia after the iPhone. They deserve to rot in mediocrity after these shit practices.

2080 Ti: 77ºC gaming load according to Tom's Hardware.
So I'm comfortably below that.
My idles are higher, though; they keep oscillating from the low 40s to the high 40s, 100% fans @ 60ºC. Haven't double-checked in a while, but HWMonitor, HWiNFO and Wattman used to be pretty consistent and similar in my GPU temp readings.

before you bitch, Nvidia's idles are 35ºC for the 2080 and 37ºC for the 2080 Ti, also from the same TH article.

Attached: Untitleuhghfd.png (1050x763, 283K)

>The truth is paid by Intel
>AMD just tries to deflect and mask it
HMM really makes ya think

>the guys behind the "Ryzenfall™" "exploit"

You deserve the holocaust, Jewfag.

>you're a jew if you acknowledge the truth instead of fighting it for your favorite company
Good goy!

now say that without haggis-stuffed dick on your mouth.

For real though, Intel has done some pretty shady shit to maintain that 80% CPU monopoly. Remember that $1+ billion antitrust settlement?

>"Intel's restraints
According to AMD, Intel will refrain from these practices:

• Offering inducements to customers in exchange for their agreement to buy all of their microprocessor needs from Intel, whether on a geographic, market segment, or any other basis

• Offering inducements to customers in exchange for their agreement to limit or delay their purchase of microprocessors from AMD, whether on a geographic, market segment, or any other basis

• Offering inducements to customers in exchange for their agreement to limit their engagement with AMD or their promotion or distribution of products containing AMD microprocessors, whether on a geographic, channel, market segment, or any other basis

• Offering inducements to customers in exchange for their agreement to abstain from or delay their participation in AMD product launches, announcements, advertising, or other promotional activities

• Offering inducements to customers or others to delay or forebear in the development or release of computer systems or platforms containing AMD microprocessors, whether on a geographic, market segment, or any other basis

• Offering inducements to retailers or distributors to limit or delay their purchase or distribution of computer systems or platforms containing AMD microprocessors, whether on a geographic, market segment, or any other basis

• Withholding any benefit or threatening retaliation against anyone for their refusal to enter into a prohibited arrangement such as the ones listed above.

cnet.com/news/intel-to-pay-amd-1-25-billion-in-antitrust-settlement/

the truth...
the only video for vega review I ever watched.
>hi guys, sip beer
>today we gonna be reviewing, sip beer, vega 64
>video montage of installing the card, in the same workbench crapped up with nvidia drivers ever since the gtx 480's. while he sips beer
>now we gonan check temps. oh no, can't go all the way to 100%, it's too much. So, sip beer, I only feel comfy with a 50% fan curve. Also let's see how high we can on the OC since i have this, sip beer, shit opened. I don't wanna come back here ever again.
>runs furmark
>omg, 90ºC, yyikes. cracks another one open, pour it all over his gifted amd card and call it a watercooling. hehe, eh. sips beer.
>so, nvidia is awesome. sips beer.

It's inevitable.

this is the dumbest meme ever; anything clocked to 5GHz is going to have insane heat density, that's just physics
if you downclock Intel chips to AMD levels they have much better perf/watt
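The heat-density point above is just power divided by die area: a small 8-core die dumping 200W+ is much harder to cool than a big die at similar wattage. Quick arithmetic under assumed numbers (both the powers and die areas here are rough illustrative figures, not datasheet values):

```python
# Heat flux comparison in W per mm^2 of die area.
# Powers and die sizes below are rough assumptions for illustration.

def heat_flux(power_w, die_mm2):
    """Average power density across the die."""
    return power_w / die_mm2

small_cpu_die = heat_flux(220, 174)  # assumed 8-core desktop die at full OC
big_gpu_die   = heat_flux(250, 754)  # assumed large GPU die at similar power
print(round(small_cpu_die, 2), round(big_gpu_die, 2))  # -> 1.26 0.33
```

Under these assumptions the CPU die is pushing roughly 4x the flux through each mm², which is why total watts alone don't tell you how hard a chip is to cool.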

and shows 3 games/benchmark
pubg
arma
project cars

>if you downclock intel chips to amd level they have much better perf/watt
during the Bulldozer era, maybe. Zen+ actually has advantages because its SMT implementation is better than Intel's

Not really; Zen+ achieved ~5% higher IPC than Intel's coffinlake architecture and scales to significantly less power at the same frequency.

Also their 3700U 4-core Picasso APU consumes a maximum of 15 watts at its 2.3 GHz base frequency, and the 3700H one consumes a maximum of 35 watts while maintaining above 3.0 GHz on a full-core load. Intel has NOTHING to compete with that.

en.wikichip.org/wiki/amd/cores/picasso

gpu's also.
One of AdoredTV's videos shed some light on all of this; it wasn't the main topic he was discussing, but for brevity's sake: AMD has very decent efficiency up to a point, but they keep going well past it to remain competitive. He was using The Stilt's efficiency-curve graphs.

sure
>zen+ achieved ~5% higher IPC
HAHAHAHAH [CITATION NEEDED]
you faggots lie so much it's unreal
first it was 7-10% behind, then 5-7% behind, then 2% behind, then even, and now according to amd shills they are actually AHEAD in IPC right now?? unreal I've never seen this claim before it's just so outrageous

Attached: oh no.png (1339x957, 72K)

they stopped using wooden screws.

Attached: 1345776693003.jpg (311x1047, 72K)

>that would be
>vulnerability one
>patch
>...
>vulnerability 49
>patch
you lost both ipc and hyperthreading somewhere inbetween patch 1 and 49

>system power consumption
A+ for effort

Attached: 22_power-cinebench-nt.png (849x803, 89K)

are you at least getting paid to lie?
it would be truly pathetic if you were shilling for free

see
I used to own Intel systems desu, but getting SSD performance cut in half after all the fucking software mitigations made me drop Intel desu senpai.

post power consumption

is this graph supposed to support my point? look at the stock 8700k in that graph compared to the 2700x, now imagine if you downclocked the 8700k to match

Best case scenario, he won the silly cone lottery and hit near 150W power consumption even with the 1500MHz OC. In most cases you can hit at least the stock 1400MHz turbo at around 1.1-1.2V, which nets you 200W or lower power consumption.

YMMV ofc but don't expect pic related

Attached: TjGZVOpzpLfa1bgsHkZvwCBX9MxjeESciugjgWx51vE.png (1016x622, 32K)

are you at least getting paid to lie?
it would be truly pathetic if you were shilling for free

I am getting paid but I didn't say anything that is not true

>Buying a 9900K for web browsing

sounds like fermi 2.0

how the fuck you do that?

Buy Intel

>Productivity doesn't matter
>muh 5-10fps with a 2080Ti

Attached: 1362671107409.jpg (1000x993, 605K)

based

This, the FX-9590 makes a lot of heat but it can be cooled on air.

delid

Sitting comfy with muh haswell. But anything newer than 2017 AMD all the way.

Anything after sandy bridge was a scam

checked

Six years old and still a badass.

FX chips in general are still pretty badass, but the 9590 and 8370 are really cool because they're a better bin; they should clock about 100MHz faster than my 8350 at the same stable voltage.

>9900K in a laptop
Holy fuck user,does it even hold it's base clock stable?