Be Intel

>Be Intel
>95W TDP
>Motherboard makers cheating on MCE
>Housefires everywhere
>'Hey! We only give out guidelines to motherboard manufacturers. It has NOTHING to do with making our CPU's look better than AMD's'
youtube.com/watch?v=IBrumDWpl-c

Attached: tdp shennanigans.jpg (2560x1440, 296K)

Attached: lol clockrates.jpg (2560x1440, 258K)

What the fuck intel. It's like a fucking FX-9590.

What the fuck Intel meanwhile my 2700x sips power and sticks at 4ghz+ no matter what

What the fuck Intel? Meanwhile, my AMD cpu crashes my productivity software

mommy

Intel shills have been reduced to posting nothing but evidence-free, unfalsifiable FUD because all the empirical evidence is now against them.

Attached: npcmeme.jpg (1400x1400, 211K)

Attached: crashes premier.png (816x173, 12K)

Oy vey!
DELIT THIS NOW!

Attached: 1.png (872x768, 218K)

Attached: chrome_2018-11-07_18-59-11[1].png (770x1098, 91K)

Yours is smaller.

I'm actually confused now because everyone is getting different results and I didn't fudge anything in the original.

You have to use a proxy and not be logged in to the botnet to get "unfiltered" results.

Attached: BTFO.png (788x362, 25K)

Using a proxy, not logged in:

Intel
About 1.450.000 results (0,52 seconds)

AMD
About 652.000 results (0,50 seconds)

Go figure
"adobe" "intel" "crash" gives less results than
"intel" "adobe" "crash" which gives less results than
"intel" "crash" "adobe"

This really shouldn't surprise anyone who's been paying attention. Intel and motherboard manufacturers have stretched the official spec wider than goatse.

i9-9900K is for all intents and purposes a 150 watt part.

it's surprising tho because it's a 95W TDP part that uses more than 95W and needs a cooler rated for more than 95W to keep it from frying, so what is the point of even claiming a 95W TDP in the first place?

Nice shoop

So they can lie about power efficiency. When it's obviously a space heater.

obviously this, jewish trickery

95w obviously looks better.

But what is worse is that it's so inconsistent that it'll fuck over a lot of consumers.
>9900k stock in reviews gets 200fps
>ok looks good, ill buy a midrange mobo since im not OCing
>my 9900k gets 150fps

>9900k spec says 95w tdp
>buy heatsink for 120w with high end mobo
>95c

The same will happen with a lot of prebuilts, especially smaller ones. You never know how your 9900K will perform until you research the mobo very thoroughly.

so they're lying about performance, power consumption and thermals, what a lovely company

who cares, its all about the gaymes baby

considering the difference in market share this image doesnt look good for amd.

>empirical evidence
You mean how they made shit even faster even if it was cheating?

I'd take that over unstable no drivers and slightly slower scenarios any day.

>cpu
>drivers

Attached: 1435768036468.gif (480x327, 1.76M)

How do you get away with posting on Jow Forums while being that tech illiterate?

Kek, story of his life. Don't be so mean, user. At least tell him what a big and powerful CPU he has.

quit samefagging

Eat a dick

Attached: Screenshot_20181107-210633.png (1440x2560, 305K)

>Steve literally saying that this is a motherboard maker issue and that Intel's version of TDP is different compared to AMD

Literally AMDrone spin fake news

Nice shop

>Nice shop
put your trip back on faggot i'm trying to filter you

>losing your trip just to damage control

nvidia does this with literally every gpu to look better than amd. they always use more than what the tdp is, that's just a basic guideline

AMD uses more than their TDP on the GPUs too user

TDP isn't power consumption, though it is related. A simple example is the Piledriver chips: their TDP is so high at basically all clocks precisely because the silicon doesn't like going over 60C. Intel's chips have for the longest time been allowed to go all the way to 95C, which means they can get away with a lower TDP.

If you had a magical CPU that could heat all the way to (say) 200C without damage, you could get TDP way, way down, since you'd obviously need less cooling performance to keep said magical CPU within its nominal operating temperature.
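You can see the point with a toy steady-state cooling model: the hotter the chip is allowed to run, the weaker (higher thermal resistance) the cooler can be for the same wattage. All numbers here are illustrative, not real chip specs.

```python
# Toy steady-state model: T_chip = T_ambient + power * R_cooler,
# so the cooler must satisfy R_cooler <= (T_limit - T_ambient) / power.
def required_thermal_resistance(t_limit_c, t_ambient_c, power_w):
    """Max cooler thermal resistance (C/W) that keeps the chip at or below t_limit_c."""
    return (t_limit_c - t_ambient_c) / power_w

# Chip allowed to hit 95C: a mediocre cooler handles 95 W.
print(round(required_thermal_resistance(95, 25, 95), 2))  # 0.74 C/W
# Chip that must stay under 60C at the same 95 W: a much better cooler is needed.
print(round(required_thermal_resistance(60, 25, 95), 2))  # 0.37 C/W
```

Same power draw, different temperature ceiling, roughly double the cooling requirement: that's why the 95C-tolerant chip can print a lower "TDP".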

Yep, it's all the mobo makers' fault, certainly not poor innocent Intel and their guidelines that cripple the CPU's performance, delivered with a wink and a nudge in their direction.

>Intel's version of TDP is bullshit compared to AMD

Fixed.

Every time I see your trip I wonder why I don't filter you, then I remember that it's hilarious watching you punch yourself in the dick and that is worth more than not having to see your shitty posts.

>40 posts
>23 posters

Go samefag somewhere else.

lol go 愛してるさわこ !BE/4F4GG07 somewhere else.

It's true they lied to consumers and told board manufacturers a totally different spec to look better. It's fraud.

The i9-9900K turbo clock is actually 4.3GHz, but they conspired with board makers to run their product 'out of spec' and average 4.7GHz to look good.

In reality the 9900K is no better than an 8700K or 9700K.

I like how in every benchmark it runs 5ghz for 4 seconds then throttles down to 4.2ghz for the rest of the duration.

Meanwhile my 2600X happily sits at 4.0ghz~4.2ghz indefinitely, there is literally no way to make this thing throttle to base clock.

nothing to see here

Attached: intel shill.png (1000x500, 415K)

>when your CPU is jelly of VRM temps
>95c
>fine

Attached: F-35Explosion-735x413.jpg (735x413, 31K)

Is Adobe blameless in this? Most Adobe software isn't even well multi-threaded. A better benchmark of stability would be 3D applications where all cores on the CPU are stressed at 100% for hours to render results.

Temperature tells you nothing. I've had AMD processors that ran at 55 degrees at full load making an unbearable noise and I've had Intel processors at 70 degrees running whisper quiet.

no, it's entirely adobe's fault that their shit crashes on everything and always has, but retarded people are trying to make it seem like it only happens on ryzen cpus

shave your head hippy

>temperature tells you nothing
>fan speed is the real indication

Attached: 268zj7.jpg (645x729, 26K)

A CPU that is designed to run at 70 degrees will be quieter than a CPU that is not designed to run at 70 degrees at that same temperature. What part of that do you have trouble understanding?

My Dark Rock TF is quieter than the stock cooler while running at the same temperature; your logic is fundamentally flawed.

Everyone can see from all the AMD threads you keep posting 24/7 that you have severe mental issues but with each post you keep pushing the limits.

>present facts to support your argument
>call'em a faggot √

Google's search results are totally screwed up no matter what you do, which is why it's so utterly useless. Not sure why anyone uses that garbage anymore. It was great in its first years, but it's been a decade since it was usable for anything. The results are filtered and tailored no matter what you do; even if you're not logged in and have no cookies, it'll still present filtered "customized" results based on your IP or subnet or region. You can use a "proxy" to get somewhat "unfiltered" results, but they really aren't: they are just filtered based on that proxy's location, which will apply equally to everyone using that proxy but differently depending on which proxy you use. See how different searx instances get varying results when they pull from Google.

As for searching for "adobe crash"... that debate is just stupid. An application that crashes because it's buggy will be buggy no matter what CPU you use.

>microcode: microcode updated early to revision 0x2b, date = 2018-03-22
>microcode: Microcode Update Driver: v2.2.
While it's not something we typically think about or notice, it's actually a real thing.

how fast does the fan spin on your intel heater's stock cooler?

>what is the point of even claiming at 95w TDP in the first place
I don't really like regulation, but muh TDP is an area where I really do wish some regulatory body like the fascist union (EU) or some US agency would step in and demand that CPUs and GPUs ship with some clear factual numbers alongside "TDP", such as:
- idle power draw
- sustained load power draw
- idle heat output
- heat output under load
Those numbers would actually tell us something. Right now TDP is some guideline where you can probably safely assume that a chip rated 65W will use less power than one rated 95W, and that's about it.
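The wish-list above could be a simple standardized disclosure record; here's a sketch, with every field value invented purely for illustration.

```python
from dataclasses import dataclass

# Hypothetical standardized power disclosure, per the wish-list above.
# All numbers below are made up; nothing here is a real chip's spec.
@dataclass
class PowerDisclosure:
    model: str
    idle_power_w: float        # measured idle package power draw
    sustained_load_w: float    # measured sustained all-core load draw
    idle_heat_w: float         # heat output at idle (equals draw at steady state)
    load_heat_w: float         # heat output under sustained load
    marketing_tdp_w: float     # whatever the vendor chose to print on the box

example = PowerDisclosure("hypothetical-8core", 8.0, 150.0, 8.0, 150.0, 95.0)
# The gap this whole thread is complaining about, as one number:
print(example.sustained_load_w - example.marketing_tdp_w)  # 55.0
```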

How you configured the fan curves on your personal systems in the various machines you've had doesn't say much about anything, perhaps it says something about what "auto" settings on your motherboards were but that's not very meaningful data. As a totally meaningless data-point: My Ryzen 2600 under sustained full all-core load runs at 62C at a reasonable noise-level and it stays at 61-62C under full load lasting hours or days. I can make that number be 55C or 70C by adjusting the fan speed. This says nothing about other people's cases/heatsinks/fans.

In any case, all those temperature charts that you post here all day are utterly pointless and I don't understand why you insist so much on trying to push this meme. I've never heard anyone complain about their CPU being too hot, but complaints about noise levels are very common from people with badly designed CPUs (AMD)

Tbh with the difference in how many people use intel and AMD this seems bad for AMD

I've never posted a temperature chart here, ever; you're probably thinking of someone else. And I don't need a chart to know your statement is just dumb: noise is a result of the heatsink size, or lack thereof, and the fan.

Please do tell me all about the noise-level on my old A8-7600 APU which is now used by a nephew. I put a 140W heatsink on that 65W TDP APU. Fan started spinning slowly after about 20 minutes of compiling, most of the time it was off. I'm guessing you can do the same with a similarly or lower TDP Intel CPU.

>I have never heard people complain about their cpu being too hot.

Just pretend this never happened and the whole delidding trend is fake news.
Let's also ignore the large chunk of the Intel market that is Apple products constantly throttling down to 1.8GHz.

Attached: Untitled.jpg (615x494, 74K)

Attached: ParallelLoneCottonmouth-small.gif (640x341, 1.7M)

Is there anyone who will complain that their 9900K runs at 5GHz all cores? I guess not.

For 5 seconds

Attached: calp.gif (300x224, 928K)

Beat me to it

hey look someone who has some intelligence

They even managed to fuck up solder TIM.
Has anyone desoldered a 2700X and used LM to see if there are any thermal improvements? I'm genuinely curious

>Has anyone desoldered a 2700X and used LM to see if there are any thermal improvements? I'm genuinely curious
Der8auer has done this on AM4 and on older soldered Intel CPUs. There is an improvement with liquid metal, but it's closer to a 5°C improvement give or take, compared to the 10-20°C that can be seen on the *Lake CPUs.

4.7GHz is only out of spec if a motherboard decides to ignore the turbo time limits and run the CPU indefinitely at that clock rate. And the actual turbo boost is not "4.3GHz"; that is just where most 9900Ks will find themselves after the turbo time limit is up and the chip needs to stay at 95W while still trying to give you the best clocks.
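That turbo time limit behavior can be sketched as a toy simulation. This assumes a simplified short-term/sustained power-limit model; the real mechanism tracks an exponentially weighted power average, and the constants below are placeholders, not Intel's actual values.

```python
# Toy model of a turbo power limit scheme:
#   a short-term boost budget is allowed for up to TAU seconds,
#   then the chip must fall back to the sustained limit (~the advertised TDP).
PL1_W = 95    # sustained power limit, i.e. the "95W TDP" on the box
PL2_W = 150   # short-term boost power limit (illustrative placeholder)
TAU_S = 28    # boost time window in seconds (illustrative placeholder)

def allowed_power(elapsed_s, mce_enabled=False):
    """Power budget at a point in time; MCE boards ignore the time limit."""
    if mce_enabled:
        return PL2_W  # boost clocks held indefinitely: out of spec
    return PL2_W if elapsed_s < TAU_S else PL1_W

# In-spec board: boost for TAU seconds, then drop back to the sustained limit.
print([allowed_power(t) for t in (0, 10, 30, 300)])            # [150, 150, 95, 95]
# MCE board: boost power forever, hence the thermals reviewers measured.
print([allowed_power(t, mce_enabled=True) for t in (0, 300)])  # [150, 150]
```

This is exactly the "5GHz for a few seconds, then down to sustained clocks" pattern described earlier in the thread.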

>only out of spec when you violate the spec
yes, that's obvious. And I don't have a problem with violating the spec if they are honest about it and clearly document what default / auto settings actually do.

the situation with the latest Intel motherboards seems somewhat problematic in that regard: it's not clear what default and auto settings do, and some of the boards (MSI Godlike) will even apply settings that are different from what is entered. If you enter 100 and 108 is applied to make the board appear faster, then that's a defect and a quality control issue.

Attached: ryzen1600x-temps.jpg (1724x1661, 709K)

>If you enter 100 and 108 is applied
i've had this issue with multiple gigashit motherboards on amd and intel

Your 2700X sips power because it's not running at 5GHz.
The only people having issues with a 9900K being hot are the morons who think they can run the i9 at 5GHz all-core and somehow expect it to get the same temps as an 8700K, never mind the 33% extra cores. For some reason people thought the i9-9900K was some magical chip.
An i9-9900K is just the limit of what most air/AIO coolers can handle.

>i9 9900k
It literally needs a chiller to cool down.

Is the i9-9900K special in this regard?
Take a look at other Intel 8-cores at 5GHz and tell me those fuckers don't need a chiller; if you look, you might actually find the i9-9900K isn't all that bad comparatively.
>but my ryzen runs cool
Clock a 9900K to the level of a Ryzen and guess what, it'll run pretty cool.

As long as they keep supporting all of my favorite game's events I will stay loyal to Intel.

>issue with multiple gigashit motherboards
It's really hard to notice and verify, so it's probably more common than we'd expect. It probably doesn't help that the tech press really isn't doing its job, especially not those on YouTube.

I searched for some stuff on YouTube and clicked some clickbait motherboard review the other day. Some guy in his mom's basement did a motherboard benchmark test, he used stock settings "because that's what you get when you buy the board" and did CPU benchmark tests to show "which motherboard is better". That kind of garbage "press" is what motivates motherboard vendors to use out of spec overclocked default and auto settings.

>Clock a 9900k to the level of a ryzen and guess what it’ll run pretty cool

sources seem to indicate otherwise by a 10c~15c margin

Attached: oc_prime.png (725x728, 361K)

Motherboard makers kinda have to do something to justify a $600 motherboard over a $150 one. They can't add stuff like PLX chips or 10Gb NICs because that's useless to the "gaming" market, so instead they overclock the base clock so they can sit at the top of the hundreds of useless motherboard benchmarks published every chipset launch, because that will get more suckers to buy their overpriced shit than actual features ever will.

This is the only picture I can find showing an i9-9900K at a sane clock speed:
4.7GHz, 67C, with 146W power draw.
Again, the 9900K only runs hot because people make it run hot.

Attached: Screenshot.png (3525x1894, 2.55M)

>Again the 9900k only runs hot because people make it run hot
By expecting the kind of performance Intel quotes? Sure thing, Linus.

That pic is literally the performance Intel quotes, and it runs decently cool. Intel quotes 4.7GHz all-core and that is what is there. Intel didn't quote 5GHz all-core when you turn MCE on or set it that way; they only said 5GHz for 2 cores, which wouldn't make the chip run anywhere near hot.

what? Novideo has lower power consumption than Rayydeon, there's no lying required.

Failing so hard. Go read a book brainlet

Attached: Screenshot_20181108-072028.png (1440x2560, 305K)

>Again the 9900k only runs hot because people make it run hot
XAAXAXAXAX how retarded one can be

processors make noise these days?
lol

>Intel's chips for the longest time will go all the way to 95c which means they can get away with a lower TDP.
Yeah except the 9900k can crash during benchmarks even with coolers rated for 200w TDP. So what the fuck does TDP mean?

>BASED INTEL 9900K RAPES AIDS ridden PozZEN 2700(X) [email protected] 1.4v, even without overclock at lower TDP
Seething ayymdrones on suicide watch.

Attached: ______.png (1366x768, 508K)

This is why a 2700X has similar IPC to an 8700K. When the Zen 2 3700X launches at a stock 4.5GHz it will easily match or beat a 9900K at '5GHz' while using half the power.

A driver sits in device manager or the equivalent. What you are referring to is an OS patch.

See

Every time you add more cores on Intel 14nm it requires more power and thus generates more heat; you cannot overcome the laws of thermodynamics. The 9900K runs hot because it uses more power than the 8700K. Even if only 2 cores are at 5GHz, there are 6 additional cores running at a lower clock speed, and they still require power and generate additional heat in the process. Intel 14nm+++++++ was NEVER intended to run 8 cores that high; it is done ONLY to keep the performance crown away from AMD.
Meanwhile AMD designed their CPU for more than 4 cores from the ground up. That is why the AMD 8-core variant runs cooler and uses less power at a lower clock speed. The uArch is designed to give similar IPC to Intel at a lower total cost of ownership. It is also why AMD needs 7nm to get the clock speeds up on multicore CPUs while lowering the power (and thus heat) requirement.
AMD have known for years that in order to compete with Intel they would need to improve the design first, and were aware they would not beat Intel's best offerings, but they also knew that future iterations with a node shrink would match or beat them.
Unless Intel pulls a rabbit out of its ass, they are fucked for the next two years at least.
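The "more cores at higher clocks costs disproportionately more power" point follows from the standard dynamic-power relation, roughly P ∝ cores × C × V² × f. A quick sketch; every coefficient and operating point below is made up for illustration, not measured from any real chip.

```python
# Dynamic power scales roughly as cores * C_eff * V^2 * f.
# All coefficients and voltage/frequency points are invented for illustration.
def dynamic_power(cores, volts, ghz, c_eff=1.0):
    """Rough relative dynamic-power estimate (arbitrary units)."""
    return cores * c_eff * volts ** 2 * ghz

# Hypothetical 6 cores at 4.3 GHz / 1.20 V vs 8 cores at 4.7 GHz / 1.30 V:
p6 = dynamic_power(6, 1.20, 4.3)   # ~37.2 arbitrary units
p8 = dynamic_power(8, 1.30, 4.7)   # ~63.5 arbitrary units
print(round(p8 / p6, 2))           # ~1.71
```

33% more cores, but because the clocks and voltage rise too, power goes up around 71% in this toy example, and all of it comes out as heat.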

>time to render = TDP
The absolute state of Intel cucks.

Attached: 1541162910747.jpg (645x729, 42K)

>hurr durr AMD CPU BAD!
>hurr durr Intel CPU good!

Attached: d27.png (645x729, 75K)

even results for amd are displayed faster than inlel

ORANGE LOGO BAD