Does TDP even mean anything anymore?

Attached: AMD-Ryzen-9-3900X-and-Ryzen-7-3700X-CPU-Review_Power.png (950x217, 94K)

>i9-9700K
>Ryzen 9 3700X
what retard made this spreadsheet

What's wrong with it

Also, that's total system power consumption, but your point still stands: it's obvious the new high-end Ryzens draw in the 150-200W range.

Intel i3 9900K
Ryzen 9 3200G
What's wrong with it

TDP was always about heat (thermal design power... duh). I have no idea why people thought it has something to do with power consumption.

What pajeet took these measurements?
On the torture loop over at Tom's "we advertise Intel here" Hardware, the 2700X topped out at 215 watts.

>the second law of thermodynamics is a lie

They'd have to have 0% energy efficiency for the heat output to match the consumption, though. TDP is a totally unreliable measure of power consumption due to differences in process and architecture.

I guess you can make a very generalized assumption that if the TDP is lower, power consumption will be lower for chips of the same generation at full load (since you're wasting less energy on heat)
But to actually equate this to a numerical value of power consumption: yeah, pretty useless

Did it ever mean anything with regard to power draw?

>8 cores consuming more power than 12 cores
wew intel

OK, OP.
I was gonna go from a 2700X to a 3800X since it's the same TDP, but I'm not sure the VRMs on my X370 board will like that, so I'm downgrading to a 6-core 3600X.

I expected 3700x to draw slightly less power.

I expected much less.
In fact, I expected 3800x to line up with 2700x.
Explains a lot about x570 motherboards, really.

Germans

TDP is a metric for how much heat load the cooler needs to handle. TDP and electric power use are related, but different.

For example:

Chip A uses 250 watts on average.
Chip A needs to run at 65C; 80C causes enough damage to the transistors over a 5-year period that the chip fails, so I don't want Chip A to go beyond 65C.
The heatsink needs to draw away X watts to maintain T temp.
My chip hits 80C if the cooler only draws away 175W of thermal heat, so I rate the cooling for 225W TDP and the chip reaches ~65C.

Chip B uses 250W too, but is designed to handle 85C without damage.
The heatsink only needs to draw away 185W of thermal power for the chip to run at 85C.

So the processor's power can be the same, but depending on what the chip requires to be stable without damage, TDP rating changes.
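To put rough numbers on that, here's a minimal sketch of the usual lumped steady-state model, T_junction = T_ambient + P × R_theta. Every figure in it (the 250W draw, the 65C/85C limits, the cooler ratings) is a made-up illustration value, not a spec for any real chip or cooler.

```python
# Minimal steady-state thermal sketch: same electrical power, different
# allowed junction temperatures -> different cooling requirements.
# All numbers are illustrative placeholders, not real chip/cooler specs.

def required_thermal_resistance(power_w, t_max_c, t_ambient_c):
    """Worst-case junction-to-ambient thermal resistance (C/W) that still
    keeps the chip at or below t_max_c while dissipating power_w watts."""
    return (t_max_c - t_ambient_c) / power_w

def junction_temp(power_w, r_theta_c_per_w, t_ambient_c):
    """Steady-state junction temperature for a given cooler."""
    return t_ambient_c + power_w * r_theta_c_per_w

ambient = 25.0   # C
power = 250.0    # W, same for both hypothetical chips

# "Chip A" must stay at or below 65C -> it needs a much better cooler.
print(required_thermal_resistance(power, 65.0, ambient))  # 0.16 C/W
# "Chip B" tolerates 85C -> a weaker cooler is acceptable.
print(required_thermal_resistance(power, 85.0, ambient))  # 0.24 C/W

# With a 0.20 C/W cooler both chips dissipate the full 250W at equilibrium,
# but they settle at 75C: fine for Chip B, out of spec for Chip A.
print(junction_temp(power, 0.20, ambient))                # 75.0 C
```

Same power draw, different temperature ceiling, different cooler rating, which is the point being made above.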

Energy efficiency of what? The actual 'useful' energy that's needed to carry out a computation is completely negligible compared to the consumption of today's processors and it's still ultimately heat.

theswissbay.ch/pdf/Gentoomen Library/Extra/Richard_P._Feynman-Feynman_Lectures_on_Computation__-Addison-Wesley(1996).pdf#page=153

>Cinebench R15
>2700x 153w
>3700x 225w
uh?
If that's the case, 2700x is much more power efficient.

Full system measurements?

No, user, we're talking about little platters of silicon. The closest you are ever going to get to a difference in readings would be if one chip was made out of a significantly greater amount of silicon.

Unless you think architecture somehow affects physics and chemistry.

You know, the more I look at this thing, the more it looks fake and gay.
We'll see on Sunday.

They're literally like 1% efficient

For fuck's sake.
A cpu is a resistor.
U = R·I
P = R·I²

You want to minimize I to not hurt your silicon.
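As a toy illustration of that P = R·I² point (the resistance and current values below are arbitrary examples, not measurements from any CPU):

```python
# Toy numbers for P = R * I^2: doubling the current through the same
# resistance quadruples the dissipated heat. Values are arbitrary examples.

def ohmic_power(resistance_ohm, current_a):
    """Power dissipated in a purely resistive load: P = R * I^2."""
    return resistance_ohm * current_a ** 2

print(ohmic_power(0.01, 50.0))   # 25.0 W at 50 A
print(ohmic_power(0.01, 100.0))  # 100.0 W at 100 A
```

Hence the point about keeping I down.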

I'm just trying to use simpler terms to describe the relationship between electric power use and thermal power removal. People don't care and don't really need to know the maths for calculating resistance and heat transfer.

Simply put: A chip runs at X temp using Y watts, the heatsink needs to dissipate Z thermal watts to maintain X temp.

>My chip hits 80C if the cooler only draws away 175W of thermal heat, so I rate the cooling for 225W TDP and the chip reaches ~65C

That's not how it works. A medium's capacity for heat dissipation changes as its temperature changes, which is why it's able to reach equilibrium at some temperature at all. If you didn't dissipate all the waste heat, it would just heat up forever like a resistive heater. The reason it stops at 80C is that at that point the CPU is transferring enough heat to the heatsink that the heatsink reaches a temperature at which it can dissipate all of the heat.

Thermal conductivity is not the same thing as dissipation. I'm not sure why newcomers to PC building find this so difficult to understand; the enthusiast/watercooling community has espoused it for two decades, but we still get memes like MUH ALL COPPER COOLERS. The only point at which thermal conductivity matters is saturating the heatsink as quickly as possible, and even that is bottlenecked by the IHS anyway, which is why some overclockers delid even soldered dies.
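A crude lumped-capacitance simulation shows the equilibrium that post is describing: the die/heatsink keeps warming up until it's hot enough relative to ambient to shed exactly the power going in, then stops. The 200W input, 0.25 C/W cooler, and 50 J/C thermal mass below are placeholder numbers, not real hardware data.

```python
# Lumped thermal model: temperature rises until dissipation equals input.
#   dT/dt = (P_in - (T - T_ambient) / R_theta) / C_thermal
# All constants are placeholders chosen for illustration only.

def simulate(power_w, r_theta, c_thermal, t_ambient, seconds, dt=0.1):
    temp = t_ambient
    for _ in range(int(seconds / dt)):
        dissipated = (temp - t_ambient) / r_theta      # W shed to ambient
        temp += (power_w - dissipated) / c_thermal * dt
    return temp

ambient = 25.0
# 200W into a 0.25 C/W cooler settles near 25 + 200 * 0.25 = 75C,
# at which point it sheds the full 200W -- it does not heat up forever.
print(round(simulate(200.0, 0.25, 50.0, ambient, seconds=600), 1))  # ~75.0
```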

Again, I was just being simple.

A 90C device is losing heat to a 20C environment faster than a 70C device is to a 30C environment, I know. Thanks.

Thermonuclear Destroyal Probability

Literally house fire tier.

>amd has high tdp
yes

>intel has high tdp
no

This all looks fake and gay.
Some power figures don't make sense (2700x cinebench 153W)
And even the CPU naming is wrong.
Also, Germans.

No. Every vendor measures it differently so you can't directly compare the numbers, and Intel cheats so much it's useless as an approximation too.

For the brainlets
Intel can't into dissipating heat
AMD can into dissipating heat

It's a system benchmark, built around different CPUs. The CPUs are just a component of the numbers, not the whole number.

This. Intel's 9900k needs a 300w cooler to operate over 4.7 jiggahurts

But no.
I know for a fact my CPU alone (2700x) draws 140W on cinebench.

There are like 4 different ways to measure TDP now.

The "old standard" is how much heat the cooler needs to handle for the chip to stay at operating temp.
The "new Intel standard" is the average chip power without a full AVX load.
Another standard is "total card power" or "typical board power".
And another is "typical graphics power", similar to Intel's average-power-but-not-at-full-load figure.

Most people don't run their computers out in the open in Alaska. The chip+heatsink+room is effectively one system, and eventually you'll have to dissipate whatever the chip draws. This turbo boost meme garbage that lets the chip draw whatever it wants until it throttles is incredibly deceiving.

Mine draws 130W. My workload is mostly compute, while the others either use the GPU for gaming or for hardware acceleration. Which is why these figures are scary. My system can't handle a 200W CPU.

But why would the 3700X draw 225W in the same test?
Fake and Gay.

Maybe it's AVX2

Not on CB R15.

Maybe zen 2 is shit

But we had the CB demo of an 8 core going toe to toe with a 9900k at this same test.
And total system consumption was even less than a 2700x alone.

That was R20. In any case, they're at or above the 2700X's power level in all of the tests, and the 3900X is even higher, close to the 9900K. It seems strange that they would fuck up all of their tests like that. Something screwy is going on.

The goddamn OP picture says R15
The 8-core demo was R15.

Those numbers are pretty strange and they don't make much sense to me, like the 3700X and 2700X.

In game tests it might make sense for a 3700X system to use more power than a 2700X system, because the games will run faster, so the graphics card will draw more power on average and the whole system might end up drawing more. However, the numbers are pretty close between the 3700X and the 2700X despite the GPU almost certainly drawing more, which would mean that in a game the 3700X probably uses less than the 2700X.

The CPU test results, however, contradict that. The game results suggest the 3700X would likely be more efficient, but the CPU tests show the exact opposite. How the fuck does that work? Something strange is happening; maybe the 3700X has much more aggressive boost, or there was something wrong with their 3700X setup, like broken GPU drivers keeping the card from idling properly, or some shit going on with the new chipset, assuming this was even tested on X570.

Yeah, as I said, it's fake and Homosexual.