Decide to leave computer on for the night

>Decide to leave computer on for the night
>Wake up in the morning
>Bedroom is noticeably warmer than the rest of the house
Thanks AMD

Attached: 15586632.jpg (1600x1600, 120K)

>keeps you warm at night
cozy

>poor insulation in room
>switch over to windows, boot up ridiculous rainmeter skin
>Room warms up in less than an hour
Based AMD

I mean, it draws very little power at idle, but the card still sits at a comfy 30°C, and that heat gets dumped out of the case into the room

Let me tell you a story about an architecture named Fermi and the wood screws.
tl;dr - many PCs melted.

>30c is hot
wat

Sounds like a good winter card

Found the shill

>what is delta over idle
>what is actual net heat output instead of just temp
Protip: a chip sipping power with a temp just a few degrees over ambient isn't going to significantly contribute to warming your room. Your body is putting out an order of magnitude more heat.

Like ten years ago I had an overclocked i7 920 and three GTX 260s in triple SLI. I actually could heat my room with it, furmark + prime95 used like 1100 watts.

>ok when nvidia does it

Attached: higherthanvega.png (822x399, 964K)

>30°C
Give me that in F and we can talk

That's 6.6 C lower than normal human body temperature : ^ )

topkek

Novideo drones are always the same
They kept their mouths shut until the GTX 600 series was released

Your core temp is a function of your cooling solution. What really matters is power draw, because 99% of that power draw is being put out as heat. If you're drawing 260W (typical for a Vega 64), you're putting out about 250W of heat.
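The power-equals-heat point above can be turned into a back-of-the-envelope number. A rough sketch, under assumptions that are mine and not from the thread: a sealed 40 m³ room, all the heat going into the air, and zero losses through the walls:

```python
# Rough estimate: how fast a given heat output warms a sealed room's air.
# Assumed values (not from the thread): 4 m x 4 m x 2.5 m room, no heat
# loss through walls or furniture, standard air properties near 20 C.

AIR_DENSITY = 1.2          # kg/m^3, air at roughly room temperature
AIR_SPECIFIC_HEAT = 1005   # J/(kg*K), specific heat of air

def room_warming_rate(power_w, volume_m3):
    """Temperature rise rate in K per hour for a given heat input in watts."""
    air_mass = AIR_DENSITY * volume_m3          # kg of air in the room
    k_per_second = power_w / (air_mass * AIR_SPECIFIC_HEAT)
    return k_per_second * 3600

room = 4 * 4 * 2.5  # 40 m^3
print(f"~250 W of GPU heat: {room_warming_rate(250, room):.1f} K/h")
print(f"~16 W near idle:    {room_warming_rate(16, room):.1f} K/h")
```

Real rooms leak heat constantly, so the actual rise is far smaller than this no-loss figure, but it shows why a couple hundred watts of load is noticeable overnight while a near-idle card is not.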

but where is the 10w going?

To move the fan.

that ends up as heat anyway
apart from the sound escaping the room and, if you have the window open, the fan "wind" going outside - pretty much nothing leaves
the most you might actually lose is to ebin bright LEDs

16 watts is about 1% of a standard 1500 W room heater's output. So stop LARPing bro.

Attached: dialog4_rlt.png (1280x960, 524K)
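The wattage comparisons being thrown around in the thread can be lined up in a few lines. The 1500 W heater and the ~100 W resting-human figure are my assumed reference values, not numbers from the thread:

```python
# Compare the heat sources mentioned in the thread against a space heater.
# Reference values are assumptions: a typical space heater (~1500 W) and
# the common textbook figure of ~100 W for a resting human body.

sources = {
    "near-idle GPU (claimed)": 16,
    "resting human body": 100,
    "Vega 64 under load": 250,
    "space heater": 1500,
}

heater_w = sources["space heater"]
for name, watts in sources.items():
    # Express each source as a fraction of the heater's output
    print(f"{name}: {watts} W = {watts / heater_w:.1%} of a space heater")
```

Which backs both sides of the argument: a near-idle card is roughly 1% of a heater, but a card under sustained load puts out more than twice the heat of the person sitting next to it.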

Mine only uses 150 watts