Why do AMD fanboys like burning their money?

I was about to buy an RX 580 when I found out that it has the same performance as the GeForce GTX 1060, but 2X THE POWER CONSUMPTION.

In what universe is that acceptable? I mean, it can't be possible that ALL AMD fans live with their mommies and don't have to pay their own utility bills.

Attached: giphy.gif (500x273, 500K)

My 580 Nitro+ draws 180W under load and 30W while resting.
I think the 1060 is at 120-150W, so not quite 2X.

nice personal anecdote. don't forget to grab your AMD gift cards by the end of the month

tomshardware.com/reviews/amd-radeon-rx-580-review,5020-6.html

tomshardware.com/reviews/nvidia-geforce-gtx-1060-pascal,4679-6.html

My vega is consuming 4w at idle. What is wrong with your 580?

"resting" that's cute user, do you wear programming socks?

Many AMD cards are known to be overvolted from the factory for some reason.

No idea.
I don't English well.

stop pretending that you pay bills OP
your mom pays that shit

AMDrones are literal sheep

AMD has better drivers. I bought a 1080 and Nvidia's drivers are shit compared to AMD's.

Undervolt it; my 580 uses like 120-130W under load with the same performance as before.

The funny thing about "MUH POWER CONSUMPTION!!!!" threads is that they always reveal the person making them as a NEET who has never paid a bill in his life and has no idea how much electricity costs, or how much common household appliances use.

Even if your post were correct and AMD cards did consume twice the power of their competition, you'd still be better off buying red and avoiding Nvidia's pricing. You also get better drivers for both Windows and Linux; the Linux ones are free software, so you get a no-fuss automatic install.
The number you get in GPU-Z and the like is core-only power usage; it even states that in the tooltip. The whole card does use more.

My 480 uses about 8-10W at idle with the stock BIOS and around 30W with a 580 BIOS. The temperatures are the same; the 580 BIOS misreports power draw for some reason.

I wasn't looking at GPU-Z but at WattMan, and it does the same. Good to know, thanks.

Seriously funny how much the anti-AMD spam on Jow Forums and reddit has ramped up the closer we get to CES.

If you knew what was good for you, you'd be fighting tooth and nail for the revival of the only other GPU manufacturer. Since you don't, you go around begging everyone to shoot themselves in the foot.

Future GPU monopoly with prices for idiots? No thanks.

Attached: I_Like_a_Little_Competition.jpg (501x670, 33K)

Wow, a whole $2 more to operate over 5 years, really breaks the bank.

>GPU monopoly with prices for idiots
have you seen RTX prices? It's already happening

amdgpu-pci-0b00
Adapter: PCI adapter
vddgfx: +0.85 V
fan1: 1635 RPM
temp1: +32.0°C (crit = +94.0°C, hyst = -273.1°C)
power1: 9.01 W (cap = 48.00 W)

Seems fine to me?
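If anyone wants to script against that, the sensors block is easy to parse. A quick sketch in Python; the field names are just assumed from the lm-sensors text format pasted above, not any official API:

```python
import re

def parse_sensors(text):
    """Pull voltage, fan RPM, temperature and power draw out of an
    lm-sensors style block for an amdgpu adapter."""
    out = {}
    patterns = {
        "vddgfx_v": (r"vddgfx:\s+\+?([\d.]+)\s*V", float),
        "fan_rpm":  (r"fan1:\s+(\d+)\s*RPM", int),
        "temp_c":   (r"temp1:\s+\+?([\d.]+)", float),
        "power_w":  (r"power1:\s+([\d.]+)\s*W", float),
    }
    for key, (pat, conv) in patterns.items():
        m = re.search(pat, text)
        if m:
            out[key] = conv(m.group(1))
    return out

# sample taken verbatim from the post above
sample = """\
amdgpu-pci-0b00
Adapter: PCI adapter
vddgfx:       +0.85 V
fan1:        1635 RPM
temp1:        +32.0°C  (crit = +94.0°C, hyst = -273.1°C)
power1:       9.01 W  (cap = 48.00 W)
"""

print(parse_sensors(sample))
```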

>1600rpm at idle
why

Please tell me how to have this. I'll post it in a couple of hours when I get home

Nvidia using less power than AMD? I call shenanigans. Typically Nvidia eats up a fuck load of power for the same performance.

?

Attached: Radeon-RX-Vega-56-Efficiency.png (1016x622, 32K)

It's not really at 'idle'; I've got 3 web browsers open and I'm watching ABC News 24.

You're on Linux?
It's just the output from sensors. Since 4.16 (maybe 4.17) AMDGPU has had power consumption reporting for Polaris. I'm on 4.18 (I really should compile 4.20BlazeIt).

Not everyone lives in a country that dedicates all the money that would otherwise be spent on free healthcare and college to an army that goes around the world stealing everyone's oil so that you can have cheap electricity.

tomshardware.com/reviews/amd-radeon-rx-580-review,5020-6.html

tomshardware.com/reviews/nvidia-geforce-gtx-1060-pascal,4679-6.html

now play a new game and compare against the competitor

see the references in

Still, my 480 Nitro spins at a similar RPM playing GTA Online.

Considering that Intel literally can't get 10nm to work right, and Nvidia shot themselves in the foot by rushing RTX before the GTX line was ready to accompany it at the midrange, this is all they can do.
All hail Ryzen 3, all hail stock 5GHz clocks.

Blame MSI for their shit coolers.
The card was cheap.

I don't use this card for gaming; it's for desktop usage.
A 1060 is probably equal to my 290X in performance and would use probably 4x less power doing it.
But the 1060 didn't exist 4 years ago.

Every company in every domain will charge more than its competitors if it delivers the best performance. Until AMD can match a 2080 Ti or a Titan RTX, stop complaining about prices.

No, I'm on Winshit.
The other guy is right about the fans though.
I believe it is fanless under 55°C.
Stock max fan speed is like 2200 RPM, and the max reachable speed is over 3000. At that point it sounds like a plane.

Does your card have any problems? My Vega won't downclock its memory on Linux, and sometimes not on Windows either. I never found a solution.

I just have an OEM AMD card, but I'd spend a bit more money on an AMD card because their open source driver on Linux is really damn good.

I'd only use an nvidia graphics card if I /needed/ cuda.

>Game
grow up, kiddo

Linux might've just been the old AMDGPU without memory reclocking?
On Windows, who knows. Keep in mind that if you're using the video decoder at all, it'll ramp your memory clocks to max.

>I was almost buying an RX 580 when I just found out that it has the same performance as the Geforce 1060, but 2X THE POWER CONSUMPTION.

The TDP of the 590 is ~25 watts lower than the GTX 770 it's replacing on my end, and that's under load.

Attached: D2iTbtWb.jpg (400x400, 33K)

Considering that my DC inverter AC draws 208W at full tilt and drops down to 170W after it reaches temperature, modern common household appliances don't use that much power. In fact, the only things that use more power than my AC are my hair dryer, which pulls 1200W at full tilt, and my microwave, which uses 900W on its high setting.

My water heater uses natural gas, my stove uses natural gas, my washing machine uses 0.0062 kWh/kg, and my fridge uses 288 kWh per year.
Heck, my computer hits 500W at full load, making it the most power-hungry thing in the house.
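Putting rough annual numbers on that: only the wattages come from the post; the hours-per-day figures and the $0.12/kWh tariff are my own assumptions, so plug in your own.

```python
# Back-of-the-envelope annual cost per appliance.
TARIFF = 0.12  # $/kWh, assumed; check your own bill

appliances = {              # (watts, assumed hours of use per day)
    "AC (full tilt)":   (208, 8),
    "hair dryer":       (1200, 0.2),
    "microwave (high)": (900, 0.5),
    "PC (full load)":   (500, 6),
}

for name, (watts, hours) in appliances.items():
    kwh_year = watts / 1000 * hours * 365
    print(f"{name:18s} {kwh_year:7.1f} kWh/yr  ${kwh_year * TARIFF:6.2f}/yr")

# the fridge is quoted directly in kWh per year
print(f"{'fridge':18s} {288.0:7.1f} kWh/yr  ${288 * TARIFF:6.2f}/yr")
```

Under those assumptions the PC at 500W for a few hours a day ends up the biggest single line item, which is the poster's point.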

tfw no restful gpu

running damage control already, leather jacket man? whats wrong?

>sapphire rx 480 nitro+
>15W idle
im fine senpai

>Geforce
THEY lower GFX settings in their drivers

It was funny going from a 390X OC that used 4x what my 1080 uses, to a Vega 56 that used less, and back again.
Polaris is garbage; it peaked with the 480 and the Xbox One X GPU.
Wait for Navi, or for 2060 prices to drop.

>not everyone lives in the first world
yeah i know, but the majority of consumers do.

I won't even get into the charts and graphs from reviewers who measured power draw. What I will ask is: what shithole country do you live in that power is such a concern? My most expensive power bill was during the summer with 2 window-unit ACs running, my server, my main PC, countless lights, etc., and I paid $70 that month. My average bill is $30 a month.

Not even an AMD card owner, but
>muh power usage
is such a stupid fucking argument. If that's your concern, why are you even looking at dedicated GPUs? Shouldn't you be trying to game on an iGPU?

>Polaris is garbage it peaked with the 480
this
480 is the only polaris gpu worth getting
also maybe 570 with current pricing

It is technically a concern due to cooling (and due to making your room hotter if your area gets hot). It's more of a concern for CPUs, because they're usually harder to cool, but GPUs with reference coolers are painful to listen to. Thankfully, custom cooler designs help a lot, and I've no complaints there with my Vega.

Of course it's happening. Buyers want to see those halo products and AMD's R&D department just cannot deliver them.

Also because AMD can't fix their OpenGL drivers. Their cards should have no problem with it, but for some reason AMD can't write good OpenGL drivers.

i've had an r9 390 for 3 years and it still holds up. also i live with my parents so power consumption doesn't matter

AMD processor from 2013, Nvidia graphics card

Attached: download.jpg (474x355, 14K)

I got an XFX RX580 for $190 from Amazon, did i fug up

Attached: stt.jpg (789x789, 97K)

>also i live with my parents so power consumption doesn't matter
lel

>2X THE POWER CONSUMPTION

post proofs

oh wait, you can't, i forgot

You can't go wrong with a 580.

everybody is a sheep

The 590 undervolts super well too. I was able to undervolt mine by 50mV without a hitch; now I have a card that's cooler and runs 3-4% faster than stock.
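For anyone wondering why a measly 50mV matters: dynamic power in CMOS scales roughly with V²f, so at a fixed clock even a small voltage drop compounds. A back-of-the-envelope sketch; the 1150mV stock voltage and 180W board power here are assumptions, not measurements, and real savings are usually bigger since lower voltage also cuts leakage and boost throttling:

```python
def dynamic_power_scale(v_stock_mv, v_new_mv):
    """First-order CMOS estimate: P_dynamic ~ C * V^2 * f, so at a
    fixed clock, power scales with (V_new / V_stock)^2."""
    return (v_new_mv / v_stock_mv) ** 2

STOCK_W = 180                                # assumed stock board power
scale = dynamic_power_scale(1150, 1100)      # a -50 mV undervolt
print(f"estimated draw after undervolt: {STOCK_W * scale:.0f} W")
# prints: estimated draw after undervolt: 165 W
```

That's only the first-order term; the 120-130W figures reported in the thread are plausible once leakage and clock behavior are factored in.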

All GPUs are a decision about when you want to pay extra: do you want to pay it up front, or in the tiny increase to your electric bill every month? The difference is about the same.

>power consumption doesn't matter

Attached: amdfanboy1235.png (653x726, 91K)

>tomshardware.com/reviews/amd-radeon-rx-580-review,5020-6.html
>tomshardware.com/reviews/nvidia-geforce-gtx-1060-pascal,4679-6.html

It doesn't have twice the power consumption, why are you lying?

And that's without taking undervolting or Radeon Chill into account.

undervolting can actually increase performance while dropping the power usage.

Attached: RX 580.png (909x340, 35K)

it didnt matter the last two decades

Depends. If you bought it new recently, then yes, because you can get one from miners for under $100.

The cheapest 1060 6 GB: 214,20 €
The cheapest 580 8 GB: 180,59 €
A 35 euro difference. How much time is required to break even on those 35 euros?
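The break-even is easy to work out; a sketch where the 0.30 €/kWh tariff and the ~60 W load-power gap are my assumptions, so swap in your own numbers:

```python
def break_even_hours(price_gap_eur, extra_watts, eur_per_kwh):
    """Hours of load needed before the cheaper card's extra power
    draw eats up its price advantage."""
    extra_kw = extra_watts / 1000
    return price_gap_eur / (extra_kw * eur_per_kwh)

hours = break_even_hours(35.0, 60, 0.30)  # 35 € gap, ~60 W extra, 0.30 €/kWh
print(f"{hours:.0f} hours under load")    # prints: 1944 hours under load
years = hours / (2 * 365)                 # assuming 2 hours of gaming a day
print(f"≈ {years:.1f} years at 2 h/day")  # prints: ≈ 2.7 years at 2 h/day
```

Under those assumptions you'd need years of daily gaming before the 1060's lower draw pays back its higher sticker price.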

>see vids of something called RX 590 popping up on youtube
>check it out out of curiosity
>it's yet ANOTHER 480 rebrand

I fucking hate AMD so much. The 7970 was the only good product they ever released.

>getting used cards
>from miners
m8

The RX 590 is a process shrink, so it actually is more efficient in performance per watt.
Of course it ships with way too much voltage and aggressive clocks from the factory, so it requires tweaking those.

there's a reason they're dirt cheap from miners bro, that shit will last like a year at most

Nice reddit gif, my friend.

There are rumors that the boomer just deleted all of his incoming mail at the end of every week, including unread emails. He was fired just recently.

Attached: 101969-mike-rayfield-formal.jpg (750x422, 78K)

The reason is that miners want to pull out and make some money from selling off the hardware, nothing else.
If the hardware was defective, it would have failed in the first months. If it survived a year of mining, it might as well last another 5 years of gaming.

you can easily spot the redditors by the unfunny wojaks or facebook frogs though