OH NO NO NO NO NO NO NO NO NO NO NO

OH NO NO NO NO NO NO NO NO NO NO NO

AHAHAHAHAHAHAHAHAHAHAHAHAHAHAHA

Attached: performance-per-watt_1920-1080.png (500x810, 44K)

And it could have been a perfect card, but NVIDIA had to gimp overclocking on it.

If only it had more than 6GB :^)

I was always curious how much that equated to in wattage. Through a weird course of events I ended up with an OEM RX 570 that just happens to be dead even with my 1060 3GB in performance, except for the fact it runs 80°C at idle compared to my 1060 3GB's 35-40°C idle.

You cant get more performance for free goyin you must pay for it

80°C in idle? Did you put a fucking sock on it?

My 1060 idles at 35°C in my case just fine. Go cry to AMD.

lol who the fuck cares about how much power a card uses

my 1060 6gb runs 60°C idle, according to my computer sensors.

Let's say you play an average of 15 hours per day and use the graphics card for 6 years with an energy price of 0.5€/kWh. Then the 100W extra of the RX 590 will lead to an extra energy cost of
100W * 15h/d * 365.25 d/y * 6y * 0.5€/kWh = 1643.63€

You're literally burning that money away with AMD.
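
If you want to sanity-check it or plug in your own numbers, here's the same arithmetic as a quick Python sketch; the 15h/day, 6 years, 0.5€/kWh and 100W delta are just the assumptions above, not facts.

[code]
# Extra electricity cost of a card that draws 100W more, using the assumptions above.
EXTRA_WATTS = 100        # claimed extra draw of the RX 590
HOURS_PER_DAY = 15       # disputed assumption
DAYS_PER_YEAR = 365.25
YEARS = 6
PRICE_PER_KWH = 0.50     # euros, also disputed

extra_kwh = EXTRA_WATTS / 1000 * HOURS_PER_DAY * DAYS_PER_YEAR * YEARS
extra_cost = extra_kwh * PRICE_PER_KWH
print(f"{extra_kwh:.1f} kWh extra -> about {extra_cost:.2f} EUR")  # roughly the 1643.63€ above
[/code]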

>15 hours per day
>365.25 d/y
>0.5€/kWh

Attached: 1502493866196.gif (480x287, 1.21M)

Who uses a GPU for 6 years? I've had 4 in 6 years.

Who tf plays 15 hours every day? This isn't /v/, wtf is wrong with you people?

>Let's say you play an average of 15 hours per day
nigga what

lmao

based and alternative factpilled

try 12h a day of the pc being powered on, and half of that is gaming, so the other half is just idling. that's not even 300 bucks. also, a gpu stays relevant for 3 years at best, not 6

> gpu stays at best relevant for 3years, not 6
290X 4lyfe. Gave it away to my friend, still competitive with, say, a 1060 3GB.

>50 cents per kWh
do you live on some random Caribbean nation that has to import diesel to power small generators randomly scattered around the place or something?

>Let's say you play an average of 15 hours per day
I'm not a Korean BW player.

Attached: 1523380432959.jpg (6400x4075, 3.16M)

it's almost as if amd are made for poor people

Attached: ayymd.jpg (209x241, 9K)

cringe and angelpilled

> that just happens to be dead
> it runs 80*C idle
I'm sorry sir, I do not understand what that means. Could you kindly explain how a video card could be dead and run at the same time?

Serious question, how would a "Performance per Dollar" graph look?

Serious question, how would a "Performance per Dollar per watt" graph look?

it depends on your case air flow

That performance-per-dollar chart is useless since it completely ignores the electricity cost.

Attached: performance-per-dollar_1920-1080.png (500x850, 50K)

usingenglish.com/reference/idioms/dead even.html

Only a NEET who's never paid a bill in their life would believe that running any graphics card over another would have an effect on your electricity bill that'd come even close to offsetting the cost of a cheaper card. In before dishonest calculations based on using the card at 100% load for 24 hours a day.

It's not about electricity savings. GPU archs are limited by power, so efficiency directly affects peak performance if you normalize for power. 300W on AMD's arch (e.g. the VII) will not be anywhere near 300W on Nvidia's arch (e.g. the 2080 Ti).
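
Rough sketch of that point with made-up numbers (the 1.0 vs 1.8 perf/W figures below are purely illustrative, not benchmarks): at a fixed board power, peak performance is just perf-per-watt times the power budget.

[code]
# At a fixed power budget, peak performance = (perf per watt) * watts.
# The efficiency numbers below are illustrative assumptions, not measurements.
def peak_perf(perf_per_watt: float, power_budget_w: float) -> float:
    return perf_per_watt * power_budget_w

arch_a = peak_perf(perf_per_watt=1.0, power_budget_w=300)  # less efficient arch
arch_b = peak_perf(perf_per_watt=1.8, power_budget_w=300)  # more efficient arch
print(f"Same 300W budget, arch B ends up {arch_b / arch_a:.1f}x faster than arch A")
[/code]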

>intel are housefire xddddddd
>but its okay when its amd gpus

Yeah anyone who hand waves away AMD's shitty efficiency is missing the point. They will never be competitive until they can stop putting out housefire tier cards.

i still use my old 670, but that's because my 1070's dead and i painted it white so i can't rma it

When will we get newer GPUs?
Some not gimped by the RTX meme and with more VRAM. Basically the 30xx series.

>1643.63€
That's more than my total yearly electric bill.

My i9 laughs at those numbers.

2020, 7nm GPUs

Exactly 2 years from RTX launch in 2018

naturally, performance also doesn't matter if intel is winning
but if amd is winning then suddenly BTFO!

OUNOONONONOUNOUNUU-AHAHAHA!

HOW WE WILL RECOVER?!

IT'S SUPPOSED TO BE OUR YEAR

AHAHAHA OH NO NO NO NO AHAHAH NOOOOOOO AHA!

WE ARE ON SUICIDE WATCH AND BTFO'ED!

NO HAHA

> when I try to roleplay as Pajeet, I actually get kind and relevant answers
Thanks for the link, I could've guessed it from the context.
You're thinking OLAP cube.

>AMD 7nm
>50% the efficiency of 12nm
Can't make this shit up

Power isn't just about electricity cost, it's also heat: 500W dumped into your PC case and room requires a lot of cooling, which means more noise.
You also need to invest in a high-end PSU.

>this calculation with retarded assumptions of playtime a day for 6 years is more expensive than my 1 year electric bill
Imagine being this stupid/illiterate

To be fair game devs are fucking mentally and physically handicapped.

Attached: radeon_VII.jpg (1920x1080, 186K)

>single game with specific anti alliasing technique
>cherry picked as it gets

The point is game devs are lazy niggers who outta be flogged in a public square. Sure CMAA is less demanding but that still doesn't explain why it curb stomps a 2080ti.

good thing you went with a 1060.

Attached: vegalaunched.jpg (1920x1280, 271K)

I literally bought a 1050 Ti because it uses less power and therefore outputs less heat. I live in a hot place; running air conditioning when it's 40°C outside while having a 400W heater near your legs is a nightmare.

N-nooo, but vega!

Attached: Vega RISE UP.jpg (4928x3264, 1.08M)

Perf per watt ain't shit unless you're mining, who gives a fart

Isn't CMAA originally developed by Intel?

Some countries have retarded energy politics and have ended up with ridiculous electricity prices. Not all of us live in fucking Canada.

>80C idle
Nigga even my R9 290 doesn't get higher than 40 on idle and higher than 76 on load

Why the FUCK would anyone buy that card when you could just get a 2060

How?

Yeah well my 1060 idles at 240°C. Why would i lie on the internet? I'm not a shill.

>just because it uses lots of power it's all dissipated as heat

A computer doesn't do any useful thermodynamic work. It's all dissipated as heat.

Who buys products with a primary concern being 'performance per watt'? Nobody. People buy to meet their graphics settings demands. Budget is the next factor, then you have looks, power usage, and heat.

Literally nobody says "I chose my GPU because it had good performance per watt."

1080ti user here so you can't AMD/nVidia me

The price

And AMD GPUs excel when AA is lightly used or no AA is used; pure hardware power pushes AMD's numbers. Where does AMD begin to fail? Games with high CPU overhead, aka Nvidia-baked PhysX games like GTAV (they claim not to use it, but if you run GTAV with a dedicated PhysX card it performs much better). And Unreal Engine 4 games: it's basically an engine developed to market Nvidia cards. The Nvidia and Epic partnership is a decade-long relationship in the works.

What's the best budget GPU for 1440p gaming?

I used a GTX 260 for 8 years.

>Who uses a GPU for 6 years

I had my 8800GT for 8 years before upgrading. I've spent less than $1000 on GPUs in my whole life and I'm a 30yo boomer.

>I know absolutely nothing about how graphics are programmed or what GPUs actually do
>can only read misleading performance graphs
>calls other people retarded

Vega56 or 1070ti.

Maybe try going into the AMD software/Afterburner and manually setting the fans. My 390 would idle at 73+ because the fans wouldn't come on until it hit 80 for some odd reason. Still idled at buttfuck temps afterwards though.

PLS SIR THE RX 580 HAS FREE GAME JUST DO THE NEEDFUL ALREADY

Either you're lying out of your ass (wouldn't put it past an Nvidiot) or your GPU is broke.

Attached: 1543470392062.jpg (379x50, 16K)

>you play an average of 15 hours per day
>play an average of 15 hours a day
What the fuck is wrong with you? Not even full-time NEETs can average 15 hours of gaming every day
>energy price €0.5/kWh
Where the fuck do you live?

your pc is broken

Attached: Clipboard01.jpg (400x71, 7K)

>Power draw doesn't matter!

rtx 2060

I have an RX 580 8GB overclocked, and with fans at 0% AND 30°C ambient AND 1080p60 fullscreen it stays at 51°C. Change your paste and clean your coolers.

who gives a fuck about power consumption

The Radeon Cope™

Intel fanboys did until they started releasing 300W housefires. Nvidia fans do as long as you don't mention the 2080 or 2080 Ti, because they're the exceptions and allowed to pull 300-350W.

0.5 euros per kWh? Where the fuck do they pay that much? Here in Greece, where we have some of the most overpriced electricity in Europe, it's 0.09.

>because they're the exceptions
For the performance offered, the power draw of Nvidia high-end is perfectly reasonable.
Compared to what you get from AMD alternatives which are universally awful, they're positively sipping it.
The Radeon VII for example gets just over half the power efficiency of a 2080, and offers *less* performance.

>some of the most overpriced electricity
More like one of the cheapest. You should look at what Fukushima did to Germany's energy price.

Attached: Power-price.jpg (427x292, 25K)

I feel that. I live in Brisbane, and my mini-ITX home server, my gaming PC, and my gf's are all in the same room. You can feel the temperature rise by several degrees when you step into it.

france stronk
we need new reactors tho

lmao germcucks. Someone has to pay for Ahmed and his 12 children

What I don't get is, how come games that are made with consoles as a priority (AMD Polaris GPUs) still somehow favor Nvidia GPUs? Do they use a different engine for PC?

Real calculation here:
Buy a GPU that costs 100€ less than the Nvidia one but consumes 100W more.
At 0.15€/kWh, 100Wh costs 0.015€.
Play 6 hours a day, 20 days per month:
0.015 * 6 * 20 * 12 = 21.6€ per year more
21.6 * 3 = 64.8€
If you change GPU every 3 years you are still better off with the AMD GPU.
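
The same break-even check as a Python sketch, using this post's assumptions (100€ price gap, 100W extra draw, 0.15€/kWh, 6 hours a day for 20 days a month, 3-year upgrade cycle):

[code]
# Does the extra electricity ever eat the 100 EUR you saved on the card?
PRICE_ADVANTAGE_EUR = 100     # cheaper card costs 100 EUR less
EXTRA_WATTS = 100             # but draws 100 W more under load
PRICE_PER_KWH = 0.15
HOURS_PER_DAY = 6
DAYS_PER_MONTH = 20
YEARS = 3                     # assumed upgrade cycle

hours = HOURS_PER_DAY * DAYS_PER_MONTH * 12 * YEARS
extra_cost = EXTRA_WATTS / 1000 * hours * PRICE_PER_KWH
print(f"Extra electricity over {YEARS} years: {extra_cost:.1f} EUR")  # 64.8 EUR
print("still cheaper overall" if extra_cost < PRICE_ADVANTAGE_EUR else "not worth it")
[/code]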

Yeah I can't go over midrange parts because it's so fucking hot here.

Radeon VII would be great if it was like $500.
Now it's just a late 1080 Ti alternative.

Attached: 4f5.jpg (1000x714, 182K)

>10% performance uplift over 1060
>priced $220
>tfw bought GTX 1060 @ $160 new on sale

Skip this generation and move on to the next I guess. I don't want a side grade.

>15 hours a day gaming 100% load everyday
kek

More like 1 hour everyday on average.

an average gamer spends six hours per week gaming, there is no extra energy cost between mid-October and mid-March, and no gamer would use a card for more than two years, so 0.1kW * 6h/week * 105 weeks * 0.2€/kWh = 12.6 euros is a much better estimate for a gamer.

>there is no extra energy cost between mid-October and mid-March
I forgot about this; including it, the correct amount would be around 7 euros.
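
As a sketch, with the heating season modelled as simply dropping those weeks from the bill (the ~22 heating weeks per year for mid-October to mid-March is my own rough assumption; the rest are this anon's numbers):

[code]
# Cost estimate for a typical gamer; during the heating season the card's
# waste heat just offsets home heating, so those weeks are treated as free.
EXTRA_KW = 0.1
HOURS_PER_WEEK = 6
WEEKS_OWNED = 105          # roughly two years
HEATING_WEEKS = 22 * 2     # assumed mid-October to mid-March, per year of ownership
PRICE_PER_KWH = 0.20

billable_weeks = WEEKS_OWNED - HEATING_WEEKS
cost = EXTRA_KW * HOURS_PER_WEEK * billable_weeks * PRICE_PER_KWH
print(f"about {cost:.1f} EUR over the card's life")  # ~7 EUR vs 12.6 EUR ignoring heating
[/code]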