"I pick AMD b-because I don't want to be J-Jewed by nVidia for additional $50."

>"I pick AMD b-because I don't want to be J-Jewed by nVidia for additional $50."
>Pays twice as much for power consumption yearly as nVidia user.

Attached: 1455357803605.jpg (796x805, 115K)

Other urls found in this thread:

purplemath.com/modules/orderops.htm
techpowerup.com/reviews/NVIDIA/GeForce_RTX_2080_Ti_Founders_Edition/31.html
videocardz.com/newz/nvidia-to-certify-adaptive-sync-monitors-to-support-g-sync
twitter.com/SFWRedditImages

LOL AMDrones BTFO and didn't get near this thread.

>Pays twice as much for power consumption yearly as nVidia user.
Because our GPUs are constantly running at 100% 24/7, right?

AMD GPUs also have huge power consumption rates at idle compared to N.

fake edit: As a matter of fact, the AMD:NVIDIA ratio is even higher at idle than at 100% usage.

Everyone that uses amd lives with their mom so they don't have to pay for power. nvidia BTFO

B-BUT MUH GSYNC TAX REEEEEEEEE

>twice as much for power consumption
I thought Jow Forums Anons were good at math

I dont buy jewvidia out of principle. I could pay 2x for the same perfomance and will still buy amd.
Jewvidia is a shit, anti consumer company that will do everything to fuck gullible goys in ass and reach a monopoly on gpu market.

>Makes sure the ONLY competitor is dead because of $20 of electricity.

kys

Show us your nose.

Damn, you're such losers you'd literally bet on a dead horse to support competition.

Who is the loser here? You are giving money to people who actively do everything to fuck you over because they know they can. The ultimate cuck.

Damn, a whole 5 dollars a year more. I won't be able to afford that guac and chips at Chipotle or a bottle of beer at a bar.

Fuck me over how? You sure you're not just a seething AMDrone posting anti-Nvidia comments on Jow Forums for 5 rupees? AMD's top-of-the-line cards cost more and perform worse than the RTX 2060.
>inb4 muh G-Sync
Fuck off, freetard.

>nvidia
>actively hamstrings their opencl implementation to push CUDA
>open source nouveau driver is shit

>amd
>contributes to open source amdgpu driver
>has open source vulkan support

Attached: 1447274520553.jpg (728x690, 120K)

1060 makes my VR launcher drop frames, but RX 580 is smooth as silk

Ok FINE, I'll take the bait, if only because I know you're going to post this exact same thread every day for a month.

A Vega 56 under load compared to a 1070 consumes 78 more watts (measured by total system consumption), so in an hour of gaming the difference is 0.078 kWh. Let's say your average Jow Forums NEET spends 8 hours a day gaming; that means every day the Vega user consumes 0.624 kWh more than the 1070 user. In a year this adds up to 227.76 kWh. The average price of energy in the US is 12 cents per kWh, so our Jow Forums NEET Vega user pays about $27.33 more per year than the 1070 user. Do what you will with this information. (Idle power consumption was within 5 watts, not something I'm terribly worried about, as I power down my system when not in use.)
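
For anyone who wants to poke at the numbers, here is that arithmetic as a quick Python sketch; the 78 W delta, 8 h/day, and 12 cents/kWh figures are just the ones from the post above, not measurements of mine:

```python
# Reproducing the arithmetic from the post above.
extra_watts = 78           # Vega 56 vs GTX 1070, total system draw under load
hours_per_day = 8          # the NEET gaming schedule assumed above
price_per_kwh = 0.12       # quoted US average, USD

extra_kwh_per_year = extra_watts / 1000 * hours_per_day * 365   # 227.76 kWh
extra_cost_per_year = extra_kwh_per_year * price_per_kwh        # ~$27.33

print(f"{extra_kwh_per_year:.2f} kWh/year, ${extra_cost_per_year:.2f}/year extra")
```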

>no argument but muh AYYMD pays lip service to open soars
Like anyone cares except the particular foot-eating sect.

Attached: 1546789814501.png (398x468, 53K)

I'd need a card to idle at 200W 24/7 to pay an extra $50/year for electricity
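
For what it's worth, at the 12 cents/kWh US average used later in the thread, the extra idle draw needed to hit $50/year is well under 200 W. A quick check, assuming the card sits powered on 24/7:

```python
# Continuous extra draw needed to cost $50/year at $0.12/kWh.
extra_watts = 50 / 0.12 / (24 * 365) * 1000
print(f"{extra_watts:.1f} W")   # ~47.6 W
```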

thank you user

And how much is that for, say, 4hrs of gaming per day for both AMD and Nvidia?

Only rednecks, racists, & Nazis use Nvidia. Enlightened gamers choose AMD.

Attached: Normal Day in London.jpg (748x499, 397K)

>doesn't know what healthy market looks like
>actively roots for monopoly

seriously user, kys

If you don't run Linux, you have no say in it. Let actual users decide.

>inb4 muh G-Sync
>Fuck off, freetard
No, you fuck off you braindead moron. Look at the amount of freesync and gsync monitors on the market. Jewvidia is single handeadly fucking up monitors market just because those smelly jews refuse to support open technology (freesync) despite the fact they can. And brainded imbeciles like you eat this shit up and buy anything they release. Also lets not forget about telemetry built into drivers they use to make some extra money on the side selling data of losers like you.

You poor fucker

cheapest vega 56 i can see on newegg canada is $624. cheapest 1070 is $484

based on your math, the breakeven point for the power consumption difference is about 5 years. after that you should be buying a new card anyway, because by then the vega 56 will be costing you more overall.
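
Same thing with the prices quoted above; note the posts mix CAD card prices with the USD power figure, and this sketch just reproduces that arithmetic as-is:

```python
# Breakeven on the $140 (CAD) price gap vs the ~$27.33/year (USD) power delta,
# mixing currencies exactly like the posts above do.
vega56_price = 624
gtx1070_price = 484
yearly_power_delta = 27.33

breakeven_years = (vega56_price - gtx1070_price) / yearly_power_delta
print(f"breakeven after ~{breakeven_years:.1f} years")   # ~5.1 years
```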

> cheapest vega 56 i can see on newegg canada is $624
Well, there was a 56 on CU for 310 euros, Gigabyte OC something. Now it's 350 for a different card before taxes. Seems like Newegg jews you.

best card coming through

Attached: best card.png (1975x1175, 298K)

I pick AMD because I'm a Linux user.

Attached: Screenshot_2019-01-07_21-08-32.png (1920x1080, 1.95M)

Jow Forums anons are too stupid for math, that's why they studied computer "science". not me though, I am a math patrician

>being a total brainlet and not undervolting his amd gpu to get 30% less power consumption
can't do that on your jewvidia.
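
If anyone actually wants to try undervolting on Linux, here is a rough sketch of the amdgpu overdrive sysfs interface. The state index and the 1590 MHz / 1000 mV values are placeholders, your card needs its own stable numbers, and the exact write format differs by GPU generation and kernel version (check the amdgpu kernel docs; overdrive usually has to be enabled via amdgpu.ppfeaturemask, and writing needs root):

```python
# Rough undervolt sketch for a Vega-class card on the amdgpu driver.
# Assumes overdrive is enabled and you are root; the values are placeholders.
from pathlib import Path

od = Path("/sys/class/drm/card0/device/pp_od_clk_voltage")

print(od.read_text())               # show the current sclk/mclk state table first

od.write_text("s 7 1590 1000\n")    # state 7 (top sclk state): 1590 MHz at 1000 mV
od.write_text("c\n")                # commit the new table ("r" resets to defaults)
```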

what's that? I can't hear you over the sound of my 2080ti

a math patrician?

solve this, then

2 + 3 * 2 = ?

>inb4 some american calls me a retard
8

yes, i didn't buy into rtx meme garbage. maybe next year when performance with this fake ray tracing bollocks isn't dogshit. thanks for beta testing. :D

>= ?
get out of here with that gay shit

Attached: 1535145245272.jpg (568x445, 27K)

Actually, they support freesync now, but they brand it as Gsync Compatible. And only 4 freesync monitors out of hundreds qualify.

2 + 3 = 5
5 * 2 = 10

blow it out your ass, math "patrician"
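
If it helps settle it, any language with standard operator precedence agrees with the 8 crowd; a trivial check in Python:

```python
print(2 + 3 * 2)      # 8  -- multiplication binds tighter than addition
print((2 + 3) * 2)    # 10 -- the left-to-right phone reading above
```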

why would I buy nvidia when their competing models are slower and more expensive than amd's at comparable price points

your card sucks poorfag
just buy nvidia
it just works

Wow, if you do it wrong you get the wrong answer.

i did it on my phone you nigger. you is saying you smarter than my $800 samsung?

>Also lets not forget about telemetry built into drivers they use to make some extra money on the side selling data of losers like you
Loosen up your tinfoil hat, gnucuck.

posted that from your iphone xs max, did you? get the fuck out.

>nvidiots can't into order of operations
colour me surprised

I got an RX 580 to go with a Ryzen 7 system I built last year, and I'm really regretting it because I don't actually play games.
Is this pretty easy to do?

Guess so, american: purplemath.com/modules/orderops.htm

>AMD GPUs also have huge power consumption rates at idle compared to N.
Nice fake news you've got there. Hoping nobody would bother to check?

techpowerup.com/reviews/NVIDIA/GeForce_RTX_2080_Ti_Founders_Edition/31.html

Attached: power_idle.png (500x890, 42K)

You literally, just this very moment, have a task called nvtmmon, which is short for Nvidia telemetry monitor, in your task scheduler, you lukewarm-IQ moron. This isn't even tinfoil, they are 100% open about mining their """customers'""" data.

*12 monitors, sorry.
Also, they will still be using Gsync branding instead of Adaptive-Sync or VRR lmao.

videocardz.com/newz/nvidia-to-certify-adaptive-sync-monitors-to-support-g-sync

Farmer Stallman owns 3 sheep. He buys another 2. If all of Farmer Stallman's sheep give birth to one lamb, how many sheep will Farmer Stallman own?

Farmer Stallman owns 3 sheep. If they all give birth to 1 lamb and then he buys another 2 sheep, how many sheep will Farmer Stallman own?

Attached: Radeon-RX-Vega-56-Efficiency.png (1016x622, 32K)
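
Assuming every ewe lambs exactly once and lambs count as sheep, the two phrasings of the riddle above are just the 2 + 3 * 2 gag in sheep form, ordered differently:

```python
# Assuming each sheep produces one lamb and lambs count as sheep.
print((3 + 2) * 2)    # phrasing 1: buy 2 first, then everyone lambs -> 10
print(3 * 2 + 2)      # phrasing 2: everyone lambs first, then buy 2 -> 8
```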

Add it to the firewall's black list if you're that afraid that Nvidia is going to know about your ergo collection, you drooling faggot. The process is being loaded with drivers, but it won't do anything if you don't launch GeForce Experience . Fuck off and die.

BODMAS

>twice as much for yearly gpu load power consumption*
so 10 bucks

No, instead I will buy a product from a company that respects me as a human being. I know this is too hard to comprehend for a loser like you who has failed at everything in life. Some people have self-respect.

>he does it for free

And that's what the inb4 was for

Okay, Rajeev. Here's your 10 rupees.

I'm glad there's no stupid Jow Forums user.

You forgot the important part though:
>Takes about 5 years for that extra power consumption to equal $50

Nice samefag, loser. I didn't even bother to mention how quickly you moved the goalposts when you realized my claim about telemetry was true. This was too easy.

>caring about power consumption costs

That might have been an issue if you had a room full of cards for crypto mining. That bubble burst, and Jensen is so desperate to save his own ass I bet he paid extra to have his conference before AMD.

Face it green team is in trouble.

Nvidia's opencl implementation is actually better than AMD's. Hell even Intel's was. AMD only recently released opencl 2.0 compliant drivers and I'm pretty sure their compiler is still fucked up, which is why they were pushing HIP.

I'm not sure where this bullshit that AMD is some great ally of open source comes from. All they have is their driver stack on linux and like a few dx11 samples. Intel unironically contributes like 10x what AMD does to FOSS, and to more agnostic projects. It's not like AMD has better opengl or opencl drivers either; up until recently they were a shit show and are still broken on windows

>7W more on idle
That shit will literally make you go bankrupt
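
For scale, the same 12 cents/kWh arithmetic puts a constant 7 W delta at roughly seven dollars a year, even if the box never sleeps:

```python
# Yearly cost of a constant 7 W difference at $0.12/kWh, running 24/7.
extra_kwh_per_year = 7 / 1000 * 24 * 365        # ~61.3 kWh
print(f"${extra_kwh_per_year * 0.12:.2f}/year") # ~$7.36
```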

I use AMD simply because they support freesync

Turn off that incandescent bulb, it is consuming the power equivalent of 10 AMD cards.

>All they have is their driver stack on linux
That alone puts them miles ahead of Nvidia. They're currently the only viable choice for discrete GPUs.

>he doesn't know
AAHAHAHAHAHAAHAH
OH NO NO NO NO NO NO

> up until recently they were a shit show and are still broken on windows
Can say the same about Intel tbqh. Their opencl drivers are still broken, and will stay broken, on 8.1; there's a warning hashcat throws about it.
Plus Intel makes proprietary drivers for their network cards, LAN and WLAN. Can't judge the rest, though.

you have to do it like this to bait someone
2-3*2=?

Intel has an entire department, OTC, just to work on the linux kernel. They're at least the biggest for profit corporate contributor iirc, maybe the biggest entity overall unless you aggregate others. I think they're also big donors to FOSS.

I'm sure AMD contributes but I've rarely seen special mention of how much. Most of AMD's contributions are just for their own technology and it's very limited in scope. The only agnostic contribution I can think of is giving mantle over to khronos. But mantle was co-developed with EA DICE's Johan Andersson who has gone on record saying he did the heavy lifting and was the one trying to shop around for OEM support.

There's three threads up right now about Radeon power consumption, two days before their 7nm CES presentation, when performance was typically what people complained about before CES. I think that's really interesting.

Attached: it's time.gif (500x500, 3.95M)

Anytime AMD is about to do something, the level of shitposting increases by orders of magnitude. Remember: according to Jow Forums in late 2016, it was flat-out impossible for ryzen to actually deliver the numbers AMD was showcasing.

Attached: hmmm.png (470x454, 11K)

/thread

>

Attached: 1544404296735.gif (290x218, 1.03M)

*gimps your card after 12 months*

heh, nuthin personnel goym, just upgrade to the latest series :^)

Attached: 1504823864507.png (1064x698, 337K)

I legit confirmed an actual paid shill for Intel back in 2015.

Nvidia feeling the pressure. Round one is this softball bait.

based

all these shills paid by int(c)el trying to undermine AMD because the 2060 is average at best and intel just dug their own grave by showing literally n o t h i n g at their keynote

save us mamma su

Attached: XZY4267.jpg (336x195, 17K)

So many ngreedia cope threads today, I see the 2060 is a disappointment.