
Is there any reason to get an Nvidia GFX card over an AMD one when AMD has an affordable line of cards that makes Nvidia overpriced in comparison?

Attached: nvey-goy.jpg (2637x1992, 1.01M)

Other urls found in this thread:

gpu.userbenchmark.com/Compare/Nvidia-GTX-1060-6GB-vs-AMD-RX-580/3639vs3923
youtu.be/Gz7rfuQZBAc?t=589
steamcommunity.com/sharedfiles/filedetails/?id=358044987
twitter.com/NSFWRedditGif

If you have a shit PSU, or have faith in the future of leather jacket man's underhanded tactics. Otherwise, no.

Nvidia actually works.

it also makes you work, by having to afford those pricey memecards

if you care about gaming performance

Sneed

dem sweet GEFORCE leds

literally no reason whatsoever. went with an rx580 instead of 1660 and it was the right choice

AMD HOUSEFIRE BTFO

Can't make money mining with an RX 580 because of the high power usage.

>DgVoodoo (for pre-2005 DX1-8 games, the latest releases are broken on AMD)
>OpenGL actually performs halfway decently
>DX11 games run WAY better
>Low power consumption
>Metro Exodus
>Now has Freesync
>Far lower driver overhead for basically everything (a godsend for 60Hz+ gaming)

Unless you play the two games optimized for Radeon GPUs, Nvidia IS the superior choice. Before, AMD had Freesync up its sleeve, but now Nvidia finally realized it can't force every single display manufacturer to add a $100+ G-Sync module, and added Freesync support to its own GPUs as well (albeit only for the GTX 10-series and up).

The 570 and 580 are GOOD cards, but they demand Sandy Bridge IPC MINIMUM to run games at steady framerates, and they eat up twice the power because AMD can't be bothered to tweak them for any sort of efficiency. If Vega doesn't deliver, Radeon might actually become irrelevant in the consumer market. They can't keep selling 4 (technically 7, if you look at ALL of GCN) year old tech forever.

Attached: nvidia-g-sync-monitor-stack-comparison.png (3765x1896, 155K)

Your choice:
>RX 570
4GB
$130 or less
fine wine
add $20-30 for the 8GB model
>GTX 1060
3GB
$180 or more
gimped gayming drivers
add $20-30 for the 6GB model

Not so hard to make the right choice.

Attached: 1424464499784.jpg (872x960, 101K)

I'm using a 2600X and have steady framerates in every game I play, although I do play at 1080p. Couldn't care less about power draw; I don't notice any difference in heat or my electricity bill after upgrading from a 7700K and 1050 Ti. Sounds like a whole bunch of sour grapes to me.

>Nvidia IS the superior choice
>they demand Sandy Bridge IPC MINIMUM
Gee, I wonder who could be behind this post.

Attached: j.jpg (380x479, 45K)

gpu.userbenchmark.com/Compare/Nvidia-GTX-1060-6GB-vs-AMD-RX-580/3639vs3923

Imagine paying extra money for the same or less performance. Nvidiots will tell you MUH HEAT while burning a fat hole in their bank accounts for lesser hardware.

Try playing older games, especially at 144Hz, on Radeon... you'll start tearing your hair out, and that's IF they actually boot (thanks, Crimson!). Also, the 2600X is a good CPU, but Windows 7 support on Ryzen (many games won't run on 10, or require extra steps) is shaky.

Pic related: Max Payne 2 running on an FX 8300 + RX 460... on a 144Hz monitor. The framerates in many older games fucking collapse on Radeon (+ mediocre CPUs) whenever there's even a hint of action.

Attached: MP2.jpg (1920x1080, 322K)

>2600 + RX 570
>CS:GO stable over 500 FPS
also
>FX
nice try

Attached: 1458597381222.jpg (300x300, 21K)

>eay

Attached: confused jeb.png (662x800, 443K)

Until you remove any GameWorks effects; then they run like a literal housefire trying to keep up.

If you aren't running a 240Hz screen or a 4K screen, just get an RX 580... bottom line.

And power efficiency, which is just horrible for AMD.

Actually, I have an RX 570 and I use DgVoodoo to play MechWarrior 3 and it works fucking fine, so I don't know what you're talking about, really.
Driver overhead is a meme, btw; Shadowplay/GeForce Experience has always used more CPU/RAM than ReLive does.

b-b-b-but ...power...c-consumption...

Attached: 1447539610477.jpg (250x279, 16K)

CUDA

>FX
>144Hz
Pick one and only one.

This is the only real answer, but there are no developers on Jow Forums (/v/*)

This rule only works to an extent, though. I would agree that Nvidia would be the worst choice every time if AMD had something to compete with, say, the 2080 Ti. If you're looking for the absolute best performance, then Nvidia still wins.

>intel is a housefire
>muh ryzen power efficiency!!111
but when it's about AMD GPUs then suddenly
>Efficiency doesn't matter!
Kek, those AMDrones never change...

I don't care so much about the extra cost of electricity, but the fucking heat you have to deal with from the card, and the coil whine on Vega cards, absolutely kill it for me. I have a quiet build and it's so fucking annoying to hear a high-pitched whine that modulates all the time.

For 1 week, then it breaks.

The only non-overpriced AMD cards are the RX 570 and RX 580; everything else is overpriced.

>the RX 570 and RX 580 require Sandy Bridge IPC minimum
True, but who in the world would pair either of those GPUs with a CPU older than Sandy Bridge?

Based truth spitter.

>I-It's horrible!!!
>reality: 180W vs 200W cards

Having a GPU that doesn't eat 411W and still doesn't even reach 2080, let alone 2080 Ti, performance.

Seems to matter to you a whole lot when you're talking about Intel CPUs.

AMD hypocrisy never ceases

They run at a reasonable temperature.

Exactly. With CPUs, power consumption is irrelevant, but it's the most important thing when choosing a GPU. When will they get it right?

Reality: 120W (GTX 1660) vs 180W (RX 580).
Guess which runs 10 degrees hotter.

Only if you're rich and want the absolute best performance or if you care about the power efficiency meme

I get free games from Nvidia sometimes.
My card came with BFV and I just got Shadow of Mordor for a driver update.

>10 degrees
Woah.
That's like 1/7th of the temp difference between intel and amd cpus.

>or if you care about the power efficiency meme
Power efficiency is just a meme. Truly great thinkers we have here on Jow Forums.

Only the 570 has noticeably better price/performance.

Attached: u3CGDfH.png (577x336, 12K)

>B-B-BUT MUH CPU!11
nice cope

If you are not a NEET, the price difference between Nvidia and AMD is so minor that the fact Nvidia just works makes it totally worth it over AMD.

Is that actually measured by the same external thermometer for both cards? You can't trust what the GPU itself reports because the temperature of the GPU varies in different parts and different loads. If the built-in temperature sensor is near a hot spot, it's going to report higher temps. You also don't know how the temperature sensor is calibrated. Two different sensors can report wildly different temps if they are not calibrated against the same reference source.

Freesync is broken on novideo, and leatherman doesn't want to properly support it because they want to sell G-Sync.
RTX memes actually make games look worse.
Novideo drivers are unstable and have many more problems.

If you have self-respect and want a superior experience, buy AMD. If you're building a small form factor machine for high performance, you have no choice but to go with novideo. But you will have to tinker with unstable drivers and spend time researching the market and picking a display that will somehow work properly with novideo.

You are reaching levels of bullshit that shouldn't be possible. GCN handles old games much better than Maxwell+.
>Windows 7 support on Ryzen is shaky
Stop fucking lying on the internets, retard.

>Freesync is broken on novideo
Not really, most monitors work fine, just not all of them.

The other way around: most displays don't work.

And your proof of this? Because I have looked into this a decent amount, and what I found is that most work, with some having issues.

what

Attached: 1558492626782.png (852x2000, 132K)

youtu.be/Gz7rfuQZBAc?t=589

Attached: 221.jpg (489x797, 86K)

I mean your graph has no info to it and shows completely different results. I'd rather believe these guys because they're explicitly showing that they've used the same system for all the GPU testing.

Also, the 2060 only using 164W? Lol, did they read the values out of GPU-Z?

>I mean your graph has no info to it
Which info would you like?
>I'd rather believe these guys because they're explicitly showing that they've used the same system for all the GPU testing
Who are "these guys"?

>Also, the 2060 only using 164W? Lol, did they read the values out of GPU-Z?
That's actually what the 2060 is rated at, 160W. Don't confuse it with the Vega architecture, which is garbage in terms of efficiency.

Attached: 1555882405155.png (2074x1763, 301K)

CUDA

Seriously, why isn't OpenCL more common in the 3D world?

Notebookcheck. Btw, the Crysis graph disproves your own point, with 336W here and 228W in the other test with the X in the background. Idk man, they don't really show a coherent argument. Some don't feature brand names, while every single one of them won't show the GPU or PSU used in the testing.

CPU used*

You're also missing average, max, or min rates in most of these graphs.
Every single one of the graphs that won't show the game or games tested could be falsified by being CPU-bottlenecked.

steamcommunity.com/sharedfiles/filedetails/?id=358044987

Did you try that for the DgVoodoo issues, user?
Please report back.

>Every single one of the graphs that won't show the game or games tested could be falsified by being CPU-bottlenecked
Sure, but if you are willing to believe those sites will lie about what CPU was used, then why can't they just straight up lie about the numbers?

I don't think some of those were falsified on purpose. I believe they're unknowingly falsified or have parameters I can't see. They could test StarCraft 2, CS:GO, or other games, and of course the values would shift completely. The mid-range cards would look like they use way too much power for the kind of performance they're offering, and the high-end cards would look like they're super efficient.

Fair enough, but we are talking about POWER CONSUMPTION.
Not video game performance.
You want to tell me they heavily skewed the results based on special games they used? That only made AMD cards look worse, but not Nvidia?

>Forgot pic related.

They could be CPU-bottlenecked, like here for example. There are more games which have an FPS cap at 300 or 200 FPS; some of these titles are very mainstream, like OW, CS:GO, Rocket League...

Attached: 5.png (862x585, 54K)

I like the software that comes with Nvidia, like Shadowplay and Ansel. Using the CUDA toolkit for TensorFlow is a lot faster too.
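(Not from the thread, just a minimal sanity check you can run yourself, assuming TensorFlow 2.x with a working CUDA/cuDNN install: it only asks TensorFlow whether it can see the GPU at all before you blame CUDA for slow training.)

import tensorflow as tf

# List the CUDA devices TensorFlow can actually use; an empty list means
# training will silently fall back to the CPU.
gpus = tf.config.list_physical_devices("GPU")
if gpus:
    print("CUDA-visible GPUs:", [g.name for g in gpus])
else:
    print("No GPU found; check the driver/CUDA/cuDNN versions.")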

But my next card might be AMD, because they might be implementing driver-level integer scaling in the next year, so 1080p content isn't blurred when it mathematically should be "perfectly" scaled to my 4K monitor; Nvidia, on the other hand, has an 80-page forum thread that is still being updated, with a spokesman who said it wasn't on the radar. Also, every monitor I own happens to have Freesync support. The Sapphire cards also have removable fans that don't void the warranty, which sounds nice because I had to RMA my Zotac 1070 Ti for bad fans a few times.
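For anyone wondering why 1080p-to-4K is the "perfect" case for integer scaling, here's a rough illustration; the numbers are just the standard resolutions, nothing to do with any actual driver implementation:

# 1080p -> 4K is an exact 2x2 integer scale, so every source pixel can map to a
# clean block of four screen pixels instead of being smeared by bilinear filtering.
src_w, src_h = 1920, 1080
dst_w, dst_h = 3840, 2160

sx, sy = dst_w / src_w, dst_h / src_h
print(f"scale factors: {sx} x {sy}")  # 2.0 x 2.0
print("clean integer scale:", sx == sy and sx.is_integer())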

Attached: f0f237aac4e170f7d792eb0faf61747c4579242b_00.gif (320x180, 490K)

>You want to tell me they heavily skewed the results based on special games they used? That only made AMD cards look worse, but not Nvidia?
But they don't. My picture at least shows how the OC+ isn't worth it and how the 2060 Z Gaming is almost equal to the Vega 56 OC edition in terms of efficiency.

>My picture at least shows how the OC+ isn't worth it and how the 2060 Z Gaming is almost equal to the Vega 56 OC edition in terms of efficiency
? Can you please look up 5-10 different sources around the internet? I get the feeling you are experiencing some cherrypicking here.

I feel like you're trying to look up stuff that somehow proves your view on things while completely disregarding the quality or consistency of your sources.
I don't need to look up 5-10 sources (lol), I've got my well-documented one already.

>I feel you are trying to get proof by viewing many different sources on the matter, this is wrong
>I have a single source, I trust it completely, why would I need to look up viewpoints on the subject from other people, I already like the conclusion I found
...Yeah, if that's how you see things, I am not going to assume this discussion will bear any fruit. Feel free to believe what you want, as you are clearly dead set on keeping your view on the matter as limited as possible.

Depending on the game, it can work on Nvidia. But it NEVER does on Radeon.

DgVoodoo 2.55 is broken on AMD. 2.54 is the last working version (though in some cases it's 2.53), and that one's 2 years old at this point. Either the dev doesn't give a fuck or doesn't want to fix his app for AMD, and given that it's needed to play some older games on modern OSes, all the average person can do is stay on an ancient version or use Nvidia.

undervolt it retard

Efficiency only matters when it comes to CPUs, apparently.

I'm sure if we all push the dev with emails and requests to fix it, he will succumb.

AMD is good at the low end but has no high end.
The Radeon VII is not worth getting over the 2080; there is not a single rational reason for it.

>no high end
>Vega 56 isn't a high-end card you can buy for 269€ + a 3-game bundle

Attached: 56.png (1014x656, 80K)

Because their mid-range GPUs perform better than similarly priced AMD GPUs.

A midrange card that loses to the 2070 almost everywhere, but it's cheap now, yeah.

How is the 12th/13th fastest card on the market, which is capable of 60 FPS at 4K on high, not a high-end graphics card?

Nvidia has a monopoly on the machine learning market, which is what I'm doing my thesis in, so I'm stuck with Nvidia. Well, for training, that is. Inference is usually done on CPU, and Intel has the market cornered there.
I want off this ride, goddammit.
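The usual split looks something like the sketch below (PyTorch with a hypothetical toy model, not the actual thesis code): train on the CUDA GPU if one is present, then serve inference from the CPU.

import torch
import torch.nn as nn

# Train on the GPU when CUDA is available, otherwise fall back to the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = nn.Linear(128, 10).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

x = torch.randn(32, 128, device=device)   # dummy training batch
loss = model(x).sum()
loss.backward()
optimizer.step()

# Inference is usually served from the CPU, so move the trained model back.
model = model.cpu().eval()
with torch.no_grad():
    prediction = model(torch.randn(1, 128))
print(prediction.shape)  # torch.Size([1, 10])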

Performance

HOLY SHIT, I almost bought a Vega 64 over the 2060. Thanks for this, I almost made a mistake.

>vega 56 slower than 1660ti, ok

>when AMD has an affordable line of cards that makes Nvidia overpriced in comparison?
Considering the new RX isn't even priced competitively with the RTX cards, AMD just signed a suicide note.

But it's Nvidia buyers' fault AMD priced them like that!

I think Nvidia's software in general is better to deal with

but it probably doesn't justify the price difference

What is the best card for 144Hz AAA titles at 1080p, for Nvidia, and its AMD equivalent?

Vega 56 is $300 on Newegg and the 2060 is about $340, if the $40 makes a difference.

2080ti

1060/rx580
$200+ range maxes 1080p

>1050 Ti is $50 cheaper than the RX 560
>Performs better
Why would I touch substandard, overpriced AMD hardware?

Yes. Retard.

Raytracing/machine learning/rendering also makes a difference.
Not having coil whine and having the card be great out of the box also makes a difference.
Paying less for power yearly also makes a difference after a while.
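For scale, a quick back-of-the-envelope sketch; the wattage gap, gaming hours, and electricity price below are assumptions of mine, not measurements from this thread:

# Assume a ~60W gap between cards, 4 hours of gaming a day, $0.15/kWh.
watt_diff = 180 - 120
hours_per_day = 4
price_per_kwh = 0.15  # USD, varies a lot by region

kwh_per_year = watt_diff / 1000 * hours_per_day * 365
print(f"{kwh_per_year:.0f} kWh/year, roughly ${kwh_per_year * price_per_kwh:.0f}/year")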

Would you recommend a 1060 or a 580?

The 1660 is the better card in every way:
Performance
Thermals
Noise

jannie probably banned me

But they don't? There's no AMD card as good as the 2060 for the price.

Vega 56

Kinda unrelated, but now that Windows update 1903 is pushing out the latest DCH Nvidia drivers, does anyone know if these drivers provided by Microsoft have all the same telemetry and bloat as the drivers from Nvidia's site?