Is Vega 56 actually good or am I falling for a meme?

Attached: amd-vega-64-inhandmem1-4000x2670.jpg (4000x2670, 754K)

It's a good GPU for what it is.
Not the best, mind you, but worth the money if you can snag it for $400 or under.

Everything depends on the price, user

This

I'm getting a Vega 56, OP. I'd say go for it if it's cheap. Comparable to a 1070, or even better in some cases, but only worth it if the price is right.

Question for Jow Forums: I'm planning on switching to Linux with my new PC. What distro should I get that will ensure I have working drivers with the aforementioned Vega 56?

You're falling for a meme.

Depends on usage.
For the most part, AMD cards put out a lot of raw compute power. They excel in pretty much everything that isn't gayming; performance there can be a little underwhelming.
Nvidia cards are usually a good bit better at drawing triangles, which is most important for gaymes.

Arch

It's actually good. Games are the problem.

Attached: Forza Horizon 4.jpg (2560x1440, 343K)

>>/sqt/
Any distro with any shred of popularity will do. If it can take RPMs or DEBs it will work. The amdgpu driver lives in the mainline kernel, so any distro shipping a recent kernel and Mesa already has it.
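If you want to sanity-check it after install, here's a rough Python sketch (sysfs layout assumed from any modern kernel; your card index may differ):

# print which kernel driver bound each GPU; expect "amdgpu" for a Vega 56
from pathlib import Path

for card in Path("/sys/class/drm").glob("card[0-9]"):
    driver = (card / "device" / "driver").resolve().name
    print(card.name, "->", driver)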

It isn't hard to "excel" anymore. Even an old RX 480 or 580 can play any modern game at either 1080p 144Hz/FPS or 4K 60Hz/FPS with cranked-up settings.
So yeah, depends on the price.

It's good for macOS

I heard Nvidia can do freesync now, correct?

RX 580 release date: 18 April 2017. That's not old.

Read this as "I just bought one, reeee"

I want to buy one because freesync monitors are cheaper. Should I?

Attached: 1524169593242.jpg (560x564, 59K)

>hotter than a 1080
>consumes more power than a 1080 sli
>performs worse than 1080 even in DX12 titles
it's a niche meme gpu, get it if you already have a freesync monitor

Ridiculous benchmark, just like 99% of the bullshit ones posted here.

You really don't need "ultra quality" 2xMSAA to play Forza Horizon 4. The only reason to use those settings is to make a benchmark chart that gives the impression you need any of those high-end cards. You don't. Protip: the only thing you really need to pay attention to is the crossover from 60 fps to lower; it shifts with higher settings, but even if you fall for that stupid marketing bullshit and think you need ultra quality, you can still see that any card from GTX 1060/RX 570 and up is just fine.

This.

Probably not. Both G-Sync and freesync are basically irrelevant if the card can sustain 60 fps. You'd only need variable refresh rate if the frame rate keeps shifting between something like 30-50 fps. If you're playing on a 60Hz monitor (most are) and the card can stay above 60fps consistently, then it does not matter, and the Vega 56 can do that just fine.

I mean on a 144Hz monitor. I have a 60Hz right now and have always held 60+ fps but I'm guessing on a 144Hz you can feel when it shifts from 140fps to 110fps to 130fps etc

Only through an AMD GPU

Everything should just be smoother and snappier overall. You won't be able to tell any shifts unless you dip below a certain FPS or have wild jumps in FPS.

> Not really by that much though. GP104 is more power efficient but still a power hog.

> Sorry, 1080 SLI consumes way more power than a single RX Vega 64

> RX Vega 56 and RX Vega 64 are already outpacing the 1080 in newer and future titles, especially if you throw in HDR, which Pascal chokes on. Ironically, all of the optimizations for Turing will end up benefiting Vega.

>posting one of the 3 games on the market that favors Vega

Attached: JUST.png (1089x841, 757K)

Vega 56 is about 1070 performance. If you can get it cheaper than a 1070 it's worth it, otherwise just get a 1070. The only exception is if you plan on using freesync, although Nvidia seems to be supporting freesync with their most recent drivers.

Even if your average is always above 60 there will be occasional dips below 60, and adaptive sync can help with that. I agree that shouldn't be a major consideration if your card is OP for your resolution, but it is still a consideration.

>Used (no mining) ASUS Radeon RX Vega 64 ROG Strix OC for ~$514

Worth it?

Since I can't ask it in /pcbg/, I'll do it here.
There were rumors of Polaris 30 (a 12nm refresh) coming in a matter of weeks.

Anyone know anything about that, or is it bull?

Probably. Roughly 1080 performance + freesync for about the same price as a 1080. As long as you don't have to buy a new PSU for it that is.

>As long as you don't have to buy a new PSU for it that is.
I have this: amazon.com/Seasonic-SS-520FL2-Fanless-Platinum-ATX12V/dp/B009VV56TO

Will undervolting it help?

Undervolting has been shown to reduce power draw quite a bit but it will still draw ~300W at max load. That leaves ~200W for your CPU, motherboard, and all the other components. Depending on your CPU it's doable, but running a PSU at 100% for extended periods of time is definitely not good for it. A GTX 1080 would max out at around 230W.
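Napkin math, if you want it spelled out (the wattages are assumptions pulled from this thread; plug in your own parts):

# rough PSU budget; all numbers illustrative
psu_watts = 520        # Seasonic SS-520FL2
vega_peak = 300        # undervolted Vega 64 at max load (assumed above)
rest_of_system = 150   # CPU + board + drives + fans, depends on your build

total = vega_peak + rest_of_system
print(f"{total}W of {psu_watts}W = {total / psu_watts:.0%} sustained load")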

No. Get an RX Vega 56 and overclock it. It performs just as well as an RX Vega 64, especially if you undervolt it and use a Vega 64 BIOS.

youtube.com/watch?v=0PftkOaKfik

>Nvidia seems to be supporting freesync with their most recent drivers.

Elaborate.
The only example I've seen is that weird Windows 10 DX12 thing that requires the latest Win10 build or the insider program.

I'm currently using Windows 10 1803 and 411.70 driver with a GTX 1080 and an LG 24UD58-B monitor. I have turned on extended freesync in my monitor settings, which previously caused the driver to crash, and as far as I can tell it is working. I have turned off v-sync in some games and have not seen any tearing.

vega 64 uses about 450 watts when oc
1080 maybe spikes to 300

really makes you think

Attached: 39313182_466413863872633_3701595831523082240_n.jpg (768x960, 62K)

Vega 64 is limited to 400 watts total board power draw, and GTX 1080 is limited to 230W. They are hard limited not to draw more than this. 1080 SLI at stock clocks can peak higher than Vega 64 could possibly ever draw, even overclocked.

tomshardware.com/reviews/radeon-rx-vega-64-water-cooling,5177-3.html

476 watt spikes. YIKES

It's easy to say something is 'bad hardware', but you have to consider the market it exists in. If you can find a good deal on a Vega 56 then I would certainly consider it; functionally most GPUs largely do the same thing, it's just a matter of how many transistors, GB and Hz you're getting for your money. You will know your use case, but for basic gaming I don't see many scenarios where Nvidia has a real feature benefit, if any. Freesync 2 is very good. And remember, if you like cool-running hardware, a small undervolt will make a huge difference for negligible performance impact.

Read the paragraph above the graph.

i was just memeing that vega64 consumes more power than 1080 sli, but that spike is real Y I K E S

It's obvious he isn't angling to be reasonable.
The Vega 64 does use a lot of power. So I wouldn't recommend it for low wattage power supplies.

See

wait for rx680

i thought that vega was like the 680 in that it was the next card after the rx series though

So does it suck for 1080p gaming or what? All the benchmarks I see are 1440p.

Attached: 1538363986884.jpg (740x713, 164K)

for the price and heat output, nvidia cards are way better

If you're looking at 1080p (and probably 60fps), then look at RX 580 or GTX 1060.

You will be fine up to 1440p. 4K is pushing too hard.

No one really recommends Vega64 for gaming here.

Vega56 undervolted and overclocked tends to be in the 180-230W range depending on your settings. That's comparable to a 1070Ti or 1080, for also comparable performance.
Vega64's 8 extra CUs still waste power while sitting idle, and the stock voltage is really bad for gaming. The voltage is fixed, set for the worst GPU in the bin running the most demanding applications. Nvidia has software voltage control.
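On Linux you can do the voltage control yourself through amdgpu's pp_od_clk_voltage interface. A minimal sketch, assuming the card is card0, you're root, and overdrive is enabled via amdgpu.ppfeaturemask; the state/clock/voltage numbers are placeholders, not tuned values:

# rewrite one sclk power state, then commit it
OD = "/sys/class/drm/card0/device/pp_od_clk_voltage"

with open(OD, "w") as f:
    f.write("s 7 1590 1050\n")   # sclk state 7: 1590 MHz at 1050 mV
with open(OD, "w") as f:
    f.write("c\n")               # commit the new table to the hardware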

For the price and features, AMD cards are way better.
Pascal cards can't even run proper 10bit HDR without shitting themselves. What a joke to bring up heat when a card doesn't even do what it's supposed to do.
Don't even fucking support DX12 properly which came out in TWO THOUSAND AND ELEVEN. 7 years ago.

You really have no defense or argument, so you make a big deal about a few extra watts, which isn't a big deal to anyone. Losing 20% FPS because you wanted a better picture and got a new HDR monitor is, on the other hand, a big deal.

but I already have a 144hz freesync

>Don't even fucking support DX12 properly which came out in TWO THOUSAND AND ELEVEN. 7 years ago.

DirectX 12 was announced by Microsoft at GDC on March 20, 2014, and was officially launched alongside Windows 10 on July 29, 2015. 3 years ago. Still not great though.

see

It's enormous overkill for 1080p.

1080p cards are 580 and 1060.

>freesync
Vega 56 will give you all the FPS you want with your 1080p freesync monitor.

>eternally waiting
never fails

An RX580 is not playing doom at 144fps at 1080p. Neither is it doing fallout, nor the majority of modern titles. My vega 64 with an overclock rarely sees over 120fps in ANY game except for doom with vulkan API.

time is money
if you don’t have money, you wait

It isn't that it favors vega, it's that it's optimized to work on both flavors of hardware. Most games are only optimized for nvidia hardware.

If all things were equal in this world, the vega 64 would consistently perform neck and neck with a 1080ti.

It's a good compute card, it's just overpriced

have fun waiting 8 months for 1070 type performance, which nvidia delivered OVER 2 years ago

It's not overpriced, it's just a compute card marketed to gamers. HBM2 isn't cheap, their profit margins are non-existent, and it's basically considered a failure because selling below MSRP actually loses money.

>g-sync and freesync are basically irrelevant if the card can sustain 60 fps
This is only true if you're willing to accept a serious bump in input latency. Using adaptive sync and limiting the game to 59fps is far preferable to turning on vsync. Various tests with high speed cameras are widely available to prove this.

(Personally I OCed my monitor to 63Hz so that I can use adaptive sync at exactly 60fps; very handy for emulation etc.)
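The cap itself is trivial; the point is just holding the game inside the adaptive sync window so vsync never engages. Toy sketch of the idea (running and render_frame are hypothetical stand-ins for your game loop):

import time

TARGET = 1 / 59                   # stay just under the 60Hz ceiling
next_frame = time.perf_counter()
while running:                    # hypothetical loop condition
    render_frame()                # hypothetical: draw one frame
    next_frame += TARGET
    time.sleep(max(0.0, next_frame - time.perf_counter()))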

so just dont use vsync

So 56 wouldn't be a good choice for a "gaming" GPU? Bummer.

Powercolor Vega 56 is the coolest running graphics card I've ever had. It doesn't even hit 65C usually

Don't listen to these faggots, it is a great card

Both

there aren't bad gpus
there are bad prices

Then u get tearing

Ignore the refences to the Vega prices in the video. It's old and outdated by now.

Vega 56 is $380 on newegg.

youtube.com/watch?v=4JiJud00IsE&t=685s

Attached: kiuh.png (1920x1080, 497K)

>references

I'm dumb

Reference Vega 56
64 BIOS
Clock set to 1600MHz
Memory at 1000MHz
Tweaked voltages and fan curve
350W max, after taking away 100W system idle usage (yeah, I have a lot of shit running).
Measured with a power meter at the wall whilst running Tomb Raider: Dagger of Xian (Unreal engine)

Bear in mind some of that idle is also GPU, so the real GPU draw will be somewhat above 350W. But not by much.
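Spelled out (450W at the wall is implied by 350 + 100):

wall_peak = 450   # W at the wall under load
wall_idle = 100   # W at the wall idle, whole system, includes GPU idle
print(wall_peak - wall_idle, "W attributed to the GPU")
# true GPU draw is a bit higher: subtracting all of idle
# also subtracts the GPU's own idle share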

Still. Vega is juicy, hot and loud. But fuck buying Nvidia with their shitty tactics. I'd feel like a washed up hooker touching their shit.

>Dagger Of Xian

Sorry kiddo, the RX Vega 64 uses at most ~350W of juice if you are overclocking and overvolting the fuck out of it under maximum load. The 1080 gets dangerously close to 300W under similar conditions.

If you aren't pants-on-head retarded and are undervolting, the Vega 64 might be close to 250W at maximum load. The 1080 goes south of 200W with some undervolting.

The Vega 64 being a "power hog" is a meme harped on by Nvidiots who don't realize that the 1080Ti eats about the same amount of juice, while the new Turing SKUs eat even more power when fully loaded.

Modern performance GPUs have always been power hogs ever since G8x-R5xx with a few exceptions here and there.

much better than a 1070 and slightly better than a 1070 ti
you should absolutely get it for ~$400 or less
if more, don't
all of this assumes you spend 30 minutes upon getting it tuning the voltage properly so you do, in fact, get a 150W card

> When overclocking and overvolting like a fucking idiot, which are the last things to do on a GPU platform

> implying that you don't get similar results when doing the same thing with any of GP102/GP100 and any of the new Turing SKUs.

580 is essentially an overclocked RX480, which means it's not exactly new
It's still a capable card for 1080p ultra or 1440p high

RX580 (70Hz/1080p) owner here. Would absolutely get Vega56 if they were cheaper. Miners are starting to unload their GTX's here but I'm holding out for cheap Vegas.

At first I thought I'd oc the rx580 but I just settled for -10% power target and factory oc (msi gaming). It's nice.

Meme. Remember when people were buying six-core Bulldozers because they didn't suck as bad as the 8-core, since their power requirements were considerably lower and they weren't killing motherboards?

You're buying the GPU version of that.

If it's a decent price, yes.

AMD drops driver support too soon.

I have 2x 6970 2GB that still get 70fps in BF1 at 1080p with a modern CPU, yet I haven't got new drivers for them since like 2015.

Nvidia supports drivers for like 10-15 years; AMD drops it after 5 or so.

That imo means if you buy top end, get Nvidia, because they will support drivers for it for a decade+. If you only plan on keeping the card for 3 years, AMD is fine. But honestly hardware has slowed down so much that getting Nvidia for everything is prob better.

Your 1060 or 1080 is prob going to be useful in some build you do in 2030, and having drivers for it then is nice.

What the fuck you weren't lying. Damn.

Nvidia throttles performance for old cards, but at least it's something (I actually mean this, not irony).