What's the deal with Vega 56? I was looking to upgrade my GPU to a 1070Ti...

What's the deal with Vega 56? I was looking to upgrade my GPU to a 1070Ti, but V56 is super interesting because FreeSync is much cheaper.

Is Vega 56 any good or is it literally hot garbage?

Attached: amd-vega-64-inhandmem1-4000x2670.jpg (4000x2670, 754K)

Depends on what you want to do, or accomplish with your pc.

56 is pretty much on par with a 1070 Ti in games; note that it can be both undervolted and overclocked for best results.
If you don't need CUDA then go for it.

Freesync isn't really good.
It's about equivalent to the 1070ti
Runs great with opensource drivers

It uses more electricity and outputs more heat (not temperature, heat) than a 1070 Ti. However, it supports FreeSync. If you don't live in the Middle East like me, and can find a Vega 56 under $400, then go for it. Always check reviews, because every GPU model has several board partners, and therefore some of them turn out shitty.

>freesync
>isn't really good
You're right, it's fucking amazing.

i like my vega 56

Attached: 1498889345173.jpg (671x665, 62K)

>Freesync isn't really good
It's literally G-Sync but cheaper. The main difference between the two is the graphics cards they support: FreeSync for AMD, G-Sync for NVIDIA.

dumb frogposter

I always get flickering, and it's a well known and common issue. So no, it's not amazing.
I'm well aware, bud.

What model V56 do you have?

why do you think you need freesync lmao this garbage card can't even run old school runescape above 50 fucking fps

>It's literally G-sync but cheaper. The main difference between these two are the graphics cards that they support:
The main difference is the quality control. G-Sync and FreeSync are different implementations of adaptive sync, and they perform differently.

is it true AMD GPU's have more compatibility issues since devs don't care about them?

Is it true Nvidia installs backdoor botnets in order to download their drivers?

Source?
Never heard any rumours like this.

When will Vega's price drop? Or when do we get Navi???
I want to replace my RX 580; it's not enough for my gayming needs at 1440p

AMD gpus are fine if you use linux. Just don't use their proprietary drivers.
CUDA and NVENC seem to be better integrated in software, though.

So if I use W10 I should probably avoid Vega?

inb4 botnet etc etc

>So if I use W10 I should probably avoid Vega?
No. Avoid Vega if you like to use Nvidia proprietary stuff like CUDA or NVENC, obviously, but if you couldn't give a shit about any of those then have at it.

there are a couple of stories where last-minute changes gimped AMD performance in games. take those as you will.
the devs are not really to blame either. the tools nvidia makes are pretty neat to work with, but are known to fuck with the competition (see tessellation or hairworks)

I don't really care much for games but man, if only opencl was not such a cluster fuck.
haven't looked much into ROCm yet, but I hope it's a proper alternative to CUDA.

Power guzzling, coil whining mess at stock. But I undervolted mine to 1,000mV and it's a solid card now. Consumes 180W under max load.

Update your driver and monitor firmware. That hasn't been a common issue for about a year.

Navi won't come to consumer market, enjoy your stay.
Or upgrade to RX600 series (Polaris 30).

>Navi won't come to consumer market, enjoy your stay.
Yes it will. Vega 7nm won't.

>It's literally G-sync but cheaper
it's not.
It's G-sync but smarter.
G-Sync uses a proprietary Nvidia scaler that each monitor manufacturer has to buy from Nvidia, raising the end price.
Rather than implementing it in the GPU and using standard monitor/cable support, Nvidia insisted on this hack of a solution to make more money.

FreeSync just uses the Adaptive-Sync extension of the DisplayPort 1.2a standard, which their video cards support. Monitor manufacturers simply need to use variable-refresh panels; they don't need to buy shit from AMD to support it.

life is good, ubuntu, proton, lutris, rx vega 56, bye windows

Attached: Sapphire_201722174559.jpg (800x800, 67K)

If you can find it cheap, yes. But if you're paying RRP, you're a fucking mug. And this coming from an AMD fanboi+++.

Sadly, yes. Novideo has ~80% of the market: whose toolchain are they going to use, and who are they going to optimize for?

>told u i was an amdrone

>Freesync isn't really good.
Best a-sync method available right now. Also cheaper than G-Sync (while doing a better job).

>Use linux for 90% of my time
>Want to format all my drives into one large tiered storage
>Almost ready to forget about windows
>Proton and Lutris still cant run Divinity Original Sin 2
Reeeeeee
When will they make the literal best game on Steam run with Proton?

What about Vega 64? Can it be undervolted and overclocked like 56?
To which Nvidia card it compares?

Just make sure you don't get reference. Sapphire has a tasty Vega card.

best game on steam already runs on linux.

>What about Vega 64? Can it be undervolted
Yes.
>and overclocked like 56?
Not as much.
>To which Nvidia card it compares?
It sits between 1080 and 1080ti - making it not particularly good value.

You can flash a Vega 56 to 64 firmware and get similar results anyhow.

I've heard of the occasional bricked 56 trying to do that, but it may be user error - and of course, digging up details is hard.

>similar results
not in opencl, I assume?
I'm interested to try meme learning with opencl
Not sure what to get (56 or 64)
I can get a MSI 56 for 520€, but 64 price starts from +700€

Probably gonna wait for new AMD gpus

Not heard of it. It was so easy, too. I made sure I got the correct firmware for my reference 56. Took literally 30 seconds and a reboot. Plus, on the ref there's a dual BIOS, so if you fuck up you can boot on the second BIOS, flip the switch back, and reflash the original BIOS.
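
For reference, the tool usually used for this is ATIFlash (amdvbflash in newer versions). A dry-run sketch of the save-then-program sequence; the flag names, adapter index, and ROM filenames here are assumptions from common usage, so verify against your version's help output before running anything:

```python
# Sketch of the usual ATIFlash/amdvbflash sequence for a reference
# Vega 56 -> 64 flash. Flags and adapter index are assumed from common
# usage, not guaranteed for every tool version.
import subprocess  # only needed if you uncomment the run line below

ADAPTER = "0"                      # first GPU as listed by the tool
BACKUP  = "vega56-backup.rom"      # always keep the original BIOS
NEW_ROM = "vega64-reference.rom"   # must match your board (ref. PCB, HBM vendor)

def flash_steps(adapter=ADAPTER, backup=BACKUP, new_rom=NEW_ROM):
    """Return the commands in order: -s saves the current BIOS,
    -p programs the new one, -f forces flashing a BIOS whose
    device ID differs (which a 56 -> 64 flash does)."""
    return [
        ["amdvbflash", "-s", adapter, backup],
        ["amdvbflash", "-f", "-p", adapter, new_rom],
    ]

if __name__ == "__main__":
    for cmd in flash_steps():
        print(" ".join(cmd))             # dry run: print, don't execute
        # subprocess.run(cmd, check=True)  # uncomment to actually flash
```

Having the dual-BIOS switch as a fallback is what makes this relatively safe on reference cards; on single-BIOS boards a bad flash means external reflashing hardware.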

Have a look at:

github.com/ValveSoftware/Proton/issues/413

and

spcr.netlify.com/app/435150

Attached: linuxflag.jpg (1000x850, 86K)

take note it consumes 2x the power of a 1080ti

My mom pays bills anyways

V64 can be overclocked way further than V56. I'm sitting at 1700Mhz core and 1050 HBM on my V64 stable, getting awesome frames in all games.

>Freesync isn't really good.
why?

This is Nvidia marketer FUD. It only consumes much more if you crank the power target to +50%; at +15% you get practically the same performance, but power usage never exceeds 255W.

Underage banned

I'm 31

>still posting "vega 6 gorillion shoah watts" meme in [current year]
yikes!

Do I need 1 kilowatt PSU to run Vega 56?

No, that's a xeon you're thinking about.

Get a fucking job you lazy assed faggot

>If you don't need CUDA
why the fuck would anyone ever need cuda?

because it's got better performance in many things.

If you set voltages and clock speeds yourself, a Vega56 uses around 180W and is on average on par with a 1070TI.
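
On Linux with the open amdgpu driver, setting those voltages and clocks yourself goes through the `pp_od_clk_voltage` sysfs file (OverDrive has to be enabled via `amdgpu.ppfeaturemask` on the kernel command line). A minimal sketch, assuming the Vega is card0; the p-state index, clock, and voltage below are illustrative, not tested values:

```python
# Sketch: set a Vega core p-state via amdgpu's OverDrive sysfs
# interface. Assumes card0 is the Vega and OverDrive is enabled
# (e.g. amdgpu.ppfeaturemask=0xffffffff on the kernel command line).
OD_PATH = "/sys/class/drm/card0/device/pp_od_clk_voltage"

def od_commands(pstate=7, clock_mhz=1630, millivolts=1000):
    """Build the OverDrive command strings for one core p-state.

    Format is "s <pstate> <MHz> <mV>" followed by "c" to commit.
    The default numbers are illustrative, not known-stable values.
    """
    return [f"s {pstate} {clock_mhz} {millivolts}", "c"]

def apply(commands, path=OD_PATH):
    # Each command is a separate write; needs root.
    with open(path, "w") as f:
        for cmd in commands:
            f.write(cmd + "\n")
            f.flush()

if __name__ == "__main__":
    print(od_commands())  # inspect before writing anything
```

Bad values can hang the card, so print and sanity-check the commands (and stability-test each step) before pointing `apply()` at the real sysfs path.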

breaking someones tripcode

>he doesn't utilize gpu to train his nn

Attached: 15183787888070.jpg (640x1136, 122K)

>AMD GPU
>Is it literally hot garbage?
Gee I wonder. [spoiler]It's hot garbage[/spoiler]

>Navi won't come to consumer market, enjoy your stay.

Navi is specifically designed for the new Playstation, so yeah, it'll be coming to the consumer market.

To automatically clean, and upscale photos of your waifu.

>he doesn't use an opencl kernel to do gradient descent
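
For what it's worth, the update such a kernel parallelizes is the same elementwise step no matter the API; a minimal pure-Python sketch on a toy quadratic loss (the loss and learning rate are illustrative):

```python
# Minimal gradient descent on a toy loss f(w) = sum((w_i - t_i)^2).
# The per-weight update below is exactly what an OpenCL/CUDA kernel
# would parallelize, one work-item per weight.

def grad(weights, target):
    # d/dw (w - t)^2 = 2 * (w - t)
    return [2 * (w - t) for w, t in zip(weights, target)]

def step(weights, grads, lr=0.1):
    return [w - lr * g for w, g in zip(weights, grads)]

target = [1.0, -2.0, 0.5]
w = [0.0, 0.0, 0.0]
for _ in range(100):
    w = step(w, grad(w, target))

print(w)  # approaches target as the loop runs
```

The GPU win comes purely from doing that `w - lr * g` line for millions of weights at once.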

Ohhh, didn't know about this. Still, I feel like cuda is more widespread.

>Still, I feel like cuda is more widespread.
It is. NVIDIA makes it very easy to work with cuda.

Is the whole space heater meme true? Does it actually make your entire system hotter?

Back to /v/ with you, vermin.

so does the 2080ti

Vega 56 pricing starts below $400 in the US. You should probably recheck your local prices.

Difference in shader performance between a V64 and V56 with matched memory/core clocks is ~5%. Not sure on compute performance, but it should be something like that as well. The issue you need to concern yourself with is that the V56 has slower memory by default, so a lot of AIBs use Micron HBM2 as opposed to Samsung HBM2, since it doesn't need to clock as high to meet spec. (Micron HBM2 clocks like shit.)

I rock a ref. Vega 56 flashed to 64 in a custom loop, in combination with a high pixel density AGON AG241QX 24'' 1440p 144Hz monitor, and I have a great user experience. FreeSync doesn't cause any issues for me, but with older panels, flickering at the lower end of the sync range was common. This is solely monitor-dependent, though.

If you want a good Vega 56 without having to coolerswap or repaste it afterwards, get the Sapphire Nitro+/Pulse or Powercolor Red Devil/Red Dragon. If you're lucky and get one with Samsung HBM you might be able to flash it to v64 and get 1Ghz+ on the HBM, pushing the card to GTX 1080 performance.

It's Hynix HBM, not Micron. Some Hynix also scales with additional memory voltage from a BIOS flash, but not as well as Samsung.

Mom says I can be unemployed as long as I want.
Don't be jelly.

>below $400 in US
I'm checking my yuropoor local shops every week, and cheapest goes for +600€

>250-300w gpu draws half the power of a 200-250w gpu.
Even with heavy overclocking in mind, user, the V56 wouldn't be able to go much past 50% more power draw than a 1080 Ti (without shunt mods on the 1080 Ti)
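
Quick arithmetic on that claim, using typical rated board powers (assumed here: ~210 W for a stock V56, ~250 W for a 1080 Ti; actual draw varies by board and load):

```python
# Sanity check on the "2x the power of a 1080 Ti" claim, using
# assumed typical board power ratings rather than measured draw.
V56_TBP = 210        # W, stock Vega 56
GTX1080TI_TBP = 250  # W, GTX 1080 Ti

# Even a heavy +50% power-limit overclock on the V56:
v56_oc = V56_TBP * 1.5
ratio = v56_oc / GTX1080TI_TBP
print(ratio)  # well under 2x
```

So even the worst-case overclocked V56 lands around 1.3x a stock 1080 Ti, not 2x.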

HAHAHAHAHAHAHA, no. OpenCl has always been better.

On Newegg there are ones as low as €350. Some Vega 64s are even cheaper than Vega 56s.

talk properly. from a technical perspective yes but the hardware behind it though

Ah, my bad, couldn't remember if it was Hynix or Micron, just took a guess. Anyway, unless AMD changed that, you can't custom-BIOS-flash Vega anyway.

>linux

Attached: 1518518604791.jpg (125x121, 2K)

>he doesn't undervolt his cards
Oh wait, novidia goys can't into undervolts

You can still flash a signed BIOS though, which is actually sufficient, as further tweaking can be done with registry PowerPlay tables.

Install Gentoo.

Vega 56 + Ryzen 1600 yay or nay?

Freesync (really just AMD's brand of VESA's VRR spec) and G-Sync provide the same experience if the monitors are equal. (Seen it first-hand)

The difference is that the G-Sync spec requires monitors to support ULMB, a 60-144Hz range, and variable overdrive, while these are entirely optional in the VESA VRR spec.

The middleware chip in G-Sync is an artifact from before the DisplayPort 1.2a spec was finalized, so it could be used in pre-DisplayPort-1.2a silicon, a.k.a. Kepler. It is now practically obsolete, kept around only to maintain a "walled garden" ecosystem for Nvidia. Unfortunately for Nvidia, it is just self-defeating (monitor vendors hate it, which is why G-Sync monitor SKUs are outsold by FreeSync SKUs by a good margin and are less abundant).

No, got Ryzen 2600 instead.

>tfw Samsung HBM2 Vega 64 overclocked and undervolted to 1700Mhz & 1050Mhz with 144hz Freesync monitor and using Enhanced Sync

THANK YOU BASED AMD

Attached: 1377165527333.png (1000x1110, 226K)

>implying just having a job is good enough

But I already have a 1600. I don't want to swap until Zen 2 releases.

get a Vega 64
56 is bandwidth bottlenecked

Poojeets have clocked in. RIP Jow Forums for today.

yes

>upgrade my GPU to a 1070Ti, but V56 is super interesting
I'll make it easy for you: just buy whichever one happens to be cheaper right now in your area. On Newegg there are deals on the Vega 56, but in my country it's priced like an Nvidia 1080 Ti, which makes it a non-choice.

>AMD gpus are fine if you use linux
not sure I'd agree with that. You'll run into issues from time to time, and sometimes new kernels break the card, and then agd5f and mrcooper from AMD will tell you Do You Even Bisect, and then you have to compile a bunch of kernels and bisect and report the error like you work for AMD or something.

there's many examples of nvidia stepping in to "help" developers "optimize" their game and suddenly hairworks is doing a ton of stuff when there's nothing with hair rendered in the frame. make up your own mind whether this is intentional or not

sorry m8 i was sleeping. it's a powercolor red devil 56.

Attached: 1526974831857.jpg (880x870, 101K)

AMD pricing is very interesting, for example in my country (Spain) Nvidia is always massively overpriced, both compared to AMD and compared to the other countries in the EU. When the Vega cards just came out I found V56 and V64 for 450 and 550€ at local retailers, while a blower 1080 would cost 650 and a good one 700€, and 1080Tis went for between 800 and 1000€. Now the 2080Ti is priced at 1300 to 1600€ here, while Vega is still at around MSRP, I just saw a flash sale this week for a non reference Vega 56 for 420€.
Meanwhile it seems like in the US AMD is always in shortage and much more expensive than Nvidia.

how much so?

Can get a used Vega 64 Asus STRIX OC for ~$560 (europe). Or wait for Vega 2 with 32GB HBM2?

>look global shipping at Newegg
>Says they ship everywhere
>a list of Europe countries
>lists inferior neighbor countries
>my country isn't on the list
oh well, gotta wait till prices drops even more or maybe I'll just get overpriced 56 or 64...

Which one should I look to? Asus? Sapphire? MSI? Tul power?

Nice fren that's actually the one I was looking at getting

Attached: 1537830506471.jpg (1346x960, 99K)

Just don't get a fucking blower version.
If you stress the card out it will sound like a fucking jet taking off in your house.
Seriously, don't, unless it's as cheap as an RX 580/480

vega 56 is comparable to a 1070ti

I remember the Vega Frontier Edition was meh because of beta tier drivers but has that improved?

Yeah. Drivers aren't BETA anymore.

Figured. Just sucks, because I have yet to find a benchmark using those new drivers. I guess I should just see it as a Vega 64 with 16 gigs.

So I could
>get a Vega 56 on sale, which could be cheaper than a 1070
>undervolt it (assuming it’s easy) to make it outperform a 1070 Ti without consuming more power
>boost performance further with Vega 64 bios
Is that all correct? Sounds too good to be true, so what’s the catch? Would I need additional coolers? Or are the stock ones on the cheaper models fine?

get a higher end model for a beefier cooler