1 year after its launch, what is Jow Forums's final verdict on the Vega cards?

Attached: Sapphire Nitro+ Vega 64 Limited Edition.png (900x637, 412K)

Pretty good if you get one at a reasonable price.
Slightly less power efficient (than Pascal) at high clocks, but excellent with a slight undervolt.

my computer is from 2006 and features integrated graphics
I know nothing of this technology

I bought one to put in my laptop, but it was too big, and there wasn't even a port for it to go in.
I even checked behind the screen!

My opinion is based on the Linus Tech Tips, Bitwit and Paul's Hardware YouTube channels, and a few posts from 4channers who say that undervolting Vega is good.
I have no opinion of my own regarding this question.

>2006
>integrated graphics

can you even run solitaire?

just buy an MSI B350 Tomahawk with a 2400G

AYYMD HOUSEFIRES

Attached: power_maximum.png (500x1050, 56K)

>higher is better

AMD is absolutely shit at OpenGL (which is important for emulators, among other things). They have invested everything in Vulkan because they don't have the resources. Vulkan offloads a large part of the driver work onto the user (the engine programmer).

So it's a good thing.

is the Fury X still good?

A 1080 Ti at 260W? That's fucking bullshit.
A 1080 Ti easily goes up to 300W+.

Yes, as long as you don't play games that choke on the 4GB of VRAM.

No, because not everyone can write a Vulkan backend in a day. AMD are incompetent at making software, so they willy-nilly abandoned OpenGL development.
The Mesa driver on Linux is 40% faster. I'm not even joking. A driver coded by 10-20 nerds in their limited free time is almost 50% faster than AMD's own driver.

>10-20 nerds in their limited free time
Do you have any idea how much code Intel has poured into the Mesa project?

They are not AMD in any case. The general argument that AMD are incompetent at software development stands.

>The general argument that AMD are incompetent at software development stands.
Your evidence?

As I said, their OpenGL driver is shit on Windows. Almost 40% slower than Mesa.
They keep spamming open source projects on GitHub that are half-finished.

Your evidence?

>spamming open source projects on GitHub that are half-finished.
That's because they don't expect their projects to actually take flight. They post example code which can then be merged into existing projects.

Run it yourself if you are a doubting Thomas retard. Or at least Google the countless reports.
I'm not going to write a thesis for you on a Japanese fursuit board.

>Run it yourself
Run what?

They give the impression that nobody gave them funding and they rage-quit. Some of the projects also look gigantic, with no future unless at least 10 people go almost full-time on them (which is a very tall order in open source unless you are the Linux kernel).

that sounds like a terrible idea with those VRMs

Test OpenGL on Windows and then try the same on Mesa. You'll see an almost 40% FPS difference (if the CPU isn't the bottleneck).

>Mesa
ohhhh, you mean AMDGPU-PRO?

To be more precise and to add to that: it's not only the CPU that can't be the bottleneck, the GPU hardware itself also can't be bottlenecking.
The main issue is that the driver on Windows has a lot more overhead on the CPU side, which shows up most when the renderer is a CPU hog (very common in emulators, for example, but not only there).
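If you want to see that overhead yourself, here's a rough sketch of the kind of test I mean (my own example, assuming Python with pygame and PyOpenGL installed; nothing AMD ships). It spams thousands of tiny legacy draw calls per frame, which is exactly the pattern where the driver's CPU overhead dominates, and prints the average FPS. Run it on the Windows driver and then on Mesa and compare the numbers.

# draw_call_spam.py - crude driver-overhead probe (assumed deps: pygame, PyOpenGL)
import time
import pygame
from OpenGL.GL import *

pygame.init()
pygame.display.set_mode((640, 480), pygame.OPENGL | pygame.DOUBLEBUF)

DRAW_CALLS_PER_FRAME = 5000  # lots of tiny draw calls keeps the driver, not the GPU, busy
FRAMES = 300

start = time.perf_counter()
for frame in range(FRAMES):
    pygame.event.pump()              # keep the window responsive
    glClear(GL_COLOR_BUFFER_BIT)
    for _ in range(DRAW_CALLS_PER_FRAME):
        glBegin(GL_TRIANGLES)        # legacy immediate mode, like old emulator code paths
        glVertex2f(-0.01, -0.01)
        glVertex2f(0.01, -0.01)
        glVertex2f(0.0, 0.01)
        glEnd()
    glFinish()                       # force the driver to finish this frame's work
    pygame.display.flip()
elapsed = time.perf_counter() - start
print("avg FPS:", FRAMES / elapsed)
pygame.quit()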

No, that's another Linux driver. I'm not very experienced with it, but I get the impression it's slower than Mesa too.

Like , I'm calling big steaming bullshit here.

A couple of lines, yeah. But the majority are basement dwellers.

When will prices drop?

On Saturday.

proofs?

My asshole

you are like a little baby
watch this

Attached: my negro 295x2 burning bright.png (650x425, 37K)

[Citation needed]

why?

>AMD is absolutely shit at OpenGL
If they fixed that, I would go back to AMD and sell my memory-starved GTX 970.

Fuck G-Sync and fuck Nvidia's shitty practices and lies. A 580 would serve me well for 1080p, or a Vega 56 for 1440p.

Pretty fast but a slight housefire hazard. If you find one for cheap I'd say go for it.

OpenGL works fine for current stuff. The problem is that the old/legacy code paths which some emulators are still using are being deprecated. Nvidia does the same shit as well (namely on the DirectX side).

It's part of the nature of aging software/hardware platforms. Don't expect 20-30 year old code to run flawlessly on modern hardware, especially if the code relied on discontinued hardware features.

>tfw I had 2x GTX480, 2x HD7970, and now 2x R9 390X
I like being warm

California is lucky this card was never released.

Attached: pqTl6[1].jpg (800x374, 117K)

>I think therefore I am.
Is this good enough?

Biggest failure since 2900 XT.

>GF100
>Dual Big-Fermi
It would've been Chernobyl all over again.

Not thaaat terrible, but it was way too overhyped.

There are AMD devs contributing to Mesa, retard.

bulldozer 2.0

Attached: 984984984984984.jpg (600x570, 16K)

I love my vega-chan and adrenalin-chan

Attached: 15226258.gif (600x338, 2.33M)

This. Performance isn't terrible, at least, considering all the rumors about it being broken.

Too little, too late, just like Fury.

Attached: 1523115543759.jpg (274x228, 19K)

>295x2
>675W
>4x 8-pin connectors
pic related

Attached: devil 13 r9 295x2.png (1000x390, 492K)

wait, you can use custom skins now?
pls teach

Attached: giphy.gif (627x502, 941K)

There was another prototype 480 that nearly used that much with only a single GPU.

Attached: gtx480_512sp_furmark_power_consumption[1].jpg (400x285, 15K)

Put the images you want into C:\Users\username\AppData\Local\AMD\CN\NewsFeed\number\NewsFeedImages and rename them.
Make sure Banner Advertisements is turned on.
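If you don't want to do it all by hand, here's a rough helper sketch (my own script, not anything AMD provides; it assumes Python 3 on Windows and that the folder layout matches the path above, where the numbered subfolder differs per install):

# copy_newsfeed_skins.py - copies your images into Adrenalin's news feed folder(s)
import glob
import os
import shutil
import sys

if len(sys.argv) != 2:
    sys.exit("usage: copy_newsfeed_skins.py <folder with your images>")
src_dir = sys.argv[1]

# The numbered subfolder under NewsFeed differs per install, so glob for it.
pattern = os.path.expandvars(r"%LOCALAPPDATA%\AMD\CN\NewsFeed\*\NewsFeedImages")
targets = glob.glob(pattern)
if not targets:
    sys.exit("No NewsFeedImages folder found - is Adrenalin installed?")

images = glob.glob(os.path.join(src_dir, "*.png")) + glob.glob(os.path.join(src_dir, "*.jpg"))
for target in targets:
    for img in images:
        shutil.copy(img, target)  # you still have to rename them by hand, as described above
        print("copied", img, "->", target)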

based trips poster

>fourth port is empty

Overpriced
Very inefficient
Shit
Can't even keep up with the cards it was competing with, even though it had the advantage of being a much newer card

>Overpriced
False. MSRP puts them at a very competitive price; if you can find one at MSRP, they're great value.
>Very Inefficient
Calling this a gross overstatement would be putting it mildly; from personal experience, messing with power targets can bring the wattage down by more than 100W.
>Shit
In what way? The tech it has, like HBCC, Chill, WattMan, ReLive and FreeSync, is actually great.
>Much newer card
Barely a year newer and it does keep up in many games, while it outperforms Nvidia in compute.

I rate this post Nvidiot/10.

AMDrone detected

Substantiate your claims, Nvidiot.

It's not a port, it's an exhaust

T H I C C
In all seriousness though, I love this beast of a card.
Over 5000 cores, massive power consumption, only available with liquid cooling for obvious reasons.
AND the first ever card that could do 4K properly by itself.

>MSRP
You know the R stands for recommended? It's useless mere hours after every single launch.
>efficiency
If it has that amazing potential, why isn't it tuned like that by default?
>barely a year newer
>barely
It's like half a release cycle. If you put a 5th grader up against a 10th grader, no one expects anything close to similar results.

It was shit on release and it still is, due to AMD's idiotic design. Muh tweaking is invalid, as the mass market doesn't tweak their shit.

If you can find a card at its MSRP, it has god-tier half-precision compute compared to Nvidia's cards in that price range, but if you don't undervolt it you're gonna cause a fire.
Good hobbyist card for compute, and you can do some gayming on it if you want to.

>R for recommended
Cringe