Why were the Vega 56 and 64 such schizophrenic memes?

>destroys the 1070 in some cases
>matches the 1070 ti in a few cases
>falls flat and gets btfo by a 1070 in some cases

Attached: [Commie] Yuyushiki - 11 [BD 720p AAC] [D17A1441].mkv_snapshot_10.18_[2018.11.21_11.54.00].jpg (1280x720, 130K)

Drivers and shitty gayme devs. On paper the FP32 math perf alone should have left novidia mortally wounded (rough paper math below). But all in all the 56 managed to match the 1070 and the 64 to match the 1080. The biggest flaw in Vega, though, has been loose binning to let more dies through, which is why it draws so much power and why undervolting works on like 90% of them.

Attached: rx-vega-64-vega-56-gtx-1080-gtx-1080-3.jpg (840x500, 71K)
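
To put numbers on the "on paper" bit, here's the back-of-envelope FP32 math at stock boost clocks (ballpark figures, not benchmarks):
Vega64: 4096 shaders x 2 ops/clock x ~1.55GHz ≈ 12.7 TFLOPS
GTX 1080: 2560 shaders x 2 ops/clock x ~1.73GHz ≈ 8.9 TFLOPS
That's ~40% more raw FP32 on paper, yet they trade blows in games, which tells you how much of Vega sits idle under real workloads.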

pic related is one example where as much FP32 as possible was squeezed out for the games at the top of the chart

Attached: Slide1.png (1282x1768, 91K)

Drivers on launch for Vega sucked. They improved something like 10-13% since launch on average.
Also apparently they might be getting a geometry pipeline update that improves another 0-5% in some games. It's not NGG, but it's something.

Vega didn't even have HBCC on launch, or a bunch of other shit it was supposed to have.
Vega64 pretty reliably beats the 1080, and crushes it in HDR, nowadays. Although getting the most out of it still requires tweaking, and the default BIOS settings are retarded. So it's the card that enthusiasts like (and wish they could love), and normies often hate.

I get 90-130fps out of my Vega56 in R6 Siege at 2560x1600, so it'd probably be 100-144 at 1440p. That seems pretty fucking good for a card I paid $285 for new.
I didn't even plan to upgrade until Navi, and to go 4K HDR high refresh rate around then. But the deal was too good to pass up. Vega56 for the cost of a 1060.

Were you able to undervolt? I hear a lot of lads got like ~1,100mV at stock base.

*stock turbo

It was supposed to make up for being shit by doing fp16 shaders twice as fast.
But only 1 or 2 games ended up supporting it.

>2560x1600
Does this still exist?

That was never really intended for games. Vega really shines when used for OpenCL stuff, though CUDA has more support.

Attached: 90098.png (650x337, 25K)

From distant memory, only Prey and FC5 were supposed to have it.
Never checked whether they actually ran better on AMD.
It was supposed to fuck NVidia over, because they'd be stuck with the same calculation rate as fp32.
Now, don't quote me on that, but I think Turing does fp16 twice as fast as well.
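
For anyone wondering what "twice as fast" actually means mechanically: it's packed math, two fp16 values sitting in one 32-bit register with each instruction operating on both. A minimal sketch of the idea in CUDA, since that's the easiest place to show it (illustrative only: the actual games would use HLSL/GLSL fp16 paths, and this assumes a recent toolkit and a GPU with native fp16 support):

// build: nvcc -arch=sm_60 rpm_sketch.cu (filename made up)
#include <cuda_fp16.h>
#include <cstdio>

// One packed op per thread: __hfma2 does two fp16 fused multiply-adds
// in a single instruction, which is where the "double rate" over fp32 comes from.
__global__ void fma_half2(const __half2* a, const __half2* b, __half2* out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) out[i] = __hfma2(a[i], b[i], out[i]);
}

int main() {
    const int n = 1024;  // 1024 half2 elements = 2048 fp16 values
    __half2 *a, *b, *out;
    cudaMallocManaged(&a, n * sizeof(__half2));
    cudaMallocManaged(&b, n * sizeof(__half2));
    cudaMallocManaged(&out, n * sizeof(__half2));
    for (int i = 0; i < n; i++) {
        a[i]   = __half2(__half(1.5f), __half(1.5f));
        b[i]   = __half2(__half(2.0f), __half(2.0f));
        out[i] = __half2(__half(0.0f), __half(0.0f));
    }
    fma_half2<<<(n + 255) / 256, 256>>>(a, b, out, n);
    cudaDeviceSynchronize();
    printf("out[0] = %f (expect 3.0)\n", (float)out[0].x);  // 1.5 * 2.0 + 0.0
    cudaFree(a); cudaFree(b); cudaFree(out);
    return 0;
}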

AMD sells hardware, and they prefer that everyone else handles the software side of things.

I got it down to 960mV at 1631MHz max, though more typically it's boosting in the 1530-1580 range at that.
I got Hynix memory but I got it to 935MHz fine.

I just play my stuff at 60fps locked, though. So it downclocks more, and I'm typically using 25-125W in my games. In a lot of games, like for Two Point Hospital at 2560x1600@60, the fans don't even come on at all because it's only at 30W. Obviously that's a low requirement game, but it's still nice looking and 3D and at a fairly high resolution to only use 30W, so that impressed me.

My benchmark scores at these settings are actually a lot worse than with some other settings I tried, but power efficiency at 2560x1600@60 is much higher, which made the tweaking really confusing.

I'm actually on a 1920x1200 monitor using DSR.
Too many games with deferred renderers look shitty at anything under 1440p nowadays.
2560x1600 with AA off on a deferred-rendered game looks better than 1920x1200 with AA, AND it usually runs better.
Acer has some nice looking new 1920x1200 75hz monitors if you want to join in on my meme. Acer B247W or something. There's a 21.5" that's much cheaper.

While that's true, Vega still holds up well against the 1070Ti and 1080. They're notably much, much better for HDR output, and I had planned to get an HDR monitor.
If you plan to get an HDR monitor, Pascal is a really stupid choice given both the poor performance and the ridiculous cost of monitors.

Technically, shader compiler optimizations should be able to make some use of FP16.
Part of the problem is OpenGL/DirectX themselves: shaders default to fp32 for everything, even where the precision isn't needed, so using FP16 means explicitly opting in through intrinsic functions.
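
Same story outside the graphics APIs, for what it's worth: no compiler silently demotes fp32 to fp16 for you, the source has to opt in explicitly. A contrived CUDA-flavored sketch of the contrast (again just illustrative, host code as in the earlier snippet; the real shaders would use HLSL/GLSL intrinsics):

#include <cuda_fp16.h>

// What you get by default: every literal and variable is fp32.
__global__ void scale_fp32(float* v, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) v[i] = v[i] * 0.5f + 0.25f;  // one fp32 FMA per element
}

// The opt-in version: the half-precision type and intrinsics are spelled out
// explicitly. Nothing turns the kernel above into this automatically.
__global__ void scale_fp16(__half2* v, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        const __half2 scale = __float2half2_rn(0.5f);   // broadcast into both halves
        const __half2 bias  = __float2half2_rn(0.25f);
        v[i] = __hfma2(v[i], scale, bias);              // two fp16 FMAs per instruction
    }
}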

The terrible blower cooler killed it.
And the lack of third party options for a fucking year.
But we all now know GPP was at play.
I've seen it drop below 1300MHz in real use.

>960mV with 1,500MHz+ sustained
god damn

Attached: Gotta_mow_fast.webm (259x186, 295K)

>1920x1200 75hz monitors
Too late, I'm on the 4K wagon, and enjoying the ride. Let's be clear, DSR is nice, but it's nothing like native resolution.

Depends on the title and what part of the hardware it hits hardest.
Vega isn't particularly good at various AA implementations, and it has the front end issue of only processing 4 tris per clock. If neither of those bottlenecks is in play, then the Vega64 has no problem competing with the GTX 1080, sometimes getting near the 1080Ti.
This has actually been a trend for years. The 7970GHz could come near the original Titan in some games. GCN has always had its nuances, and Nvidia worked around them.
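
For scale on that front-end point: 4 tris/clock at ~1.5GHz is only ~6 Gtris/s of peak setup rate, so geometry-heavy scenes (lots of tiny triangles, shadow map passes) can hit that wall long before the shader array runs out of FLOPS.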

I don't think I can actually get it completely sustained at 1500MHz+. Maybe at +50% power limit and maxed fans...
But like I said, I'm more into the quiet. You can tune Vega to be very efficient. My goal was just to have it outperform the 1070 at generally less power consumption, and it does that, as that's all I need for 2560x1600@60.
I see it jump to close to 1600MHz occasionally, but if I let it run flat out without a framerate cap it either hits a thermal or power limit under the settings I tried.
I know there is another user with the same Red Dragon with Samsung HBM who apparently has his pinned at 1590/945 flat, although at higher voltage.

>DSR is nice, but it's nothing like native resolution.
I didn't mean to imply that.
It's supersampling. Still limited to native pixel count.
However, every game looks better except a small handful which don't properly scale their UI.
Font rendering on everything is absolutely gorgeous, even on things that normally have shitty font rendering.

The 7970GHz is actually within 10% of the 780Ti (which was better than the Titan) in most games from the past 2 years.
I think I've even seen the 1050Ti above the 780Ti in some games, while the 7970 pretty consistently matches/beats the 1050Ti albeit at way higher power consumption.
>There are people who bought a 680 instead of a 7970

I had a golden sample of a 7970 myself, the Lightning. It did 1320MHz stable for a while.

Still an amazing lottery chip win, m8; most will probably only get like 100-200mV less with undervolting. Congrats, hope I get as lucky as you.

I miss AMD, really.
I got a Vega64, but it was worse than my crossfire RX480s.
Now I have a 1080Ti and a FreeSync 4K monitor, and I'm waiting for options from team red. Or for NVidia to adopt adaptive sync already.
The feature I really liked that doesn't get mentioned often is Chill. Must be voodoo magic, because I didn't notice any adverse effect at all; all it did was make the card quieter.

We're all crossing our fingers for 7nm Navi, but I get a sinking feeling 2H 2019 will only get us slightly better rx 590 performance for less power.

Pajeet drivers, and the Vega team got pulled off to Navi for PS5 tech.

>slightly better rx 590 performance for less power
I wouldn't expect quite so little, but I also wouldn't expect AMD to top a 2080 Ti, so NVIDIA will likely (and sadly) still remain the only option at the top.

The new 7nm Vega given its clock boost should be just below the 1080ti, and it should get there under 200w.
Navi shouldn't have any problem competing with Nvidia's newest in raster rendering.

Yeah, I think for 1630MHz people are usually in the 1030-1070mV range.
It also seems like yields might be better nowadays.
Originally people rarely got over 925MHz on Hynix HBM, and were usually closer to 900. I can actually get 945 if I keep voltage on auto, for some reason.

But anyway, as far as the core clock, it really doesn't matter much since it's not like you get 1% higher fps with 1% higher core clock. What matters is balancing the TDP and thermal limits so it actually stays pinned at high memory and core clocks when you're looking for an actual good benchmark score.

Vega64 should easily outdo crossfire RX480s on average.
It's like +75% higher performance, right? CF rarely scales much above +70% if it works at all.

Rumors have been 40 CUs.
40 CUs plus 25% performance per CU from the die shrink should be a bit better than a 1070 at minimum (quick math below).
But matching/beating Vega (and at lower power) is actually not unrealistic if it has NGG working.
Big die Navi will surely top the 1080ti. Come on.
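
Sanity check on those numbers: 40 CU x 1.25 ≈ the throughput of 50 current CUs, which is roughly 0.9x a Vega56 (56 CUs) before clock differences, and Vega56 already trades with the 1070/1070Ti. So "a bit better than a 1070 at minimum" holds up.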

Probably same 295W TDP, actually. And I'd wonder about the drivers since it's a pro card.

>Big die Navi will surely top the 1080ti. Come on.
I'm not sure big die Navi is actually coming in 2019, though.

>Vega64 should easily outdo crossfire RX480s on average.
It didn't.
I was getting lower 3DMark scores and a noticeably noisier PC.
The thing is, crossfire didn't have the cards running at 100% all the time, so they stayed at 1266MHz the whole time. They were the AMD blower design too, mind you.
Vega, however, was heat throttling down to 1400MHz and sounded like a fucking jet engine.

The clock only increased from 1500-something to an 1800MHz peak boost for the 7nm Vega part, and 14nm Vega only had high voltage for the sake of yields. Even the 14nm Vega64 is rightfully a 250W part, which can be brought down even lower with a good die.
No way the 7nm part draws 295W for such a paltry clock uplift (rough scaling math below).
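
Back-of-envelope with the usual dynamic power relation P ≈ C·f·V²: +20% clock at unchanged voltage is only about +20% power, and if 7nm reaches those clocks at even 10% lower voltage, the V² term drops to ~0.81x, so 1.2 x 0.81 ≈ 0.97, i.e. slightly less power than before, not more.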

Vega64, reference design, comes with a bios switch. And default is 225W.
I can't imagine it going to 300W, as 225W was already too loud.
They just shouldn't have sold the air cooled version at all, and made WC the default on this shit.

>I'm not sure big die Navi is actually coming in 2019, though.
Yeah, it'll probably come out around the same time as the 3080Ti in 2020.
Doubt 7nm yields will be cost effective for large consumer GPU dies in 2019.

There are different stock power profiles for Vega. Pretty sure the turbo profile is 295W TDP.
I've seen 300W TDP quoted for the MI60 at that 1800MHz.

Idk. The FE is really beautiful. You'd think they could at least design a quieter blower.
Plus you can undervolt them enough to get them to reasonable noise even on a blower, just not really quiet.

lmao 3dmark scales crossfire a lot more than games. Come on, lad.
>Vega, however, was heat throttling down to 1400MHz and sounded like a fucking jet engine.
So undervolt it ...
And/or replace the cooler.

Attached: morpheus II v56 mod.png (796x378, 538K)

>you can undervolt them enough to get them to reasonable noise even on a blower
Well I couldn't.
I've tried. Was getting blue screens and system hangs all over the place.
It did have some effect, though, as the maintained clock was somewhat higher.

Let's be honest.
Remember how we were all waiting for third parties designs and how they didn't happen for a fucking year?
I'm sure Vega is just fine now on a real cooler, but GPP prevented vendors from doing so.
You might remember GPP. What do you think it was all about?

That 300W TDP was from leaks which also said it'd be 20 TFLOPS, not 14-something.

>lmao 3dmark scales crossfire a lot more than games
Doesn't apply to crossfire.
As I said, vega will drop off in frequency after a while.
The crossfire magic is that your 2 cards run max clock all the time, minus scaling losses.
I played a good bunch of Mankind Divided @ 4K on crossfire on High settings.
No setting on Vega would make it shut up.

>I've tried. Was getting blue screens and system hangs all over the place.
It's really tricky. It took me a few hours, and I hear it took some others hours too. I got a few hard crashes with no video output at all myself, which made it a pain.
You could try something like +15% power limit, dropping to 1050mV, 1000MHz memory, and -5% clock speed. Then keep trying 10mV lower.
I'd be surprised if you can't at least drop power like 15% while dropping clocks 5%. Having a bit lower clocks but actually pinning at them, at a cooler voltage, is better than a higher limit the card never reaches.

>As I said, vega will drop off in frequency after a while.
Yes, as pointed out many times in this thread. It hits thermal/power limits and throttles. The goal to get more performance out of Vega is to stop the throttling. On a blower that's extra hard especially while keeping it quiet, but still doable to a degree.

Anyway, usually when people buy a blower they do so with the understanding that it's either going to be loud, or they'll be replacing/modding the cooler themselves.

>when people buy a blower they do so with the understanding that...
It was the only choice for a fucking year almost.
Am I talking into the void?
GPP was all about preventing vendors from doing real cooling on Vega.
Nowadays Vegas are just fine.

Why were the 1070 and 1070 Ti such schizophrenic memes?
>destroys the vega 56 in some cases
>matches the vega 64 in a few cases
>falls flat and gets btfo by a vega 56 in some cases