Why did it suck? Wasn't vega supposed to be the next big leap in gpu tech?

Attached: AMD-Radeon-RX-Vega-64-Family.png (1280x720, 231K)

Other urls found in this thread:

youtube.com/watch?v=mhEu3RiCLpg&t=2612
techpowerup.com/240879/amd-cancels-implicit-primitive-shader-driver-support
32ipi028l5q82yhj72224m8j.wpengine.netdna-cdn.com/wp-content/uploads/2017/03/GDC2017-Advanced-Shader-Programming-On-GCN.pdf
rocm-documentation.readthedocs.io/en/latest/GCN_ISA_Manuals/testdocbook.html
computerbase.de/2018-05/mifcom-miniboss-titan-v-test/3/

A couple of promised key features never materialized. The head of the GPU division promptly going on an extended vacation, then leaving the company after launch, is probably the result of that.
It's still not that terrible overall, it just isn't great. A Vega56 with a little undervolting and tweaking will pull power like a GTX 1070 and outperform it, nearly matching the stock Vega64.

>Vega56
Lol, it's more expensive than a 1070ti.
Which features never came?

It doesn't necessarily "suck", it just consumes a shit ton of electricity.

Primitive shaders are the biggest of them.

Doesn't matter my parents pay for electricity anyway.

>Got one
>Had to push fans to 2800rpm to get anywhere near my stock RX480 crossfire.
>Basically unusable, unless headphones or streaming game.
>Mining for 6 month
>Still hasn't paid for itself.

It's a really overpriced piece of shit and is the only option from AMD if you want something slightly modern. The RX series is too low end and doesn't provide an actual middle ground. I hope something new comes from them so I don't have to go green.

>Primitive shaders
Can someone explain what is so important in these?

Then who cares?

Primitive shaders were promised as a driver-level feature.
In the end they only shipped as a new proprietary API that only works on Vega and will be changed for Navi.

AMD cards cannot utilize all of their shaders/CUs. Primitive shaders were meant to help the cards utilize these unused resources for geometry processing in order to increase polygon throughput. This would be especially useful for Gameworks- and tessellation-heavy titles.

Implementing them would come close to reworking the whole graphics driver and that is why they never materialized.
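
The gist of what primitive shaders were supposed to buy you is early discard: throw away triangles that can never produce pixels before they reach the rasterizer. Here's a toy sketch of that concept in Python — this is NOT AMD's pipeline or API, just the idea of culling back-facing and degenerate primitives early:

```python
# Toy sketch of early primitive discard: cull triangles that can never
# produce pixels (back-facing or zero-area) before rasterization.
# This is not AMD's implementation, just an illustration of the concept.

def signed_area(tri):
    """Twice the signed area of a screen-space triangle (CCW positive)."""
    (x0, y0), (x1, y1), (x2, y2) = tri
    return (x1 - x0) * (y2 - y0) - (x2 - x0) * (y1 - y0)

def early_discard(triangles):
    """Keep only front-facing triangles with non-zero area."""
    return [t for t in triangles if signed_area(t) > 0]

tris = [
    [(0, 0), (4, 0), (0, 4)],   # front-facing (CCW): kept
    [(0, 0), (0, 4), (4, 0)],   # back-facing (CW): discarded
    [(1, 1), (2, 2), (3, 3)],   # degenerate (zero area): discarded
]
survivors = early_discard(tris)
print(len(survivors))  # 1 of 3 triangles reaches the rasterizer
```

In a heavily tessellated scene most micro-triangles end up back-facing or sub-pixel, which is why killing them early instead of feeding them through the fixed-function frontend matters for polygon throughput.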

With manual tweaking and because of Freesync the cards are actually OK. The price and availability problem made them suck though.

I'm . To be honest, it's a scam.
It can run 3DMark OK and keep reasonable clocks for the duration of the test.
But beyond that, it just throttles back to 1350-1400MHz territory.
I got a 1080Ti after that:
Up to 1900MHz. Goes down to 1800MHz under heavy load for a long time.
30% more performance in benchmarks for the same power. In practice it's more like 50%, because the Vega throttles like hell.

> AMD cards cannot utilize all of their shaders/CUs.
ayyymd everyone

Honestly, they should just give up on GCN.
It's always been the problem; ever since the 290X the memory controller has been eating too much juice.
This architecture needs massive memory bandwidth.
They thought they solved it with Fury and HBM.
Then it was solved by doing mid-tier graphics cards with 2GHz GDDR5.
Then they thought HBM2 was the messiah, and it just didn't deliver.

You just can't feed that architecture fast enough.

It's clear as daylight. You get proportional performance increase with HBM2 oc, even with the GPU heat throttling.

Because some parts of it never materialized even as a Vulkan extension.
>Honestly, they should just give up on GCN.
GCN is an ISA.
With myriad implementations.

>It's clear as daylight. You get proportional performance increase with HBM2 oc, even with the GPU heat throttling.

I have a watercooled V56 with the 64 BIOS and this is only partly true. You get linear gains up to 1000MHz HBM. After that, the performance increase gets smaller and smaller.
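
That shape — linear gains, then flattening — is what you'd expect from a simple roofline model: performance is capped by whichever is scarcer, memory bandwidth or shader throughput. A sketch with made-up numbers (not measured Vega figures) just to show the shape:

```python
# Crude roofline sketch: fps is capped by whichever resource runs out
# first, memory bandwidth or shader throughput. All numbers here are
# invented for illustration, not measured Vega results.

def fps(hbm_mhz, compute_cap_fps=100.0, fps_per_mhz=0.1):
    bandwidth_bound = hbm_mhz * fps_per_mhz  # scales with memory clock
    return min(bandwidth_bound, compute_cap_fps)

for clk in (800, 900, 1000, 1100, 1200):
    print(clk, fps(clk))
# Gains are linear while bandwidth-bound (800 -> 1000MHz), then flatten
# once the shader engines become the bottleneck.
```

In practice the knee isn't this sharp — workloads mix bandwidth-bound and compute-bound phases — which is why the gains taper off gradually rather than stopping dead.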

Let me help you: you're running into latency settings hurting your gains.
Not something you can change, unfortunately.
You can on RX480s though, and it helps a bunch.

My HBM is always below 50°C, so the latency strap should be tight enough.

Let me be more specific (I'm )
Vega does this thing where it loosens memory timings if the HBM hits more than 80°C.
Hence why I set my mining one to a 62°C GPU temp, because that's where the HBM will hit those temperatures.
If it goes beyond that, even if the memory is reporting the same 1025MHz frequency, mining MH/s will just drop 10%.
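
The behavior being described — same reported clock, lower effective throughput — would look something like this. The 80°C threshold and the 10% penalty are this anon's observed numbers, not anything from a spec:

```python
# Sketch of temperature-triggered memory timing loosening: the reported
# clock stays at 1025MHz, but effective throughput drops once HBM
# crosses a temperature threshold. The 80C cutoff and 10% penalty are
# one user's observations, not documented behavior.

def effective_hashrate(base_mhs, hbm_temp_c, threshold_c=80, penalty=0.10):
    """Hashrate after any thermal timing penalty is applied."""
    if hbm_temp_c > threshold_c:
        return base_mhs * (1 - penalty)  # looser timings, same clock
    return base_mhs

print(effective_hashrate(44.0, 75))  # below threshold: full 44.0 MH/s
print(effective_hashrate(44.0, 85))  # above threshold: drops to 39.6
```

This is also why monitoring only the memory clock is misleading: the number to watch is the throughput itself, since the timing penalty is invisible in the reported frequency.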

>below 50C
No, not on normal cooling anyway.
Needs explaining.

He has a fucking watercooled 56.

Already explained in .
The card runs under water with an EK block and 600mm of rad space.

Well, I guess this answers OP.
Card needs to be loud as fuck or water cooled to perform as advertised.
That is, not that great.
Also, it's more expensive than NoVideo equivalents that don't throttle.

Retail prices are higher than the 1070ti right now because of miners. MSRP was basically the same as the 1070 iirc.

Didn't see. I wish Jow Forums identified posters like Jow Forums does.

Yeah the prices suck atm. I got mine at launch and with the waterblock it cost the same as a 1080 while delivering 1080 performance. If I didn't already have a good Freesync monitor, I would have bought an Nvidia card though.

raja pooped

Just wait until he takes a dump at Intel lol.

Well, I jumped ship and went 1080Ti after buying a Vega 64 at launch for MSRP (630€).
Even though I have a 4K Freesync monitor.
The 1080Ti is just silent as fuck and much more powerful, for 775€ at the time.
Here I am, wishing for an AMD card that doesn't suck, that I can replace it with and get Freesync again, but I'm not really encouraged by the latest news.

They never shipped the API. I can only assume the new guy in charge cancelled it all.

PS didn't do geometry processing on the unused CUs. PS promised early geometry discard, so doing that on CUs would be impossible. Also, that can already be done with OpenCL, so AMD wouldn't need new hardware. The frontend has shaders too and PS replaced those. PS were new blocks that could process custom optimized shader code that would run significantly faster than the older shaders.

youtube.com/watch?v=mhEu3RiCLpg&t=2612

>AYYMD Pooga 56 Nano
>It lags so much, even the Youtube chat can see it and everyone says lag

MY FUCKING SIDES

Check out AdoredTV's talk about this.

techpowerup.com/240879/amd-cancels-implicit-primitive-shader-driver-support

Dumbasses on Jow Forums can't even google a source. Dumb AMD drones.

AMD just said fuck it when it came to their GPUs and went all in on their CPUs. AMD's graphics will always play second fiddle to Nvidia.

And that's why Nvidia monopolized the GPU market.
Literally no new GPU announced in 2 years now.

*gcn

It doesn't suck, it works as advertised.

It is just retarded fanboys who thought it would be a 1080Ti killer.

Even a slight glance at microarchitecture made it clear at best it would merely match the 1080Ti at higher power consumption. At worst, it would be slightly faster than 1080.

>It doesn't suck, it works as advertised.


*POOR VOLTA*
Just stop.

I'm hoping now that AMD has Ryzen on track that they're starting to look at re-vamping their entire GPU division.

I never had an issue with my unit. Just undervolt it a bit and crank up the loaded fan profile and it will reliably hit 1.6GHz even after a 24/7 stress test of Furmark/3DMark.

The problem is that AMD RTG cranks the voltage up too high and is too conservative with the fan profile. The same issues plagued the reference 290Xs.
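
On Windows you'd do the undervolt in WattMan; on Linux the amdgpu driver exposes it through sysfs (kernel 4.17+ with the overdrive bit set in `amdgpu.ppfeaturemask`). A rough sketch — the clock and voltage numbers below are placeholders, not known-stable Vega values, and every card's stable point is different:

```shell
# Hedged sketch: undervolting via amdgpu's sysfs overdrive interface.
# Requires root and an overdrive-enabled amdgpu; values are placeholders.
CARD=/sys/class/drm/card0/device

# Take manual control of the power states.
echo manual > "$CARD/power_dpm_force_performance_level"

# Override the top sclk state: "s <state> <MHz> <mV>", then "c" commits.
echo "s 7 1590 1040" > "$CARD/pp_od_clk_voltage"
echo "c" > "$CARD/pp_od_clk_voltage"

# Read back the table to verify the new clock/voltage pair took effect.
cat "$CARD/pp_od_clk_voltage"
```

Lowering the voltage on the top state keeps the card inside its power limit longer, which is exactly why undervolted Vegas hold higher sustained clocks than stock ones.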

I think David Wang won't let the halfassery in RTG continue. He will rule those Pajeets with an iron fist.

> Butt devastated fanboy detected

Sorry that Vega 10 wasn't the Pascal killer and didn't introduce a price war.

I don't feel bad if most of them end up being bought by miners at massive markups.

Let me tell you something: Ampere/Turing, whatever Nvidia wants to call it, will be a minor bump over Pascal at higher price points.

The days of rapid bumps and cheap high-end GPUs are over. It has been that way ever since GCN 1.0/Kepler.

Attached: _1889788_laugh300b.jpg (300x180, 15K)

Break out the whips. Keep those Pajeets in line. It's nonsense that a big company like AMD is always playing 2nd fiddle to someone. Intel is still rocking on their heels after Ryzen gen1. 7nm Gen 3 Ryzen will be the knockout. I want Nvidia in the same position. If only once.

Attached: 1517172034882.gif (457x454, 2.51M)

intel/Nvidia system is the only way. gave up on AMD so long ago they are shit,

Navi is going to kill Nvidia in the discrete GPU market by making mainstream discrete GPUs obsolete. Navi is going to be mostly next-generation iGPUs and semi-integrated solutions. Intel will ramp things up as well to keep up in the OEM world.

AMD RTG is going for the long game and ironically back to its roots when it was ATI. They are going after OEMs and mainstream shit.

The discrete GPU market is going the way of discrete audio cards.

Nvidia even knows this, which is why they have been trying to move away from discrete GPUs as their bread and butter ever since Fermi.

>AMD makes ad about their next GPU being a volta killer
>I-It works as advertised. D-Dumb false flag nvidiots. AMD n-never said that.
The double think in your amdrone retarded brain must be driving you insane.

32ipi028l5q82yhj72224m8j.wpengine.netdna-cdn.com/wp-content/uploads/2017/03/GDC2017-Advanced-Shader-Programming-On-GCN.pdf

A lot of cores in GCN go unused and need a lot of special programming.

Gaming drivers need too much work; AMD prefers heavy optimization for compute on Vega, in the C++ ROCm platform.

rocm-documentation.readthedocs.io/en/latest/GCN_ISA_Manuals/testdocbook.html

>Why did it suck?
Price. The cards didn't suck - for miners. They were a massive success in the marketplace and a gigantic success story for AMD. Yes, really. The cards just weren't available in the "intended" market - PC building gamers - most of the time and the brief periods they were available they were beyond stupidly overpriced. And they still are.

I know Jow Forums likes to pretend these cards are failures. And that's probably right from a gaming perspective. AMD's numbers on the other hand tell a different story. It's interesting to note that AMD actually admitted they estimated 10% of their total (not GPU segment) revenue came from mining.

> /v/tard is upset that his niche is no longer relevant.

The Vega microarchitecture might still end up being a Volta killer at general compute and machine learning stuff.

Attached: 1489347546806.gif (500x500, 1.86M)

I agree that iGPUs are becoming more and more relevant and powerful, but I don't see dedicated GPUs going away completely. Look at games coming out. They just keep pushing the boundaries further and further with game engines. Not to mention stuff like CUDA/OpenGL acceleration.

>VEGA WILL KILL VOLTA IN THE FUTURE
AMDrones are really pathetic.

Vega needs to be undervolted to get any good performance out of it. Stock, it's just a mess. But with some tuning it works as advertised. I like the card; it was a bit pricey but the performance is there.

computerbase.de/2018-05/mifcom-miniboss-titan-v-test/3/

>Whatever the cause, the Titan V runs circles around other graphics cards, be it Nvidia's own GeForce GTX 1080 Ti or AMD's Radeon RX Vega 64. In the Baikal benchmark with the AMD-developed Radeon Pro renderer, the Titan V finishes its work in exactly 30 seconds. The GeForce GTX 1080 Ti needs one minute and eleven seconds for the same ray-traced image, more than twice as long. The Radeon RX Vega 64 finishes the work in one minute and 14 seconds. In the other tests in the Baikal benchmark, the Titan V is also consistently more than twice as fast as the GeForce GTX 1080 Ti.

>Beating Volta when it's half the performance in their own biased benchmark

TOP KEK

It's not miners, it's the fact that HBM2 is so expensive. Barely anyone mines with Vega because there simply aren't many Vega cards.

Bandwidth starved, primitive shaders ended up becoming explicit rather than implicit due to the effort required, the tile-based rasterizer not functioning as intended, lack of a pre-fetch thread in non-low-level APIs.

With all that said, it's not that Vega itself sucked, it was just bad for the price, especially sitting next to Pascal GPUs that could provide similar or better performance for the same or lower price while simultaneously producing less heat. The memory cartels really fucked Vega hard, as AMD was planning on paying less than half the current price for a 4GB stack of HBM2.

My Vega runs 1560s when I undervolt it. Maybe you have a dud or you didn't bother to undervolt.

/v/tard still convinced that Nvidia and AMD RTG give a shit about their increasingly irrelevant market.


Minerfags and the mining craze are the clarion call of your irrelevancy.

Attached: Comfy Cow.jpg (322x337, 41K)

[moving the goalposts]
[autistic screeching]

They needed to produce 10 times the number they actually produced. That would have kept prices down in the key window they had to sell before Nvidia reacted.

Maximum triggered /v/tard getting pwned so hard he resorts to shitty ad hominems.

Attached: Monster Burger.jpg (474x324, 26K)

The Vega GPU is bigger than the 1080Ti's GPU and uses expensive HBM2.

>resorts to shitty ad hominems
like the one you replied with, while continuing to ignore the issue that AMD falsely advertised their GPUs

Drink bleach, pajeet.

>implying that I'm a fanboy from either camp

> implying that I even give a shit about some shitty, poorly-coded rehashed gayming titles that required comically overpowered hardware to run at all which fanboy wank over

Attached: 1268332374499s.jpg (127x117, 5K)

Keep your insanity going, pajeet. Also you brought the games into this and every benchmark shows Titan V crushing vega in pure computational work.
Keep sucking that amd dick, hope you get rupees for it.

>The discrete GPU market is going the way of discrete audio cards
I agree but it ain't going to be Navi and it ain't going to happen any time soon. Compare the current APUs from AMD and the semi-integrated Intel solutions to just low-end dedicated GPUs and ask yourself if we're even remotely close. Audio cards and network cards are pretty simple things compared to a GPU and it doesn't take much to add that to a motherboard. We're fucking far away from 1080ti-level integrated graphics performance and that matters a lot when 4k is becoming common. Look at the ports on B350 motherboards, they all have HDMI 1.4. Just the basic port limits you from running even a plain desktop at 4k 60Hz.

This is why dp boards were a better idea

Raja did nothing and expected HBM to do all the work for him

you mean a shitload of their market share? GPU sales are a fucking huge chunk of both of their earnings

Not for long though, iGPUs are becoming good enough for the masses. There will be no need for a discrete GPU at all. Mainstream GPUs make up the lion's share of discrete revenue. Nvidia has been trying to move away from discrete GPUs as their bread and butter for a reason.

The 2020s will be the decade that sees the end of mainstream discrete GPUs.

Mainstream discrete audio cards died in the 2000s.

Sad /v/tard needs to use a GV100 "reject" to justify the relevancy of his silly gaming benchmarks.

> BUT THE TITAN V CRUSHES THE VEGA 64 AT GENERAL COMPUTE!

No shit, it's because it's a GV100 "reject" with non-certified drivers/firmware.

Attached: 480 SLI.jpg (415x275, 20K)

Yes it did. It'd have used shaders to detect invisible geometry, making use of unused compute to render faster. Silly

Impressive Wang

Not for their current CPU though

I feel the same way. I have a good feeling about David Wang

>unironically buying anything AMD

$3000 GPU is better than $500 GPU. Wow!

You guys are fucking retarded.

This is RTG. It's in, like, Shanghai or some shit. Literally 99% Chinese.

Attached: vega-design-team.jpg (1024x768, 147K)

>not having gpu-accelerated hdr
>not having free sync
>not having overclocking tool
>not having capturing with hevc codec
I'm happy with my vega 64
leave me alone

>pushing the boundaries further
This hasn't been true for a few years now. Look at the games making the most money. Graphics only have to be good enough and after that it's about content. PUBG runs like ass and looks like ass and yet it sparked a whole new genre.

>We're fucking far away from 1080ti-level
AMD won't be targeting 1080Ti level performance with these APUs. They'll be targeting RX 580 performance for the entry level and that's 100% achievable, especially if they use MCM. At that point, we'll probably see OEMs standardize their products around laptop boards that can fit both in gaming laptops, all-in-ones, and small desktop enclosures. I expect they'll do that because they can cut costs a lot if they do. That in turn will make these systems much better value than what we see now. There will still be the enthusiast markets, but those won't be as profitable as before. Most people who want to play games are not enthusiasts and don't want to pay much or learn how to put their computer together properly, just as most drivers don't buy a car because they like cars and only have basic knowledge of how they work. Gaming computers are becoming a commodity and AMD is the only company positioned to treat them as such. I believe that's why Intel is resurrecting its GPU department; at the moment, AMD is in a position to make a ton of money on these small, cheap, "good enough" gaming boxes.

>mainland chink "engineers"
>drivers outsourced to India
The memes write themselves.

literally kys
you're the most retarded amdrone on this board

clueless /v/tard keeps calling opposition "AMDRONE!/NVIDIOT!"

"MAH FPS! MAH GAYMES!"

^^^^^

Bingo, that's exactly why Nvidia has been moving away from discrete GPUs for the past decade.

They've already seen the writing on the wall and their senior members are ex-SGI. They've already seen first-hand what demand destruction does to a market.

It didn't suck. The marketing just went completely retarded and they didn't learn from their mistakes with previous beefy cards. Vega is by no means bad; it can compete with the 1080 and can sometimes challenge the 1080Ti. The 7nm gaymen version at the end of this year or the next will surely beat the 1080Ti, but by then leatherjacketman can respond accordingly.

Wake the fuck up, Brian.

There isn't going to be any FineWine either, and tests have been done to show this.

>2020s will be the decade that see the end of mainstream discrete GPUs.
This. The 2700U is already rivaling last-gen discrete mobile GPUs. With the 7nm shrink, it could get close to a 1060 on laptops, or maybe even more. With IF, they can just put smaller, more efficient dies in one package to further close the gap. This is all mobile; they could go beefier on desktop and kill the midrange and below.

>It's a really overpriced piece of shit
Not anymore it seems, prices are coming back down closer to the original retail level

At its launch retail price it was a decent buy and performed well for what still seems to be a half-finished product

kek

So.. Uh

Will the AMDGPU linux kernel driver make decent use of ASP or is it never ever?

Attached: 1523197729181.png (589x440, 393K)

>With the 7nm shrink, it can go close to a 1060 on the laptop or maybe even more
You're delusional. It can't even match an MX150.

Who knows, there's an ocean's worth of driver tech that was supposed to come with Navi that just up and vanished. You would think this would be a priority on their part, but apparently not. Anyway, I wouldn't count on it; they seem to be switching gears for Navi.

>2800rpm
the vram is a hard drive or something?

It doesn't matter if AMD makes a better GPU than Novideo, as shown by the 7xxx, R9 2xx series and the old RV chips; gaymer subhumans will always buy Nivea. That's why I really hope AMD leaves the spastics market and focuses on the professional one, so the faggots have to buy a shitty 1050 Ti for over 200 dollars and complain about high prices.
>fucking spastics plague

King Poo sabotaged RTG in an attempt to force Su into selling it to Intel. She didn't, so he left on his own.

>7xxx
GREAT card, if only the drivers didn't break every single update. My system stability increased massively after moving to Pascal, you AMDrone.

Uh, no. This never happened. Raja's been in the industry for a long time working in technical management. He would know very well it takes more than just the CEO agreeing to sell half the company's patent suite, much less kill their entire mobile and embedded SoC business. What's more likely is that Raja is responsible for Intel G with Vega inside.

I hate how babbies listening to unhinged babbies have flooded the consumer base. Go back to r/AMD