Did anyone buy RX Vega? I don't hear much about it at all. Seems like everyone ended up with GTX or RX 4**/5**

Attached: vega.jpg (800x453, 44K)

My friend did. He doesn't like it, but it's whatever for him.

Is HBM2 a meme?

Yeah, but mostly because of the mining craze that made Vega virtually unavailable. That's how I ended up with my RX 580 too.
But a fair number of people got Vega before mining blew up, so there's that.

No, but it should be used carefully by GPU designers because it's special in some respects - like the very constrained availability that makes using it in a mainstream GPU like Vega a supply nightmare.
Nvidia, in contrast, went the way of using GDDR6 in its high-volume consumer GPUs and only uses HBM2 in the professional line where no one gives a shit about the price being inflated a bit to compensate.

I think almost anyone would buy a high end NVIDIA over this

Anyone with a noggin yeah

I have a Vega 64 :^) had to lurk and be quick to get it at launch and under MSRP.

HBM (in all of its forms) is the way graphics memory will go. As that user says, AMD probably expected supply to ramp up faster than it has, and for a high-volume part like Vega that is a major issue.


The mining craze made AMD a shitload of dosh.

>AMD making shitload of dosh
O i am laffin

I'm waiting for it to go under MSRP. I really do want one, though my 390 has been holding up pretty well.

No, it's got several times the bandwidth per package of even GDDR6 (rough numbers at the end of this post). It also has better latency characteristics than DDR4, to the point that someone proposed using HBM as an L4 cache. It's the best memory for both CPUs and GPUs, and it'd be amazing to see it in something other than Vega and Titan V.
The place I see it hitting the hardest would definitely be mobile. Zen 2 + 7nm Vega 16 + HBM3 in a single ~65W package would be a killer high-end mobile chip, outperforming every other solution (save for GTX Max-Q solutions) on the market, even assuming both Intel and Nvidia can crap out a +10% performance-per-watt mobile line in the next year.
The 2700U is a great performer vs equivalent 15-25W Intel CPUs, even with heavily gimped single-channel memory. It also punches way above its weight when given dual-channel memory, especially in the GPU department. A node shrink and core redesign with the high bandwidth and low latency HBM provides, along with a beefier GPU, would absolutely wreck the mobile competition (especially considering HBM consumes less power than DDR, allowing for even better battery life).
I can see that product coming by 2020 as a halo part no OEM picks up, except for a few kickstarter'd SBCs with too much z-height to fit in a reasonable laptop enclosure (looking at you, Udoo and Sapphire). The lower-powered versions will probably be stuck in some 14" ThinkPads, though.
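Rough per-package numbers, as a sketch (pin speeds assumed as typical launch-era figures, 2.0 Gbps for HBM2 and 14 Gbps for GDDR6):

# peak bandwidth (GB/s) = bus width (bits) * per-pin rate (Gbps) / 8 bits per byte
def peak_gb_s(bus_width_bits, gbps_per_pin):
    return bus_width_bits * gbps_per_pin / 8

print(peak_gb_s(1024, 2.0))  # one HBM2 stack, 1024-bit bus: 256.0 GB/s
print(peak_gb_s(32, 14.0))   # one GDDR6 chip, 32-bit bus: 56.0 GB/s, so roughly 4-5x per package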

It's really good but not cost effective

>tfw still on an HD 6670

Attached: Q5UAXpY.png (542x602, 43K)

Imagine if you'd bought a 290X in late 2013/early 2014 - you'd still be smiling.

I bought a Vega 56 the moment they launched; they sold out in 18 seconds. It's been a great card - I run a 3440x1440 ultrawide and have no problem pushing 60-75 fps in every game I've thrown at it.

Would probably have been better served ponying up an extra couple hundred for a 1080ti but I am a shameless AMD fanboy.

Attached: 1489981669274.png (1920x1080, 2.38M)

too expensive where I live, user. I built this HD 6670 system in 2013 and it cost the equivalent of 650 bucks.

Damn, donate some blood for a few weeks and get a 1050 Ti

I bet it's South America.

yes
you don't get money for donating blood here

I'm still using an 8GB R9 390 I bought in 2015. For 1080p it's still plenty capable.

Should mention I paid like $280 for it.

I remember everyone saying they were about £50 too expensive when they were first released and everyone that was gonna get one was waiting for third party cards with better cooling.

Then the mining craze hit and prices just went up and up, and it seemed that everyone who paid SRP on release day actually made the right decision.

Technically if you have a FreeSync monitor and want to use that feature, they are a better buy than the nVidia equivalents, even if at face value they are a tad worse at the same price point.

No thanks OP, don't want a housefire at this time.

Power bill in August is high enough, and besides AMD screen tearing on linux is just bloody awful.

>besides AMD screen tearing on linux is just bloody awful
Not using Wayland baka desu senpai

gr8 b8 m8, I r8 8/8

Attached: 1482171102026.jpg (514x600, 41K)

well what do you expect since they cost a fortune?

>when your $3000 card gets btfo by the more efficient vega6gorillionwatts^tm in mining but you still have your gaemen benchmarks

Attached: eksdeeface.jpg (200x200, 10K)

"housefire" eh?

Attached: housefire.png (842x449, 73K)

Yes. GDDR5X/6 cards reach similar speeds.
Sounds like you don't play games. I would probably get a more performant card than that for 20-30 €.
In the midrange you should have bought in 2014-15. The only difference now is more efficiency.

I have a 1080 Ti and an RX 480 8 GB; both are good for their use cases. The 1080 Ti is a 4K powerhouse, the 480 is a budget card that can still handle older titles in 4K and offers enough VRAM. It is also very efficient if properly undervolted.

which model of rx480 are you using?

Miners basically killed this card for consumers by pushing the Vega 56/64 into the $1000 price range for an extended period. Prices didn't even begin to normalize until May of this year, and by then whispers of Nvidia's new cards were putting a halt to most consumers' upgrades.

No one with a brain would use a Titan to compete with any of the Vegas, just lol. A 1080 non-Ti outclasses them both.

Sapphire Nitro+ 8G

Seemed like the best option for dual fans that would fit in my HTPC - it takes cards up to 27cm and this one is 24cm. I looked into 1060s but they were more expensive with only one fan.

>A 1080 non-Ti outclasses them both

Attached: 745.png (327x316, 211K)

undervolt and OC nibba and you get basically 1080 (non ti) performance

Yes, I bought one for MSRP prices, before miners took them all.

Attached: CIMG5195 res.jpg (2880x2160, 2.54M)

I have to admit, it's a handsome-looking card. I've always liked brushed metal (especially on round things like knobs) and the understated red accent really sets it off. It's the kind of card you can post pics of in 10 years and it will still look good. I kinda want one...

The silver is actually painted on. I know someone who chipped the paint and it's black beneath. But yes, it's easily the most beautiful card I've ever owned.

Attached: CIMG5197 res.jpg (3413x2560, 3.25M)

>buying a blower card
>painted shroud
Christ above, man, is it that hard for AMD to make their own 'Founders Editions'?

cope away, these cards run hotter, need more power and play worse in gaymes

The blower isn't that big a deal. Whenever it has to spin up, I'm playing a demanding game, at which point I'm wearing headphones. I basically don't hear it at all.
As for the painted shroud, it's still metal. Aren't Nvidia's founders cards made out of plastic?

>tfw still /Fury X/
>tfw still quiet and comfy @2k

You can't change my mind.

>Two 8 pin power connectors
even the 1080ti founders edition didn't need that, JUST

Questions for the undervolting experts:

>bought an RX Vega 64 Asus ROG STRIX version (heavily discounted in my country and also the only one in stock) (apparently it's one of the worst Vega 64 cards according to the internet)
>can finally play games at ultra settings and it makes gaming so much more enjoyable
>it's loud as fuck out of the box even though the GPU core temperature is 65C (think the VRM or memory temps go a lot higher though)
>too afraid/stupid to undervolt
>reduced the power limit in the Radeon graphics control panel instead
>now the fans are standing still half of the time and spinning very slowly the other half
>card stays at base clocks now (around 1250 core and 800 HBM) instead of 1600 core and 945 HBM
>don't care because it's a significant upgrade from my previous card, an RX 480 reference card (which has been passed down to my other computer)
>fps loss from the reduced power limit isn't that significant and I'd rather take slightly lower FPS than a loud as fuck card

What's the best option here? Should I research how to undervolt, or maybe even remove the fan shroud and attach two 120mm fans to the card's two 4-pin fan headers (Asus says they can be temperature-controlled by the graphics card)? See pic related; it looks kind of easy to remove the fan shroud.

Attached: 5406279_6[1].jpg (1296x900, 175K)

I'm still using a gtx480 I found in a closet at school

Don't fuck with undervolting just yet. Try adjusting the fan curves. My Strix 1080 was getting up to 80C before the fans even kicked in, which I felt was absolutely retarded. I went and set them to ramp up much sooner and now they stay at roughly 60C when gaming.
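If you're on Linux, the amdgpu driver also lets you pin the fan speed by hand through hwmon; a minimal sketch (the hwmon index varies per machine - check yours - and this needs root):

# force manual fan control on an amdgpu card (hwmon3 is an assumption)
HWMON = "/sys/class/drm/card0/device/hwmon/hwmon3"

def set_fan(duty):  # duty cycle, 0-255
    with open(HWMON + "/pwm1_enable", "w") as f:
        f.write("1")  # 1 = manual, 2 = automatic
    with open(HWMON + "/pwm1", "w") as f:
        f.write(str(duty))

set_fan(96)  # ~38%, enough that the fans never fully stop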

Sell it and buy a nice Nvidia card that doesn't go WHIRRRRRRR upon booting up.

i feel ur pain
- fellow HD 6870 user.

All new cards have this retarded feature of fans not spinning until the cards hit 60c. I keep my fans running all the time, my 1070ti doesn't go above 62c in Tomb Raider 2 maxed out.

I am doing a 1650 MHz core clock at 1115 mV and 960 MHz HBM @ 1000 mV with a 50% power target and it runs great.
You can undervolt the core to 1000 mV if you put it at like 1500 MHz.
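On Linux you can feed those exact numbers into amdgpu's overdrive table; a sketch assuming the overdrive bit is set in amdgpu.ppfeaturemask and that 7/3 are the top sclk/mclk states (cat the file first to check yours):

# needs root; the state indices below are typical for Vega but are an assumption
OD = "/sys/class/drm/card0/device/pp_od_clk_voltage"

def od_write(cmd):
    with open(OD, "w") as f:
        f.write(cmd)

od_write("s 7 1650 1115")  # top core state: 1650 MHz @ 1115 mV
od_write("m 3 960 1000")   # top HBM state: 960 MHz @ 1000 mV
od_write("c")              # commit the new table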

Laptops have that problem too. They want to sell them as quiet, so they disable the fans until the CPU is cooking, then they come on at 40%. Then you wonder why you hear of people's cards dying two years in.

I wonder how - the Fury is shit, consumes a lot and has a laughable 4 GB.
I hope AMD brings out a 680 next year with almost-Ti performance for up to 300 €.

you need to go back

Runs fine, for me. Didn't like Witcher 3, thanks to Goyworks, I'm sure, but otherwise, it's actually been an extremely solid card. I love it.

t. Podcast goy who likes it quiet

Witcher 3 runs badly even on NVIDIA. My 1080 Ti handles 60 FPS with Hairworks, but there is no real difference.

It was overpriced and I would have rather gotten a 480 8 GB.

RIP Terascale.
I still remember when the 2900XT was released. 1GB of VRAM was impossibly huge.

Attached: MSI2900XT-16.jpg (700x428, 110K)

hey that was me and im still using it !

>Rx680 that shits on the 1080ti for 300 quid
lol
there is a greater chance of me getting sucked off by every actress in Hollywood

Got a Vega 56 for 409€, watercooled it and it easily outperforms a GTX 1070 or GTX 1080 which would have been significantly more expensive.
Also I mined some imaginary coins with it, and the resale value is still more than I paid for it new.
Thinking about it... now might be the time to sell it

It was a pile of shit.
>TFW HD2400 Pro until 2013
CoD4 ran at 1024x768 at 30 FPS on low.
I wish I had invested 50 € into a 9600 GT or something back then.
It will happen someday. I already have a 1080 Ti, but I want as much performance as possible in my HTPC too, so my BF can use it.

no it's just expensive

Attached: 828-vega-64-pcb.jpg (720x594, 235K)

Nope, HBM2 is the future of high-bandwidth needs.

GDDRx is a dying horse.

The problem is that gayming isn't exactly the most demanding application for memory bandwidth.

>BF
kys both of you

Got the Vega 64 at launch for gayming usage and I had a quality FreeSync monitor on hand. Never regretted it.

Sure, it isn't going to beat a 1080 Ti, though it might match it under ideal conditions.

Yes, GDDR is cheaper and it will be around for a long time still. There is 16-20 Gbps GDDR coming; 1 TB/s is possible.
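For scale, the same back-of-the-envelope math shows where 1 TB/s comes from (384-bit bus assumed for illustration):

# 384-bit bus * 21 Gbps per pin / 8 bits per byte
print(384 * 21 / 8)  # 1008.0 GB/s, about 1 TB/s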
Yes, but not yet, because it's too expensive to produce.
Kiss your snake, no, I'll kiss and suck his ;)

GDDR6 is just about as expensive (it needs extra traces/PCB layers to reach those insane clock speeds on a narrow bus).

GDDR2/GDDR5 were in the same boat as HBM when they were introduced. It just takes time for production lines to ramp up and the technology to mature so it becomes easier to fab in bulk.

they did, they literally sold every single RX 580 and Vega from 2017 to early 2018

HBM2 has been such a disappointment. HBM3 is expected to be a higher-quality product in regards to yields, but HBM1 and HBM2 have seriously fucked AMD beyond the miner craze. AMD needs to stop fucking around and focus on the performance per watt of their products so they aren't forced to move away from GDDR modules.

I really hope Navi is not a failure. They are rumored to have fucked up Vega's development for Navi thanks to Sony's investment in custom chips for the PS5, but if Navi is shit then PC gaming prices are absolutely screwed for the foreseeable future.

False - GDDR6 is less expensive than HBM2. It's only about a 20% price increase over GDDR5, whereas HBM2 is far higher, iirc about 50-80% higher in price.