Shitty color compression, terrible drivers and hbm is what keeps amd behind nvidia

Attached: small_radeon_vii_2.jpg (626x429, 56K)

it's a stopgap gpu and they've already abandoned hbm. what's your point?

hbm on consumer cards was a mistake

True but without it they wouldn't have been able to compete as their core was too weak

It is nice card. Buy one today sirs.

>Shitty color compression
Actually AMD has better color compression and allows 10-bit output on consumer GPUs, unlike jewvidia
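For anyone lost: "color compression" here means delta color compression (DCC), which both vendors use to save memory bandwidth by storing most pixels of a tile as small differences from their neighbours. The real hardware algorithms are undocumented, so this is only a toy sketch of the delta idea in Python, not either vendor's actual scheme:

```python
def delta_encode(pixels):
    """Encode a row of 8-bit pixel values as first value + wrap-around deltas."""
    if not pixels:
        return []
    out = [pixels[0]]
    for prev, cur in zip(pixels, pixels[1:]):
        out.append((cur - prev) & 0xFF)  # modular delta always fits in one byte
    return out

def delta_decode(encoded):
    """Invert delta_encode: accumulate deltas back into pixel values."""
    if not encoded:
        return []
    out = [encoded[0]]
    for d in encoded[1:]:
        out.append((out[-1] + d) & 0xFF)
    return out

# a near-flat row encodes to mostly zeros, which packs into far fewer bits
row = [100, 100, 101, 103, 103]
print(delta_encode(row))  # [100, 0, 1, 2, 0]
```

Flat fills and smooth gradients become runs of near-zero deltas, which is why DCC helps most on typical game frames and does nothing for noisy content.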

>design a video card around work and gaymen
Why make your flagship GPU a jack of all trades and not just have it focus on one thing? No wonder the R7 was met with disappointment.

gcn stronk

>color compression
is this a new meme?

>Shitty color compression
no.

nvidia shills heard people complaining about colors and are trying to reverse meme it onto AMD, gullible nvidiots will believe them

much like the 1650 or the galaxy fold, this is an experimental card. It "proves something". You're not supposed to actually buy it. Only idiots get it.

I've got no problem with HBM but everything else is awful. AMD CPUs are fucking great though, I love my 1600X.

But I'm using 10-bit color with my Vega 64 right now

Basically every meme against amd cards for a long while has been a reverse meme about actual problems Nvidia has had.
It's just a shame that the radeon vii had to end up hot, loud, and power hungry, with only beta drivers available to reviewers.

I bought the Vega 64 and I think it's great. Sapphire, three fans, never overheats in my case. I also didn't even consider Nvidia since I want to reserve the possibility to fully migrate to Linux at some point instead of just dual booting.
AMD should just keep HBM in my opinion. It has tons of potential and the biggest con is that it's expensive at the moment. If production scales up the price should drop, which then shows up in the GPU price.

problem with hbm is that it's useless outside AI dev
video editing, gaming, and 3d work aren't very reliant on it, and gddr6 is enough

>problem with hbm it's useless outside AI dev
it could be sick if they used it to cram gpus with dedicated ram into places they shouldn't (eg: laptops or under the giant heatspreader of a threadripper)

>Why make your flagship GPU a jack of all trades
Money

>terrible drivers
It's not 2010 anymore, retard.

>color compression
False

>hbm
Also false, it only increases cost. You don't see performance improvements in games because GCN has a bottleneck and it's not in memory, THAT is what keeps AMD behind nvidia.

>10-bit output on consumer gpu
Except that doesn't work at all if you want to use the Chromium browser which - for some reason - can't into 10-bit. And it's the only one. Why? I don't know why. But it's kind of a problem. Perhaps I'll bother to show a screenshot.

user, why would you need to use a browser in 10bit? you realize 10bit is for freelance pros who can't afford pro cards, right?

>why would you need to use a browser
what kind of foolish question is that? accessing 4chan, for one.

pic related is firefox on the left and chromium on the right

>for freelance pros
don't care who it's supposedly for. I have a 10-bit monitor and a card that can do 10-bit. of course, using that combination means it's impossible to use chromium.

as for your really stupid question about why one would need a browser "in 10bit", what exactly do you suppose I do? restart X every time I'd like to browse the Internet and restart it again when I'm done?

Attached: chromiumfuckedup.png (800x450, 308K)
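For what it's worth, the reason switching means restarting X is that 10-bit output on X11 is normally enabled as a 30-bit default depth in xorg.conf, which applies to the whole session (the section identifier here is hypothetical):

```
Section "Screen"
    Identifier   "Screen0"
    DefaultDepth 30
EndSection
```

That gives every client a 10-bit-per-channel framebuffer, which is exactly what trips up applications that assume 24-bit visuals.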

At least it had drivers. Nvidia sent the 1650 with no drivers so people couldn't review them.
Really the biggest issue is how much the integrated video decoder sucks ass. I've noticed most applications like browsers and video players just use software rendering since it's so buggy, and when it does work it uses shitloads of power. Everything else is in pretty good shape though.

what is the cheapest VA 27inch 1440p monitor?

This card btfos the RTX 2080 in price for performance since it's cheaper and offers the same shit, if you live in Europe or Asia

Attached: 1557139580139.jpg (819x1024, 71K)

Chromium has lots of issues with any display that isn't super common. I remember trying it with a 144 Hz monitor and it'd only work in Windows, but still output 60 Hz on Linux.
amazon.com/dp/B07JVQ8M3Q
newegg.com/Product/Product.aspx?Item=N82E16824022558
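On Linux the refresh rate is at least inspectable and settable from userspace with xrandr; a sketch, where the output name `DP-1` and the mode are assumptions (run `xrandr -q` to see your own):

```
# list connected outputs and their available modes/rates
xrandr -q

# explicitly ask for 1440p at 144 Hz on that output
xrandr --output DP-1 --mode 2560x1440 --rate 144
```

If xrandr reports 144 Hz but Chromium still renders at 60, the problem is in the browser's compositor rather than the display configuration.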

it's about 15% cheaper here than 2080. Can confirm, wouldn't spend my money on either though.

>Shitty color compression

[citation needed]

based, anyone buying anything other than a Vega 56 for $300 is a degenerate
but I just dislike how lowkey all the big ''tech''tubers shill nvidia, the Radeon VII is a great deal compared to the MEME2080

AMD is better than Nvidia at the moment. And Nvidia and AMD are both in it because, not necessarily because of Nvidia or AMD's GPUs but because I think, in order to compete, if you try to use a GPU that is over ten percent inferior that has half the performance, then you run into a very serious, serious bottleneck that doesn't look pretty on the screen because of the image itself.
So, you know, you may as well go back to the old days of, 'Oh, he's going to blow us right out here.' And so it's a very serious problem, if you look at it in simple terms, and it's going to cost you $150,000-$150,300 more than the GeForce GTX 980 Ti, then you know that you have to give it up because of performance; that's how NVIDIA has been at this.

The R7 and the 2080 are neck and neck at 4k...

But the truth is, the 1660 Ti is the only card worth buying this generation (it will do 1440p). It costs less than half as much.

>Chromium browser
There's your issue.

>terrible drivers
Yeah, proprietary blob without DRI is much better than native drivers in kernel and Mesa.

Idiot here, AMA.

Attached: 20190506_114405-picsay.jpg (1536x2048, 1.04M)

how do you deal with the noise?

Idiot and genius here, AMA.

Idk how bad it is yet, haven't installed since I'm going to put it in a new system and my parts get here tomorrow. My v64 reference was louder than hell at full tilt, but not too bad at 40% or so. I've heard this one isn't nearly as bad, so we'll see.

Attached: 20190330_143816-picsay.jpg (2048x1536, 849K)

Shitty marketing and not paying the electronic jew is what keeps amd behind nigvidia

This right here is the loudest fucking card I've ever owned. Great performer, but you absolutely must waterloop or Morpheus II it if you want to OC at all on the reference design. I just undertune it and keep it quiet, and it still beats the 1660ti handily.

Attached: 20190414_130804-picsay.jpg (2048x1536, 584K)

>vega56+morpheus2 or waterloop
>absolutely beats 1660ti
Does it, stupid? Next you'll tell us a 1080 beats a 1660ti as well?

It's a 64 on a reference blower undertuned to keep fans at 28%, chill out dawg.

I own the 1660ti too and was merely saying that when noise is normalized the v64 beats it and I paid the same price for both of them. 1660ti off newegg when it launched and the v64 used off of ebay, both $310 or so. I like them both.

>I own the 1660ti too and was merely saying that when noise is normalized the v64 beats it and I paid the same price for both of them.
Oh same price, that changes things, though I question how that is possible.

MSi Gaming X 1660ti new vs. V64 reference used.

I haven't modded it yet, but I'm one of the people that always says to never buy a vega on a reference design unless you're doing a loop because of the noise. You can get the strix or nitro+ cards on eBay for around $350ish so long as it's not the weekend when the bidding closes.

The 1660ti is definitely one of my favorite cards of all time as well though and it can in fact play 1440p with lowered settings smoothly. It's very flexible and extremely efficient.

>This right here is loudest fucking card I've ever owned.
Really shows how we've advanced. I remember thinking my 680 was quiet in comparison to a 480.

I agree and I think fan tech in general has advanced a bunch. Even case fans are much more efficient and air coolers are basically on par with AIOs now.

I think it's trying to communicate.

What keeps AMD behind Nvidia is awful architecture and inability to revolutionise where Nvidia is complacent. AMD was the first to put an 8GB card on their high end card. Nvidia at the time was running 3gb on theirs and it gave a reason to purchase AMD over Nvidia despite the Nvidia card being faster at the popular resolution because the AMD card could run higher resolutions and be underclocked if you wanted to save power and get similar performance.
What AMD SHOULD be aiming for with the current generation of card is to reach the 1080/2070 performance bracket while undercutting Nvidia and giving 12-16gb of GDDR6 for not much less than the 2070. That would give people wanting to run a 4k monitor a reason to buy AMD over Nvidia and not have to buy a 1080ti/2080ti to achieve it comfortably. Might also make Nvidia stop being memory jews too.

>AMD was the first to put an 8GB card on their high end card.
Also the first with 10-bit color support. 5k series iirc?

I recently put my V64 on a morpheus 2 with two noctuas.
1700mhz core, 1140mhz hbm, 55C and using 190W while completely inaudible. Vega is great, it's just that the reference cooler is dog shit.

Attached: Screenshot_2019-05-05-14-46-15-908_com.amd.link.png (1080x2160, 908K)
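Clocks like those can be set on Linux through amdgpu's OverDrive sysfs interface; a sketch assuming `card0` is the Vega and that OverDrive has been unlocked via the `amdgpu.ppfeaturemask` kernel parameter (the state indices and the 1050 mV figure are illustrative — dump your own card's table first):

```
# dump the current OverDrive table (sclk/mclk states and voltages)
cat /sys/class/drm/card0/device/pp_od_clk_voltage

# as root: set the top sclk state to 1700 MHz @ 1050 mV,
# and the top HBM state to 1140 MHz
echo "s 7 1700 1050" > /sys/class/drm/card0/device/pp_od_clk_voltage
echo "m 3 1140" > /sys/class/drm/card0/device/pp_od_clk_voltage

# commit the new table so it takes effect
echo "c" > /sys/class/drm/card0/device/pp_od_clk_voltage
```

Undervolting the top state like this is how you get numbers like 190 W at 1700 MHz out of a card that guzzles ~300 W at stock voltage.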

use a shit browser, get shit results

The issue is that they cut the heatsink fins down to sink the fans into them, so the actual surface area for heat dissipation is kinda small.
Whoever allowed this should be demoted.

>hurr, why doesn't ayymd cards give me washed-out visuals for moar fps
kys blind faggot

The Radeon VII would sing if they just let Sapphire design the air cooler.
The Sapphire Vega 64 comes out of the box fairly close in performance to the liquid cooled version.

just buy RX489 for $99

Only Nvidia with its tards is against HBM. Nvidia wishes they could afford it, but they can't even give you memory amounts that are powers of two.

Attached: Laughing Whore.jpg (762x900, 161K)

Navi uses GDDR6

>the biggest con is that it's expensive at the moment
Doesn't it also need a shitton of power compared to GDDR?

HBM uses more energy and is more expensive than gddr6 so there is no point in using that.

AMD Radeon RX Vega 56 > AMD Radeon VII > Nvidia RTX 2080

Attached: 666o.jpg (1080x1082, 1.51M)

>as their core was weak
>gcn cores were weak
>can't be that stupid