WHERE'S THE FUCKING LEAKS

WHERE'S THE FUCKING LEAKS

Attached: AMD-Radeon-Vega-VII.jpg (800x450, 32K)

Right... HERE! *dabs*

who cares, that card sucks

How will HBM cards age? Will the huge bandwidth help them age well?

>Worlds first 7nm GPU.

This is a fucking laughable thing to market. Why the fuck should the end consumer care about the fab size of the GPU?

I hope that silver cooler is the new AMD stock cooler for all cards, that thing is slick

It's not whether they should, but that they do.
Fanboys will shout numbers at each other 'til they go blue in the face.

Vega is a mediocre failure. Just Wait™ for Navi instead.

I'VE BEEN WAITING SINCE POLARIS

Attached: 1465718085409.jpg (800x484, 295K)

>tfw realized chinks don't leak benchmarks anymore

Waiting for decent cards that are 3-4 years old to dip below $200. Stubborn manufacturers are still in denial that their precious cards aren't in demand since the crypto crash.

AMD sold every single desktop Vega GPU they made for almost a solid year. Vega is used in Apple products, it's used in AMD APUs, and most importantly it's used in high-margin data center Radeon Instinct cards. I'd hardly call it a failure. We just need to accept the fact that AMD no longer prioritizes enthusiast-tier consumer GPUs, and enthusiasts are going to get wallet raped by Nvidia because of it.

>most importantly it's used in high margin data center radeon instinct cards

Stop lie on internet

you can and should learn English on internet

What leaks? It's an overclocked Vega 64. There is no architecture change aside from the additional memory PHYs. The ROP count is still the same; some outlets mistakenly reported it as 128, but it's still just 64.
It performs identically to a highly overclocked water-cooled Vega 64, and AMD already showed its performance at CES.
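The "additional memory PHY" is the one real change, and you can see why it matters from a back-of-envelope bandwidth check. This is just arithmetic on the published specs (Vega 64: two HBM2 stacks, 2048-bit bus at ~1.89 Gbps/pin; Radeon VII: four stacks, 4096-bit at 2.0 Gbps/pin), not a benchmark:

```python
# Peak memory bandwidth from bus width and per-pin data rate:
# bandwidth (GB/s) = (bus width in bits / 8) * data rate (Gbps per pin)
def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

vega64 = bandwidth_gbs(2048, 1.89)      # two HBM2 stacks -> ~484 GB/s
radeon_vii = bandwidth_gbs(4096, 2.0)   # four HBM2 stacks -> 1024 GB/s

print(f"Vega 64:    {vega64:.0f} GB/s")
print(f"Radeon VII: {radeon_vii:.0f} GB/s")
```

So doubling the PHYs roughly doubles peak bandwidth even though the core itself is the same silicon recipe.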

Attached: 1547151426775.jpg (701x861, 119K)

I would but I only start at AMD in a couple months

It IS used in radeon instinct cards, how do you not know this?

you won't see much of a difference at anything below 4K
if this card were out when mainstream 8K was a thing, it would have destroyed pretty much everything

the card is so uninteresting nobody's bothered

Why is AMD so shit with GPUs

No

die size isn't just a number, just like a benchmark score isn't just a number. Why should the end consumer care about core count, bench scores, or fps in certain games when you can just buy the top-of-the-line card, right?

Unfortunately it has come to a point where the only viable option for a decent computer is an AMD processor with an Nvidia GPU, as AMD is shit at making GPUs and Intel is shit at making CPUs.

Being able to run at a buttery smooth 20fps instead of 15, yeah.

What is RX 570

lmao this

>Nvidia GPUs hit 2GHz on 16nm and 12nm process
>AYYMD HOUSEFIRES GPU can only barely hit 1.8GHz on 7nm

It's gonna be a horrible slaughter once Nvidia releases 7nm GPUs

Why don't these niggercunts keep selling older cards but for cheaper? Like you can buy older iPhones officially for cheaper.

it's gonna fail in sales, but it is technically the superior card

>core strong enough to match 2080
>when voltage is tuned and it's overclocked it will be decently faster than the 2080
>insane bandwidth
>HBM has lower signal latency than GDDR6 so the frame render curve will be much tighter than on 2080 and 2080ti

And most importantly of all, due to the different rendering method on AMD cards combined with less aggressive color and texture compression, the visual picture quality, texture sharpness, and color fidelity will be better on this Vega VII than on the RTX cards. This wouldn't mean much if the core were too weak to do 4K ultra at 60fps, but it's apparent from the stock performance that a properly tuned and OC'ed Vega will do 4K ultra at 60fps.
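The "4K ultra 60fps" claim is really a frame-time budget claim: a steady 60fps means every frame finishes inside 1000/60 ms. A minimal sketch of that arithmetic, where the 55fps stock figure and the 10% overclock scaling factor are hypothetical placeholders, not measured numbers:

```python
# A steady target_fps means each frame must render within 1000/target_fps ms.
def frame_budget_ms(target_fps):
    return 1000.0 / target_fps

# Hypothetical linear scaling from an overclock, e.g. 1.10 for +10%.
def fps_after_oc(base_fps, oc_scaling):
    return base_fps * oc_scaling

budget = frame_budget_ms(60)       # ~16.67 ms per frame at 60 fps
oc_fps = fps_after_oc(55, 1.10)    # assumed 55 fps stock, +10% OC -> 60.5 fps

print(f"budget: {budget:.2f} ms/frame, after OC: {oc_fps:.1f} fps")
```

This is also why the frame-time consistency argument matters: an average of 60fps with spikes past the 16.67 ms budget still stutters, while a tight frame-render curve stays inside it.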