AMDRONES BTFO

pcgamer.com/amd-radeon-vii-review/

>Unless you simply must have the fastest AMD graphics card currently available, this looks like a good generation of GPUs to skip

Attached: POO.jpg (1024x576, 46K)

Other urls found in this thread:

twitter.com/RyanSmithAT/status/1093929987835482112
nvidianews.nvidia.com/news/nvidia-ceo-jensen-huang-to-keynote-worlds-premier-ai-conference
old.reddit.com/r/BattlefieldV/comments/aomrsn/nvidia_dlss_coming_to_battlefield_v_next_week_212/
youtube.com/watch?v=gJOYbgR-1Ow
twitter.com/NSFWRedditVideo

...

The current nvidia gen is a default skip too so that’s not saying much

You can literally say the same quote about the rtx2080ti. Are you retarded, op?

I'd buy a 2060 if they didn't give it so little VRAM.

This entire generation of GPUs is shit. Waiting for navi

Funny though how many of these shills weren't so hard on nvidias lineup of overpriced trash

Nvidia loves to jew on vram

not if you want 60fps at 4k ;)

What resolution do you play at? I game at 1440p 165HZ on a 6GB 980Ti and don't run out of memory unless I start super sampling or cranking up the AA to absolute stupid levels. I'm not saying more RAM is bad, but people seem to think RAM = performance.
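The supersampling point is easy to sanity-check with back-of-envelope math: pixel count scales with the square of the supersampling factor, so color buffers alone balloon fast. A minimal sketch, assuming 4 bytes per pixel and triple buffering (real games keep many more render targets, so actual usage is far higher):

```python
def framebuffer_mib(width, height, bytes_per_pixel=4, buffers=3, ss_factor=1.0):
    """Rough VRAM (MiB) for the color buffers at a given resolution.

    ss_factor is the supersampling scale per axis, so 2.0 means 4x the pixels.
    """
    pixels = (width * ss_factor) * (height * ss_factor)
    return pixels * bytes_per_pixel * buffers / 1024 ** 2

# 2560x1440 triple-buffered vs. the same at 2x supersampling
print(round(framebuffer_mib(2560, 1440)))                  # 42
print(round(framebuffer_mib(2560, 1440, ss_factor=2.0)))   # 169
```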

AMD PR here. Please stop BTFOing us, some people here at AMD are literally crying right now.

At least it doesn't die 2 months after you bought it.
And the entire gen is a pass for anyone with a few braincells left anyway.

Sure, but a 980 Ti is heading into its 4th year. It's admirable that a 3-4 year old graphics card can still perform decently at 2560x1440. But a 2060 is pretty much brand new; do you think 6GB of VRAM will still suffice in another 3-4 years?

It's a pretty meager amount for anyone looking towards longevity.

True enough I suppose. Although to be fair I don't consider spending $350 on a GPU to be someone looking towards longevity. That's the kind of money you spend every 2 years. Or hell, even every year. And yea, my 980Ti is the EVGA Classified model. Non-standard PCB, 14-phase VRM, LN2 BIOS, etc. At stock boost it sees 1404MHz. It gives me 75+ FPS in 99% of games, even more when the game is properly optimized. Best card I ever bought. Hopefully it'll last me until the RTX/GTX 3xxx series.

Attached: evga-geforce-gtx-980-ti-classified-acx-20-6gb.jpg (1920x1453, 402K)

It should do fine for 2560x1440 I imagine, if you're looking for 60+ FPS and not 120+. Fuck knows when "RTX 3000" would come out though, it's not like NVIDIA has any real competition at the high-end so they can sit on their asses and properly milk their current cards like they did with Pascal.
>I don't consider spending $350 on a GPU to be someone looking towards longevity
It depends why you're buying the card and what you expect out of it. I don't think everyone buys high-end for longevity and not everyone buys mid-range thinking to change it next year or so.

I guess. I try to avoid buying mid range because it always feels like it's underperforming less than 6 months later. And yea, you're right about high end. I know people irl who buy the latest and greatest every generation because they can. I spent $900 on my Classy 980Ti because I wanted the BEST of the time, alongside knowing it'd last me a long time. My card is still performing right alongside the rest of the "high end" cards now (1070 to 1080 range). I'm only just starting to see a slowdown because of DX12 titles. If DX12 becomes the standard with no option for DX11, I'll have to upgrade.

On linux it's the absolute best you can get with AMDGPU drivers.

Have fun with your nvidia blobs gay boi

Attached: 1548893856234.png (680x680, 76K)

>useless underperforming GPU works great with an equally useless and underperforming operating system
Sounds about right.

I care nothing for the trash talk itself but I'm damn glad that someone out there doesn't want to abuse their customer base by locking out people from producing their own drivers with signed firmware binaries. Fuck nvidia I want to actually own my GPU and its functionality. Just recently ditched my M3000M for a W7170M and it's been spectacular.

If nouveau had a leg to stand on I would applaud nvidia but as it stands there is only one vendor who cares enough to maintain and contribute to real drivers.

That and newer kernels are going to milk the hell out of performance as time goes on because the AMDGPU kernel devs go hard for optimization. Won't beat the high end nvidia but I bet there's a good 10-15% more performance in this thing than it's letting on.

Attached: 1549324540705.png (808x853, 749K)

the problem is that none of that matters. Sorry to be blunt, but the Linux community isn't even on the radar of hardware companies for a reason, they only represent like 2% of the market. This card will flop because its plug-and-play drivers and performance on Windows are lackluster. I'm happy you'll enjoy your card's full potential 2 years from now, but it's the here and now that sells cards.

I WANT AMD to be competitive here. But they keep going on with their shit GCN arch and it lets down again and again.

POO IN LOO

It represents me and my interests.
Just voicing them here and letting anyone here know that this is the best you can get, and it's damn beautiful if you already use or are planning on using Linux.

It's kind of otherworldly to see people calling AMD 'no drivers' around here only to realize that actually only 2% of this board uses linux. They just gotta keep making cards and I'll be happy, the underdog status also makes the last gen sell for 1/2 to 1/3 the price so you can basically have it all without breaking the bank.

It's lacking in GUI solutions but at least every "good knob" you can turn is sane and on by default.

I honestly hope they can find themselves a wider market appeal as Valve cracks open a new market with steam proton and can get that dedicated following willing to pay for their high end.
The only choice nvidia would have to enter such a market and hurt AMD would be to write some FOSS drivers for once in their god damned lives. I will be beaming the day that happens as I can finally enjoy the best of the best and actually own it.

Cheers

Attached: 1549603441951.png (500x749, 161K)

God damn I am glad I got a secondhand 1080Ti. Prices are probably gonna go up even more now that it's known that AMD shat the bed too.

>be rx 560 owner
>shouldve get 570 or higher
>tfw

Attached: 1495632908929.jpg (636x466, 18K)

>"flop"
>sold out everywhere

wrong thread faggot

Attached: 1503868458681.png (213x236, 9K)

To be completely fair, AMD barely made any cards. Probably less than 2,000 of them. Not hard to sell out an item with limited availability. Reviewers, people who buy AMD blindly, etc. I'm still jaded towards AMD after having reserved a day one Fury X only to have it shit the bed just trying to run 2D desktop applications. And then its pump fucking died.

Damn son. Don't ever touch water.

All my shit is air only, the gaymen faggots will say you run hot but you can laugh when their seals pop and flood their boxes.

What chapped my ass even further was that I already had a custom liquid loop at the time. I had removed my GTX 670 from the loop, re-routed the tubing for CPU use only, and was only intending to run the card's stock liquid cooling for as long as it took for EK full-cover blocks to launch. But after the card started giving me artifacts just trying to browse the web, I sold it. Not only sold it, but sold it for nearly double what I paid due to the mining craze. Bought a 980Ti with the proceeds.

That is exactly why I waited for Sapphire's non-X Fury, and for the price to drop down below $350. Too many potential points of failure with a factory liquid cooled card.

Well I figured it was my first major AMD GPU purchase since the Radeon 5850 I had back in the day, so might as well go balls deep. I felt cheated. I should say I'm not completely against AMD. Ryzen was a curb stomper for its price and I have no regrets being a day one 1800X owner. But I really wish they would get their shit together with their GPU division already because this is fucking embarrassing.

and how will nvidia respond to an undervolted Radeon VII?

Attached: ov6ncgcvl5f21.png (658x619, 53K)

lies

DESU I bought one mostly for fanboy reasons. I think it's a really impressive card but the cooler really holds it back. The memory is insanely fast, and 16gb is useful depending on what you're doing.

T

no fucking way lmao.

amd are idiots if they didn't undervolt it by default

>Unless you simply must have the fastest AMD graphics card currently available,
Funnily enough, that is exactly what I desire as part of my upgrade for a 4k setup this year. I am also heavily interested in vega 7's overclocking (and undervolting) properties, as the enthusiast in me enjoys tweaking my system. It's why I've pushed my trusty 290x to its limit (and is a factor why it STILL rocks socks at 1080p).

Yes, the vega 7 is the card for me, so now I'm researching which 4k freesync screen is the right one for me.

>This entire generation of GPUs is shit. Waiting for navi
What are you expecting of Navi, really? It's still GCN; it'll effectively be just a 7nm Vega for mid-range with some really minor optimizations and higher clocks. If they could do any major improvements on GCN, they'd have done so by now.

Until AMD splits its architecture like Nvidia did (Maxwell onwards in the desktop space is tuned for gaymen whereas the server chips are still designed for compute throughput, hence Volta), the current limitations of GCN will remain. Ironically, Turing is looking a lot like GCN in some ways these days, because compute is becoming more and more relevant for vidya: not only do DX12 and Vulkan allow traditionally CPU-based tasks to be done on the GPU, but the huge pool of compute resources on modern chips is very, very fast and often untapped.

Timespy's bullshittery aside, it's why Turing slaughters Maxwell and Pascal at these tasks.

The Radeon VII samples out there right now, like the initial Vega cards, are bad enough that those stock voltages were actually the only way to hit the stock clocks. Reviewers seem to have gotten duds, like der8auer with his sample where anything past -30 mV crashes on him. Vega got better over time: by the time AIB designs were out, the cards could by and large be massively improved via undervolting and overclocking at the same time to over 1600 MHz, although the HBM2 stayed shitty if the memory was from Hynix rather than Samsung. The main problem with Radeon VII is that AMD might not produce more cards than the initial run, especially now that there is no mining boom to drive production and sales.
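For anyone wanting to try that kind of undervolting on Linux, the AMDGPU driver exposes an overdrive table through sysfs. A rough sketch of driving it from Python: the card0 path, state numbers, and clock/voltage values here are assumptions (the exact table format varies by kernel version and GPU generation), and writing it requires root plus booting with an `amdgpu.ppfeaturemask` that enables overdrive.

```python
# Hypothetical sketch of undervolting a Vega-class card through the AMDGPU
# sysfs overdrive interface. Do not take the numbers as safe values.
PP_TABLE = "/sys/class/drm/card0/device/pp_od_clk_voltage"

def od_commands(sclk_state=7, clock_mhz=1600, mv=1000, hbm_state=3, hbm_mhz=1100):
    """Build the write strings: set a core DPM state, a memory state, commit."""
    return [
        f"s {sclk_state} {clock_mhz} {mv}",  # core state: clock (MHz), voltage (mV)
        f"m {hbm_state} {hbm_mhz}",          # memory state: HBM2 clock (MHz)
        "c",                                 # commit the new table
    ]

def apply(cmds, path=PP_TABLE):
    # Each command is a separate write to the sysfs node (root required).
    for cmd in cmds:
        with open(path, "w") as f:
            f.write(cmd)

print(od_commands())  # ['s 7 1600 1000', 'm 3 1100', 'c']
```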

I don't think AMD is going to completely toss GCN out the window; they may keep the ISA largely the same, and the recent patent reveal about the vector ALU and adding back in a form of VLIW instructions may get the gaming performance boost needed while keeping all the compute goodness in. Ideally it would be nice to have a more modular design so AMD can balance compute vs pixel processing, but it's clear they are going to continue with the one-size-fits-all approach for a while.

My R9 290 4gb will just have to keep going overclocked to fuck. Everything on the market is bullshit.

I can't believe I paid $205 for this card in 2014, and spending $300 on an AMD or $400 CAD on an Nvidia card right now gets me something about the same as what I have already, +5fps.

My 4gb of vram is full on every new game now, but with a couple of settings reduced it gets me 75fps at 1440p just fine.

Radeon Instinct literally runs linux and linux only

That's why the Radeon VII launch on linux had zero hiccups and ran flawlessly

Hawaii was the last of the V8's: a well-balanced chip that Just Werks™ the more you clock it. My personal 290x is a super leaky golden sample that will take 1200mhz core (which can be air cooled) as long as I feed it nearly 1.4v. Said 290x to this day competes against a 480/580 until I run out of vram.

Depends, 2060 is pretty good value especially if you're coming from something like a 980 or R9 290/390

There's still hope for this card.
twitter.com/RyanSmithAT/status/1093929987835482112

>this looks like a good generation of GPUs to skip

And that includes Turing.

>2060 is pretty good value
Good value relatively - for almost three years of waiting, this 5-10% more fps per dollar is a disappointment, especially considering the cucked VRAM.
>980 or R9 290/390
These cards are in the 1060 range. I wouldn't upgrade to something that cannot even beat a 1080 and is close to getting VRAM-cucked already. If you made it into 2019 with them, you can also wait until 2020 with an actual jump through 7nm.
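That "5-10% more fps per dollar" figure is just a ratio comparison, sketched below. The fps and price numbers are made up for illustration, not taken from the attached chart:

```python
def fps_per_dollar(avg_fps, price_usd):
    return avg_fps / price_usd

def relative_value(new, old):
    """How much more fps/$ the new card gives over the old, as a fraction."""
    return fps_per_dollar(*new) / fps_per_dollar(*old) - 1

# hypothetical (avg_fps, price) pairs, e.g. a 2060 vs. a 1060-class card
gain = relative_value((90, 349), (60, 249))
print(f"{gain:.1%}")  # 7.0% -- squarely in the disappointing 5-10% band
```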

Attached: performance-per-dollar_2560-1440.png (500x1010, 52K)

just buy 1070 Ti, it has 8GB

Then pay for a 2080 you poor nigger

The 2060 is the best value of Turing, but that's not saying much. If you don't need a better gpu just skip this Gen. Vega is and always was bad. The real questions are: Will amd navi be extremely good and will it launch anytime soon? Because if not you can actually just wait for nvidias 7nm cards.
t. 290 owner

I'm not buying a top range GPU until the memecoin phase dies a little.

It's been dead for half a year now.

If you're that stingy, buy a lightly used one.

Why the actual FUCK did they lower the core count. Are they stupid?
What's the point of doubling the memory if you're going to lower the core count?
Nvidia is outperforming you now with 8GB GDDR6 so clearly it's not that important.
>inb4 muh enterprise
Fuck off, this is a gaming product, it should be optimized for gaming.

Had AMD managed to cram ~5000 cores in there, they would most likely have competed with the 2080 Ti.
They could, for once, have had a top tier product. That would actually have been better than Nvidia's because of the superior memory speed and capacity, making it more future proof.
But no, they had to settle for 3840 cores.
Fucking poojeet retards I swear. Yes I am this mad.

Attached: cat_freak_out.gif (300x300, 685K)

probably power usage

I'd just stay at RX560

You are stupid and should stop posting.

>What's the point of doubling the memory if you're going to lower the core count?
A combination of these being salvaged MI50 chips and the fact that Vega needs well over 500 GB/s of bandwidth to be effectively fed - it's why, clock for clock, V56 and V64 are virtually identical: the pipeline is stalling.
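The bandwidth figure is straightforward to derive from the memory config: peak bandwidth is bus width times per-pin data rate. A quick check using the commonly quoted HBM2 numbers (Vega 64 at ~1.89 Gbps effective, Radeon VII at 2.0 Gbps):

```python
def peak_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s: (bus bits / 8) bytes per transfer
    times the per-pin data rate in Gbit/s."""
    return bus_width_bits / 8 * data_rate_gbps

print(peak_bandwidth_gbs(2048, 1.89))  # Vega 64:    ~484 GB/s
print(peak_bandwidth_gbs(4096, 2.0))   # Radeon VII: 1024.0 GB/s
```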

>Nvidia is outperforming you now with 8GB GDDR6 so clearly it's not that important.
Who knew that a totally different architecture with totally different strengths and weaknesses would perform differently with different memory!? Truly you are the greatest mind of your generation, user.

>Had AMD managed to cram ~5000 cores in there,
The card would not perform as well as you think it would - those 64 ROPs would fucking choke even more than they already do. That's assuming, of course, you can even build such a huge fucking die.

>Yes I am this mad.
Mad AND retarded user.

Attached: 1546116684568.jpg (480x372, 14K)

How old is yours? I mean, 560 and 570 is almost the same price for a year...

>you can just wait for an eternity and get better hardware

WHOA

>Had AMD managed to cram ~5000
IIRC GCN shits the bed after 4096 cores.

AMD has sucked for ages. I literally don't know a single person who owns an AMD graphics card anymore or anyone considering getting one either. AMD are a fucking joke. Their GPU don't even exist in the mind of the average consumer from what I can see.

Ryzen is good though and I'll give them credit where it's due for that. My next upgrade will most likely be ryzen in 2-3 years unless Intel has a revolution of their own.

>Produce 2 000 Radeon VII in a world with over 7 000 000 000 people
>"Wow, it's sold out!"

nvidianews.nvidia.com/news/nvidia-ceo-jensen-huang-to-keynote-worlds-premier-ai-conference

THANK YOU BASED NVIDIA

>over 100W TDP
It's ok if people need muh power. But I'd just clock it to lower performance anyways.

The vega 7 committed the most heinous of crimes: not being Nvidia. As we all know, if it isn't approved by the glorious corporate overlords, Jow Forums hates it with a passion.

This post was sponsored by Corsair™ with their new RGB AIO coolers for all your cpu cooling needs.

Nah, some games take more than 6.5GB of VRAM at 1920x1080.

Then they should go back to what they did with the 295X2.
That worked.

Crossfire (and sli) are dead meme technology.

i'd buy it just to support AMD, i don't want nvidia to have a monopoly
i already have an rx 470 and it's a very good card

>it's a very good card
it's really not my dude

how so?

Waiting for navi in q4 on my $80 rx570 :^)

It's old

old.reddit.com/r/BattlefieldV/comments/aomrsn/nvidia_dlss_coming_to_battlefield_v_next_week_212/

IT'S OGRE

NVIDIA GETS 50% PERFORMANCE INCREASE IN BFV AND POODEON VII GETS UTTERLY DESTROYED IN BENCHMARKS

commie tier reasoning
just buy the best product you can.

for its time it was a very good card, i bought it near launch

Considering they released the 10 series cards in 2016 what else are you going to upgrade to? Sorry the price is higher but a nigga gotta eat

Well first of all right now is the worst time in all of history to buy a GPU. The 2080ti is many times over a worse value than the 1080ti was at launch.
I bought a vii even though I have a gsync monitor.
Buying a nvidia card right now is the same as buying an iPhone, an unrootable android device, or an impossible to repair computer.
I'll take it in the ass before I spend $800 on a mid range GPU with half the die wasted on shit I don't care about with locked power limits and voltage. Same reason I sold my 1070 and bought a 980ti.

Attached: 801e7320b018271780c43808a472e990.png (569x495, 523K)

lmao who the fuck uses antialiasing?
that would bring me further away from the 240hz refresh rate of my monitor

>card downscales and upscales back
>probably will look just as shitty as FF

>AMD BTFO

Alright lads I'm on an old GTX670 and i5 2500k, can't really hold out much longer, I can feel old yeller is about to bark its last. So I'm thinking of just getting a new PC now (with an AMD CPU, fuck Intel's current trainwreck), throwing in a 2060 or a 1160 depending on how it turns out, and then just eventually upgrading if Navi or Nvidia's next gen don't ask $800-1000+ for a mid-high range GPU and are worth the upgrade.

Good idea?

Attached: 1564578.jpg (229x250, 8K)

Wait for Ryzen 3000 Series.

Where do you live?

A 470 is basically a 570, which is still the best bang-for-your-buck graphics card. All graphics cards released after 2016 are $350+, so it's still part of the current gpu lineup. It can handle all AAA titles at 60fps 1080p and is pretty cheap too.

A 2060 seems reasonable and a new cpu a few months later wouldn't hurt either.

No they dont. They may allocate it but its not required

>100s of reviewers
>none of them confirming whether SR-IOV is disabled or not

Anything beyond "YOLO CRANK DA SLIDERZ 2 DA MAX!!!!111ONE" is beyond 99% of reviewers.

Your use case will vary, and if you're just going to play games at 1080p even though you spent a boatload of money on a GPU instead of an Xbox or PlayStation - which would be fine for neet man-children - then it may not be for you.

To me it's really attractive.

16 GB RAM
6
G
B

R
A
M

Problem with GPU RAM is that you can't just put another stick on it if you find that you need more.

Attached: radeontop.jpg (1879x1660, 363K)
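The radeontop screenshot's numbers can be roughly reproduced from the amdgpu sysfs counters (radeontop itself reads via DRM ioctls). The card0 path is an assumption, and note these counters report *allocated* VRAM, not VRAM a game strictly needs:

```python
import os

def read_vram_bytes(card="card0"):
    """Return (used, total) VRAM in bytes from the amdgpu sysfs counters."""
    base = f"/sys/class/drm/{card}/device"
    def read_int(name):
        with open(os.path.join(base, name)) as f:
            return int(f.read().strip())
    return read_int("mem_info_vram_used"), read_int("mem_info_vram_total")

def to_gib(n_bytes):
    return n_bytes / 1024 ** 3

# usage: used, total = read_vram_bytes(); print(f"{to_gib(used):.1f}/{to_gib(total):.1f} GiB")
print(to_gib(16 * 1024 ** 3))  # a 16 GiB card: 16.0
```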

Multi-GPU with DX12/Vulkan is a thing

not really, it's so hard to implement that common game engines don't support it.

in theory it's really cool, though, because Vulkan multi-GPU doesn't require the cards to be the same model, or even the same brand - if you have any two GPUs you're golden. Of course, it doesn't matter until one or more of the common game engines implements it.

The reviewer is an idiot. It is an Instinct/FirePro reject. Of course it isn't going to be much faster than Vega64. It is meant for GPGPU hobbyists.

Why does AMD brand it as a gayming card then?
NIGGER KYS

Attached: Untitled.jpg (1105x738, 105K)

Because tech channels will give any new gaming card free advertising.

Because it is sold to the consumer market and "Gaming" has been a mandatory word to include for products in that segment for a while now.
youtube.com/watch?v=gJOYbgR-1Ow
Here's AMD stance on it. It's literally targeted at people in the market for a compute card that also games fairly well.

I disagree. I changed from two 1080s to a 2080 Ti and I'm very happy. I'm selling both former cards now, so I effectively paid less than half for the 2080 Ti - it's a good deal.
260W average vs 360W, more frames, quieter.

I would, user, but that depends on how long it takes; if we really get them mid-2019, I will.
Sweden, but it makes little difference I think; everything costs 50% more, but the price ratio between cards stays similar.
>new CPU
Well, if I bought a current gen Ryzen I'd be set for quite a while, no need to buy a new one a few months later, but I'll only do that if my current one breathes its last before the 3000-series.

Personally I want it as primarily a vidya card with some compute related tasks on the side. Honestly it looks to be the sort of card that if you are an enthusiast (remember: enthusiast doesn't mean having the deepest pockets, it means willing to fuck about with things in this context) it will greatly reward you, much like the older vega cards do. While AIB versions are probably never going to happen I bet if you slap some 3rd party cooling on vega 7 you can get it to do interesting things.

Who knows, it might even be a card worth watercooling given how GCN responds to voltage (personally, any architecture that gives no fucks about voltage scaling isn't worth watercooling - which puts me off Nvidia's lineup since Maxwell).

I wonder how this card will perform a month from now, when the drivers are not fucking broken and third-party software actually works with it, a la Afterburner.

Oh no, I'm not invoking the "fine wine" meme, or whatever it's called.
Just legit wondering, since the drivers are so fucking broken that every reviewer had some sort of issue with it.

The main problem is that they pushed this card out while still dealing with difficulties from releasing Adrenalin 2019, so the drivers from AMD are pretty shit right now. Things seem to work on a basic level, but a lot of reviewers are confused because they didn't tinker with Vega enough to know what you could do with undervolting and overclocking past the initial drivers. So you saw many reviewers unable to deal with it, while some random youtube reviewer was able to crank the core to 1950, undervolt, crank the HBM to 1200, and get competitive 2080 performance instead of trailing behind by 14% like most people are saying. Although if this batch of Radeon VIIs is bad enough to need those voltages to hit stock clocks, and AMD really isn't going to produce more than the initial 5000, then it might be true that it won't improve.

>when they are still dealing with difficulties from releasing Adrenalin 2019 so the drivers from AMD are pretty shit right now
Oh, I had no idea that was the case, since I own nvidia myself.
Thank you for the information, user.

The drivers tend to mellow out and become stable again by around March-April, which is when I usually upgrade, although I want to wait until May-June this time given the Radeon VII difficulties - but that would mean I miss the DMC5 drivers, so I might have to pull the trigger come March. But yeah, I wish Nvidia modernized their drivers like AMD does. AMD's yearly big driver improvements are welcome and nice once stable, although I wish they did less of that and more fundamental driver work, like fixing their dogshit OpenGL implementation to match their Linux driver and exposing the hidden multithreaded DX11 option, but that's probably off the table as AMD continues to push low-level APIs.