AMD Radeon VII benchmarks leak ahead of launch

videocardz.com/79870/amd-radeon-vii-benchmarks-leak-ahead-of-launch

Attached: Screenshot_20190129-113704.jpg (1080x1013, 406K)

Other urls found in this thread:

youtube.com/watch?v=ld9WPyP-Sss
youtube.com/watch?v=jhmMVXINUdw
arthur.geneza.com/content/resolution-explained

Attached: Screenshot_20190129-113642.jpg (1080x1481, 431K)

Who cares about radeon 7 ... we're just waiting on navi ...

Attached: 1548011527456.png (802x799, 41K)

I think we're at least 8 months from navi, otherwise I don't think amd would even waste their time with the VII and rx 590

I'm waiting on Arcturus.

AMD is making progress too slowly to catch up with Nvidia right now. They shouldn't even have made the 590 and Radeon VII.

Polaris was alright, and provided a great value. Vega was a flop. Radeon VII is just 7nm Vega. Hopefully Navi will not disappoint.

If it does, Nvidia will release its XTX 3000s at even HIGHER entry prices.

lmao what the fuck are they doing

I'm not mad, but I hate the fact that it's a 7nm card that draws 300 watts.

Wouldn't even be mad if they'd skipped the HBM memory and cut the cost by like $200.

Hbmeme was a mistake desu

tfw have a 1070 Ti. Glad to know I have something better than the absolute best flagship card AMD can even make.

When you stake your entire company's future on HBM and it doesn't work out.

Attached: sued.jpg (646x557, 77K)

We already know the higher core clock + 1TB/s 16GB vRAM will push performance 20-30% higher than a Vega 64, putting it AHEAD of the RTX 2080 in optimized games. Of course power consumption is pretty shitty, but that's the price you pay for pushing the Vega architecture to the limit. Overall not too bad, considering it will cost less than an RTX 2080 and only use ~80W more power.
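
As a rough sanity check on that 1TB/s figure, here's a back-of-envelope sketch (plain Python, assuming the publicly reported Radeon VII memory specs: a 4096-bit HBM2 bus at 2.0 Gbps per pin):

bus_width_bits = 4096      # four HBM2 stacks x 1024 bits each
per_pin_gbps = 2.0         # effective data rate per pin

# total Gb/s across the bus, converted to GB/s
bandwidth_gbs = bus_width_bits * per_pin_gbps / 8
print(f"{bandwidth_gbs:.0f} GB/s")   # 1024 GB/s, i.e. ~1 TB/s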

Also, why is everyone bringing up synthetics when 90% of the time they bear no real-world resemblance to actual FPS?

Don't talk about HBM if you don't understand Vega's architecture and why it needs it.

Why'd you quote every post in the thread?

Can Nvidia even go higher price-wise? Nobody is buying shit and their stock price is tanking right now.

NOOOOOOO AMD BROS
NAVI WILL SURELY NOT BE ANOTHER REBRAND AND NOT DISAPPOINT

Attached: images.png (213x237, 9K)

my 1080ti scores over 29k. evga sc2, haven't overclocked it. quite sad for amd but it's better than nothing i guess. lmaoing at 700 dollars.

>leaked benchmark on an Nvidia-infested game

hmmmm

Attached: ahhhhhhhhhhhhhhhhhhhhhhhhhhhhhh.png (396x408, 165K)

>1TB/s 16GB vRAM
You said the same thing about Fury; ultra-high-speed memory on gaymen cards is useless.

It's inhumane to make fun of amd at this point. Stop it and leave amd alone.

No one is buying the Radeon fucking 7 for gaming; it's just not worth it unless you also do media work on the same PC.

Does the VII support CUDA?

>zen 2 is a confirmed flop
>radeon 7 is worse than a gtx 1080
>navi delayed until next year

I CANT TAKE THIS ANYMORE BROS

Attached: 8797.jpg (600x522, 36K)

>No one is buying radeon fucking 7
Fixed for you. No CUDA - no work.

Who gives a shit about CUDA?

The arch after Arcturus will totally be worth it, I swear.

Attached: 2qv4x2.jpg (491x550, 62K)

Most AMD cards "do," as long as the card is supported by ROCm.

DOA AHAHAHAHAA

This is Nvidia's fault

Attached: 1539344873531.png (1228x1502, 1.07M)

>$0.05 was deposited into your account

Sad.

This. I want a replacement for Polaris, not some ungodly >$800 video card.

a-at least w-we got 140W 4.5GHz Zen 2 to look forward to

Attached: 1536678083431.png (1300x599, 312K)

Do you think AMD will beat Nvidia when they go from 12nm to 7nm?

I don't know, probably not, but I don't want to buy nvidia because their Linux drivers are raw sewage.

It will be the first chip from RTG without Koduri's taint and sabotage.

Vega and Navi are trash.

When will the waiting and suffering end bros?

Attached: 1472569081370.jpg (770x711, 102K)

AMD can't into powerful GPUs. You don't buy AMD for that. You buy AMD for mid-range, where Nvidia always ends up struggling.

Because everyone here thinks all these shitty synth benchmarks mean anything. Take a look at BFV: the Vega 64 is 90% as fast as the RTX 2080. Of course, if you want to play the most half-baked AAA titles coming out this year, which everyone is going to test with 32X FXAA/MSAA enabled at 4K where no AA is needed, it's going to perform badly.

Cope.

This. It's not AMD's fault ... it's the developers' fault ... they're all bought by nvidia ...

Attached: 1548041468164.jpg (653x726, 99K)

Quoting a million posts ought to be a bannable offense.

AMD shills are the absolute WORST.

you have to go back

Don't take my word for it; start this video at 2:00 and you'll see the Vega 64 practically matches the RTX 2070.

youtube.com/watch?v=ld9WPyP-Sss

What a lot of reviewers conveniently "forget" to tell you is that Nvidia's ROPs handle shitty FXAA/MSAA better, yet when AA is disabled, even in half the unoptimized games, you'll see the Vega 64 on par with the RTX 2070 if not better. Of course, handing the edge to AMD by disabling AA, even when the res is fucking 4K, won't get them as many views.

You're slightly less retarded than a tripfag. But don't even try to make AMD look better than the cold, harsh reality of their continual fuck-ups in the graphics card industry, which is only still alive because money has been pouring in from the CPU side. HBM2 is still way too expensive for /v/ and Vega is a pile of shit anyway.

AMD is a joke.

Thank god I bought Nvidia.

AA doesn't matter!

Attached: 63.jpg (1000x991, 416K)

The FFXV benchmark is a widely reported joke. Only scammers use it.

Because of the 4-triangles-per-clock frontend that the engineers are too lazy to fix.

This, but unironically

Just wait for the magic drivers

Clown company.

Attached: 6b0a478c2bf496d059a5b68c221365f8c4fd6c92410136d1dcf8fc18eb68d9ed.jpg (300x300, 28K)

>tested in GimpWorks title FF15

RVII is obviously shit, but it's not 1070 Ti-level shit.

In non-GimpWorks titles it will land somewhere around a 2080.

...

I have a Vega 64 that I got for cheap and it's just fine for everything I do. I can't imagine I would even upgrade to Navi if it was the same price, unless I start running into scenarios this card will have problems with, which I highly doubt, since I only play a couple games.

Attached: 1548721727770.jpg (383x349, 32K)

>RTX 2080 slower than Vega 7

>GTX 1070Ti faster than Vega 7

damn, nvidia cards are all over the place.

AMD's problem isn't the chips, it's the drivers. Vega was already faster in compute tasks than any Nvidia 1000-series card, but still got shit on in gaming.

AA doesn't matter, go back to playing fortnite on your xbox underage faggot

Vega 64 is a housefire. If Navi has lower power consumption at the same performance, I'd switch.

As an Nvidia user myself, I have to half agree with the shitposter. Vega has for the most part been widely underestimated and only half optimized for games outside of DX12 BF1/BFV, DOOM, and some other games you've never heard of. But even with all that, if you take AA out of the equation it's very competitive even against high-end RTX cards.

Whether AA matters depends on the native resolution. It's absolutely essential on a 1080p monitor, but chances are if you're buying a high-end GPU you're pairing it with a 1440p/4K monitor. At 1440p it's debatable depending on the size of the display, but at 4K I honestly see no reason for it unless your display is 32" or bigger, which seems ridiculous for a desktop setup imho.

Whatever the case, Nvidia has taken advantage of all this chaos and used it to justify the $400 "mid-range" RTX 2060 with only 6GB of vRAM, which both sides can agree is a huge ripoff.

Still, that leaves AMD users with the bitter fact that if they want to enable AA in games, it's going to be a huge performance hit: overall 20-40% reduced performance when going from no AA to 8X MSAA or higher.

youtube.com/watch?v=jhmMVXINUdw

3dmark 100 points lower than my 1080Ti GamingX, so not bad.
FF15, however...
I guess I'd want to see the whole ecosystem of gaymes tested.

Who will be dumb enough to buy a $700 housefire?

Attached: ea7.jpg (3840x2160, 688K)

>plebbit

The 7 refers to the IQ of anyone that buys this DOA card.

Attached: 1516306259307.png (213x237, 9K)

I'll buy this card iff it has SR-IOV.

J-just wait for drivers.
I-it will age like wine.

Blame Joe Macri.

Also, HBM is from like 2009, before Lisa.

Wrong. That's GCN.
Believe me, if this thing didn't use HBM it would either be 400 watts and bigger for the same shit performance, or have even less performance than a Vega 64.

>the FFXV benchmark
Why do people use this? Under no circumstance has it ever managed to reflect anything even remotely resembling reality.

Why do you need anti-aliasing at 4K?

you've never tried 4k

He doesn't; he's just coping.

4k no AA true masterrace here

Attached: 20190120_184405.jpg (4032x3024, 3.72M)

40" 4K display: 110 PPI

32" 4K display: 137 PPI

27" 4K display: 163 PPI

21.5" 4K display: 204 PPI

>"Magazines and fine art prints are viewed from an average distance of 1 foot (30cm). At 1 arcminute, it is 89 microns or about 300 dpi/ppi. This is why magazines are printed at 300 dpi."

arthur.geneza.com/content/resolution-explained

Basically, at 2 feet away from a 4K monitor you want a screen PPI of 150 or greater so that anti-aliasing has nothing left to offer, which makes 21.5-27 inch the ideal sizes for skipping AA.
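
If you want to check those numbers yourself, here's a quick Python sketch (mine, not from the linked article, and the helper names are made up) that reproduces the PPI figures above within a point of rounding, plus the minimum PPI for a given viewing distance under the 1-arcminute acuity rule:

import math

def ppi(width_px, height_px, diagonal_in):
    # pixels along the diagonal divided by the diagonal in inches
    return math.hypot(width_px, height_px) / diagonal_in

def min_ppi(distance_in, arcminutes=1.0):
    # the PPI beyond which individual pixels are unresolvable at this distance
    pitch_in = distance_in * math.tan(math.radians(arcminutes / 60))
    return 1 / pitch_in

for size in (40, 32, 27, 21.5):
    print(f'{size}" 4K: {ppi(3840, 2160, size):.0f} PPI')

print(f"at 1 ft you need {min_ppi(12):.0f} PPI")   # ~287, matching the ~300 dpi print rule
print(f"at 2 ft you need {min_ppi(24):.0f} PPI")   # ~143, hence the ~150 PPI figure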

>Nvidia gpus beat poos in their own games

Neither have you if you need anything more than 2x AA.

Nice card.

It's a monitor with motion and shifting colors. Not a fucking fine art print for archival purposes.

$700 underpowered housefire.

You'd have to be a meticulous nitpicker to ever notice this, lol. Not sure what you're staring at, but 27" 1440p is still fine with AA set to low at ~109 PPI. Going to a 28-32" 4K panel pushes the density up further and largely eliminates the need for it. You're either rich enough to afford screens like this or you're coping.

Right, which is why I said 150 PPI and not 300 PPI. Nobody sits 4 inches away from their monitor.

Right, but what I'm saying is that 4K screens 27" or smaller won't benefit from AA, so you can leave it completely off. 1440p is a trickier resolution: even at a 21.5-inch screen that's ~137 PPI, which is pretty close to 150, so yes, lighter AA methods would suffice; but once you get to a 32-inch 1440p screen you get about two-thirds the PPI (~92), which means heavier 8-16X AA becomes more worthwhile.
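
Plugging 1440p into the same diagonal-pixels-over-inches math:

import math

for size in (21.5, 27, 32):
    print(f'{size}" 1440p: {math.hypot(2560, 1440) / size:.0f} PPI')
# 21.5" -> 137, 27" -> 109, 32" -> 92 (about two-thirds of the 21.5" figure)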

Just buy a 43-inch 4K monitor; how small are y'all's desks?

These look a bit fishy to me; just look at the combined scores, they're way too low. I'll wait for launch and new drivers before I believe anything.

Final Fantasy 15

>Polaris is a stopgap, wait for Vega
>Vega is a stopgap wait for Vega 2
>Vega 2 is a stopgap wait for Navi
>Navi is a stopgap wait for next gen

Attached: 2ec.png (327x316, 208K)

This is why I went for the Nvidia alternative... AMD has barely made any progress on the gaming-performance side of their GPUs. AND NO, 3 OR 4 TITLES WHERE AMD DOMINATES DOESN'T MATTER, IT HAS TO PERFORM THE SAME ON ALL GAMES.

amd is probably faster in more than half the current top 50 most played games at each price point

What has AMD made to beat a 1080ti?

Meanwhile, a fucking R9 290 is still more than capable for 1440p.
Even a fucking 970 is good enough for BFV at 1440p.
>only good settings
Who in their right mind spends >$500 to move the slider up one notch?

>Vega is a stopgap wait for Vega 2
Nobody even expected Vega 2 to be released 6 months ago. Vega was a flop at launch, that's true, but otherwise... still competitive.

>Who in their right mind spends >$500 to move the slider up one notch?

Attached: 1548511359204.jpg (1064x698, 151K)

>Vidya game benchmarks

what about people who don't play at peasant 1440p?

Well, seeing how 4GB was barely enough for 1080p when GTA V released, I wouldn't get anything with less than 8 GB for 4K. That leaves the 2060 out of the question, unfortunately. However, a 290X/390 equivalent with 8 GB of RAM, aka the 480/570 8 GB, provides very good performance for its price, 30FPS or above.
Although on low settings 4GB cards are still OK; I just remembered that when I tested GTA V @4K @low I got a pretty good 27-40FPS in their benchmark. Not to mention that REALLY good games, like UT'99, run in 4K without a single hitch @high @60FPS.
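
For what it's worth, the raw framebuffers are only a tiny slice of that vRAM; here's a rough sketch (my own assumptions: 32-bit color, triple buffering, one 32-bit depth buffer, and ignoring the textures and assets that actually fill 4-8 GB):

def framebuffer_mb(width, height, bytes_per_px=4, buffers=3):
    # total size in MiB of `buffers` render targets at `bytes_per_px` each
    return width * height * bytes_per_px * buffers / 2**20

for name, (w, h) in {"1080p": (1920, 1080), "1440p": (2560, 1440),
                     "4K": (3840, 2160)}.items():
    total = framebuffer_mb(w, h) + framebuffer_mb(w, h, buffers=1)  # + depth
    print(f"{name}: ~{total:.0f} MB")   # 4K comes out to ~127 MB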

It will be the first non-GCN GPU, right?

does it even matter when less than 5% of the market buys high-end?

AMD isn't abandoning the GCN ISA. Navi and its successor are still GCN, just different iterative implementations.

Arcturus will still be GCN? I like AMD but damn, that's suicide.

Why do you think that's bad, user? Surely you're a degree-carrying veteran programmer with intimate knowledge of high- and low-level languages to make such a statement.
Share your expert insight.