AYYYYYYYMD

AYYYYYYYYYYYYYYYYYYYYYYYY

Attached: jYwtoVZKkqiY6Smy.jpg (500x410, 61K)

>buying bioware games after DA2
yikes

>runs worse than any other Frostbite title
Bioware does it again.

>no 290X
eh.

>290€ 24fps
>380€ 30fps
>400€ 33fps
>500€ 38fps
>390€ 39fps
>550€ 46fps
>750€ 57fps
>1300€ 72fps

I sure hate the future of pc gaming.

>A meme resolution for meme people

Where's Radeon VII? That's the current flagship model.

Buying/playing bloatware game

>GTX 1070 30 fps
>Vega 56 not far ahead
>Vega 64 38 fps
>no GTX 1080 in the mix, but given how close it sits to the 1070, it's fair to assume it lands in the same ballpark based on these numbers

So basically it runs like shit on anything other than the latest overpriced RTX hunks of shit. Oh and look, it's an Nvidia-sponsored title, who could have guessed it?

inb4 they turn RTX on and all those cards drop to 20-30fps

not released yet retard

Did you know the 590 is close to the 1060? This chart adds nothing surprising.

>4K isn't a meme

>buy 4k card with power and cost dedicated to RTX cores
>have to run at 1080p to use RTX

580 is close to the 1060, 590 is around 10% more powerful.

now post the price lmaoo

It's a bit disingenuous not to show the performance of undervolted and overclocked Vega 56/64, since it would be totally irrational to buy one of those cards and not hand-tune its settings.

>4k gaming
oh, I was almost worried graphics in games are becoming better and more demanding

>Recently bought a 1070
Should I sell it for a 2060? I can probably make my 300€ back.

So another Gimpworks game? What specific nVidia bloatware are they using?

Isn't Anthem just a remake of Destiny? I watched the trailers and I completely failed to see the difference.

That was how I felt too. I got a free copy of Destiny 2 and it was ok. Watching Anthem I couldn't get past how much it was like Destiny. The zoning, the traveling, the gunplay, everything.

>an nvidia infested game performs far worse on amd

call me shocked

The game engine is Frostbite 3, which has great performance on all cards. This time, however, I feel there are some extra things in the background that are changing the performance. Most likely nVidia middleware inserted here.

most of the compute AA is gone and now it's back on the graphics pipeline because of the nvidia stuff
frostbite traditionally runs great on amd, but since BFV you can see this isn't the case anymore

It's Destiny with shooting that feels worse and bland art direction.

Attached: 34565654.png (1064x698, 297K)

>GTX 1070
>30 fps

Sasuga. Who the fuck is supposed to play this shit?

Attached: [DeadFish] Aria the Origination - 13 [DVD][480p][AAC].mp4_snapshot_23.53_[2016.01.15_03.13.16].png (848x480, 824K)

Then wait a week or two until it's released, and then run your stupid benchmarks on your childish vidya gaymes.
comparing Vega to Nvidia's newest GPUs is pointless
Kill yourself, mongoloid, childbaby

At 4k? nobody. 4k gaming is a pipe dream.

Not so bad, looks like I'd get ~80FPS with my card, assuming that's all a GPU bottleneck which it does seem to be. I was honestly expecting worse from an AAA game, Ubishit's latest AC or whatever the fuck it was runs way worse and doesn't seem to really look any better. This runs better at 4K than that shit does at 1440p.

They're showing stock settings, you can tweak/OC pretty much any card to get better performance.

Did you actually buy a 1070 and expect to get 60FPS at 4K? You wouldn't even get that in 4 year old games like Witcher 3 which was around when the 1070 launched, how the fuck do you expect it to max out a 2019 game at 4K 60FPS? 1070 does fine at 1080p, it's nearly 3 years old.

Attached: rnYgzX8MEkTNCMTm.jpg (500x410, 67K)

Frostbite, and BioWare in particular, were among the first to use Mantle (AMD's precursor to Vulkan).

How they have fallen to NGreedia.

I think they intentionally make their games run shitty so people will upgrade their hardware.

>1070 does fine at 1080p
works on my 1440p 60Hz quite alright on High (since 'Ultra' is a scam)

1360x768 Master Race

Many of the assets are reused from Andromeda and there is some obvious "inspiration" taken from warframe.

>resolutions over 480p
What are you a gAyMD user? Just buy intel and get your 700 frames.

Why the hell anyone plays at 4K is a wonder to me when the GPU requirements still aren't met for most budgets, and the monitors available lack high refresh rates or low pixel response times even at far higher prices, because the controllers just aren't around yet. Even Quad HD doesn't have these problems.

>'Ultra' is a scam
Biggest cope.

>J-JUST WAIT

Huh... 2060 is actually the best FPS/$ in that list.

>They're showing stock settings, you can tweak/OC pretty much any card to get better performance.
Yeah, except that Vega proportionally benefits more from tweaking than the RTX 2060 will.

The nvidia bloatware called AMD's shit drivers with high CPU overhead

GIMPWORKS TITLE KYS

GIMPWORKS TITLE KYS

GIMPWORKS TITLE KYS

Anthem is dead on arrival anyway, just like Battlefield V, because of gimpworks causing poor performance

kys manbaby. It's AMD's single threaded DX11 drivers rearing their ugly head again. Their drivers are STILL shit for anything other than DX12 and Vulkan. DX11 and below have high overhead and run like shit whenever a game gets CPU bound. Their OpenGL implementation is famously just straight up broken and the emulation community has had to put up with that shit for over a decade now. AMD is indefensible and you buy their broken shit because... why? You get what you pay for, and less even.

>Anthem

>it's n-not amd's f-fault! it's the d-devs!

>EA

LOOK AT MY HOMOSEXUAL EA GAMES YES NVIDIA KEK ME MORE

>1399.99USD+ card can barely do 70fps
The fuck?
How the fuck is a card over three times cheaper getting more than half the fps? Something is beyond fucked here.
GPUs have stagnated so much since 2010 it's not funny.
AMD and Nvidia need a swift kick up the arse

1080ti isn't even mentioned. well fuck.

Guess ill stick to 1080p then.

Only if you're an idiot falling for the 4k meme

Comfy 1080p 60FPS is cheap as fuck

The generational difference between the 1000 and 2000 series is too big. The 2060 is not 30% faster than the 1070. Definitely some Nvidia shady practices going on here. They've started gimping the 1000 series.

Just upgrade your 780ti goy.

>Bioware

gpu.userbenchmark.com/

This site has its own independent bench for 3D performance and it gives the most accurate raw-power ranking.

If you see game benchmarks that differ vastly from this ranking it's almost certainly due to anti-consumer practices.

>4k meme
>unfinished game
>no driver optimization

Attached: 1509419439254.gif (320x240, 2.34M)

580 kills the 1060 in most titles

You can tweak a stock 2080 Ti to get an extra 15-20% out of it too; you don't see anyone bitching about that. Testing stock makes the most sense because those are the guaranteed settings every owner of the card gets, plus not all owners will even bother to tweak and/or OC in the first place, so showing only tweaked cards might not accurately represent the performance those people get.

If you have tweaked your card then you should have a rough estimate of the performance increase you get. Like if a stock 2080 Ti gets 72FPS, a tweaked and OC'd 2080 Ti with a higher power BIOS will get like 83-87FPS. It's not going to be perfectly accurate but it should give a general indication. If you have an estimate of the real-world performance boost your Vega gets from your tweaks you can guess what performance will be like on your system based on those stock Vega numbers.
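For reference, a minimal sketch (Python) of that stock-to-tweaked estimate; the 15-20% uplift is just the assumption from the post above, not a measured figure:

# Rough estimate: scale a stock benchmark result by an assumed tuning uplift.
# The uplift percentages are the thread's assumption, not measured data.
def tweaked_fps(stock_fps: float, uplift_pct: float) -> float:
    return stock_fps * (1 + uplift_pct / 100)

stock = 72.0  # stock 2080 Ti result from the chart
for uplift in (15, 20):
    print(f"{uplift}% uplift: ~{tweaked_fps(stock, uplift):.0f} fps")
# 15% -> ~83 fps, 20% -> ~86 fps, in line with the 83-87FPS ballpark above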

V I P
D
E
M
O

I've been gaming in 4K for years; 1080 SLI and now a 1080 Ti handle it just fine. You don't have to play at ultra settings, you know; usually turning down 1 or 2 settings can improve performance by a huge margin with almost no visual impact.

Yeah, the game is finished.
They're just testing the servers.

>building hardware according to spec
>somehow bad
go fuck yourself retard

Who's gonna buy this? Ugly graphics, low fps, seriously. PC needs more optimized games; plenty of game engines are already good enough for 60fps at 1080p.

>Radeon VII 25-30% stronger than Vega 64
>Would still be 10 fps behind a 2080

OH NONONONO

Gimpworks

except on linux

Windows kiddies stay mad; running CEMU on Linux under Wine performs better than on Windows with AMD. Windows is just trash and even AMD knows it.

Ah ok, so this is just reskinned Crysis 2; the V64 should be close to the 2070, not performing worse than the 2060.

>2060 killing ALL of AMD's gpus

kek, SHITMD

Fuck you! I'm waiting on Navi...

Attached: ahhhhhhhhhhhhhhhhhhhhhhhhhhhhhh.png (396x408, 165K)

It's literally around 2070 level because it sucks at compute/shader-heavy stuff
Wish nvidia would stop fucking around with this RTX garbage and give us a high-end GTX 1180 Ti on 7nm Turing
It would btfo the RT core shit and actually be good at raster, with faster core clocks
Chuck in some better, faster memory like 16GB GDDR6, more bandwidth and higher clocks, and you'd have a perfect 4K 100Hz+ card
But no, it's either RTX junk or nothing
Midrange-high end is overpriced shit nobody wants to buy, with gimpworks shoved into the card at the silicon level.
Sick of waiting an infinity for AMD Arcturus, and Navi is barely gonna hit RTX 2070 perf

Why no Navi data? I thought it launched at CES?

Attached: 1548011527456.png (802x799, 41K)

Nothing AMD does is to spec. Bog-standard, vendor-agnostic OpenGL extensions that Nvidia and even fucking Intel implemented are still broken in AMD's drivers, which is why a slew of popular games still break on PCSX2 if you use a Radeon card.

I wonder why this W1zzard guy hasn't benchmarked Resident Evil 2, which is actually released, hmm

They literally said this would be the case in the AMD CES presentation, lol, what's your point? Radeon VII is literally tech that was rushed and forced out by AMD after an already-fired executive trumped it up and said it would be amazing, but they had the product and there wasn't really any reason not to release it, which is why Su was so fucking visibly nervous at CES.

wccftech.com/exclusive-mike-rayfield-amd-retires/

>(((Ultra))) settings

Attached: 1538800763740.jpg (1274x720, 97K)

>just use DLSS goy!

Why are people saying this scales poorly at high resolutions? This looks like it could be a Witcher 3 benchmark

>extra 15-20%
No overclock without heavy mods, a power table and a tweaked BIOS gets this kind of performance. There's no 20% to be gained on any card, what the FUCK are you smoking? 20% clock speed =/= 20% performance.

>spends 3000 on his PC
>can't max games

Attached: 1510175474597.jpg (720x531, 56K)

It really is, until NVIDIA drops the non ray-tracing 1160 and completely removes any reason for the 2060 to exist beyond milking dollars out of idiots.

>290€ 24fps 12.08€/fps
>380€ 30fps 12.67€/fps
>400€ 33fps 12.12€/fps
>500€ 38fps 13.16€/fps
>390€ 39fps 10.00€/fps
>550€ 46fps 11.95€/fps
>750€ 57fps 13.16€/fps
>1300€ 72fps 18.06€/fps

Turns out that apart from the meme flagship card performance value actually scales pretty evenly in graphics cards, unlike with CPUs where you have to double the price for a 30% increase in performance.
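For anyone checking the math, a quick sketch of how that €/fps column is computed, using the prices and framerates quoted above (thread figures, not authoritative data):

# (price in €, fps) pairs as quoted in the post above
cards = [
    (290, 24), (380, 30), (400, 33), (500, 38),
    (390, 39), (550, 46), (750, 57), (1300, 72),
]
for price, fps in cards:
    print(f"{price}€  {fps}fps  {price / fps:.2f}€/fps")
# Everything lands around 10-13€/fps except the 1300€ flagship at ~18.06€/fps.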

>390€ 39fps 10.00€/fps
Too bad this number will shit itself in one or two years when the 6GB of VRAM starts holding the card back.

>spending $1200+ for 60fps
This is your brain on NVIDIA

Ultra is just High detail with optimizations and smart LOD bias disabled. It renders shit that you can't see anyway. It requires a thorough screenshot dissection to see any difference.

>proud 1070cuck
Fuck bros, it hurts so bad to see 4K but only reach 30fps, it's so noticeable

When I get a 4K screen (soon™), the first game I'm going to play on it will be Need for Speed II - a game from 1997 - just because I fucking can.

Is it even going to upscale to 1920x1080, much less 4K?

Due to dgVoodoo it does - I run it at 1080p right now.

Attached: Gavity 1 Ferrari 0.webm (720x405, 2.87M)

damn what a beautiful game, they don't make em like that anymore *sips*
unironically games in the past 10-15 years aren't even worth playing

This right here is the sign of an incapable developer that prioritizes optimizing for consoles and porting up to PC instead of optimizing for PC and porting down to consoles.

BF1 and BFV, if you ignore the raytracing bullshit on the latter title and just work with a standard rasterization context, both run INCREDIBLY well on all platforms, on a variety of GPUs from both vendors.

BF1: gamersnexus.net/game-bench/2652-battlefield-1-graphics-card-benchmark-dx11-vs-dx12, at 4K with DX11 gets ~69.0fps on a GTX 1080. Anthem comparatively gets...46fps. That's a performance delta of 33% on the same hardware.

That's fucking HUGE. BioWare, whatever kind of studio they've become, still doesn't fucking understand that the best way to make a game that runs everywhere as well as it possibly can is to make it for PC, optimize it for PC, then port the medium-high settings build down to console and start tweaking settings until you hit your desired visual/framerate target and lock it in.

It's what DICE does with BF1, BFV and BFI/II. It's what iD did with Doom 2016 and are doing with Doom Eternal. Every studio that does this, ends up with a product that's universally panned as excellent; and well here we are. These faggots are fucking retarded.
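To sanity-check the BF1-vs-Anthem delta quoted above (same GTX 1080 at 4K DX11, fps figures as cited in the post):

bf1_fps = 69.0     # BF1 @ 4K DX11 on a GTX 1080, per the GamersNexus bench linked above
anthem_fps = 46.0  # Anthem on the same card, per the chart
delta = (bf1_fps - anthem_fps) / bf1_fps * 100
print(f"Anthem runs ~{delta:.0f}% slower than BF1 on the same hardware")  # ~33%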

You can meme all you like, but it remains a good game, and its flaws are entirely down to its age rather than inherent poor game design.

DM me, I'll show you a cheap RX 590 with 3 free games

>panned

panned means bad. you can't pan something as good.

Panned means badly received you idiot. And BF1 and BFV got shit on despite good performance.

And DOOM is a joke of a game technically. It runs well because every area is the size of a COD MW2 map. They're fucking tiny shitboxes. I don't even understand why people were impressed by a mediocre-looking game with shit-all going on in the environment. Yeah, no fucking duh it runs well, you have like a dozen AI in your immediate, closed area at a time. What is this? 2001?

iirc nuDOOM has a monster limit per room, around 12 I think?

I have a 2060, and it seems to be worth the money I paid. I only play Fortnite in ultra at 4K, and I get between 35 and 40 fps.

Right, received; not panned. My bad. But you get the idea of where I was going with it.

>CoD MW2 map

It's amazing how badly you missed the fucking point of that reference.