Radeon VII destroys RTX 2080 Ti, NVIDIOTS ON SUICIDE WATCH

Attached: radeon VII.jpg (1920x1080, 186K)

pls delet this kind sri sir pls

Attached: aKCAW5x.jpg (796x805, 119K)

>in one game from 2017 that never got popular

Does anyone truly care, when playing a game, whether it's at 75 fps or 98? Just lower the settings, man.

the only thing that garbage destroys is your electricity bill

Attached: r1367464_19263895.jpg (700x394, 102K)

The raw computational horsepower of Vega 7 is there to compete with the 2080 Ti, but two problems currently exist in Vega:

1.) The ROPs cannot efficiently handle AA methods like FXAA/MSAA. Going to 8X MSAA essentially cuts performance in half in some cases.

2.) FP32 compute utilization and efficiency was supposed to be high on Vega thanks to primitive shaders and full hw async compute. However, ever since the Vega launch devs have been reluctant to use them, because they must be incorporated through the DX12/Vulkan APIs and don't magically "just werk". That makes optimizing thousands of shaders the game devs' job, and everything must be meticulously debugged to find performance drops throughout the entire game.

Overall we COULD see Vega 7 get pretty close to RTX 2080 Ti performance, but it would rely on primitive shaders being turned on through drivers, game devs implementing them through DX12/Vulkan, AND using AA like CMAA/TXAA instead of standard MSAA/FXAA.

For now, using Vega without AA is as good as it gets. It's also possible to get more performance straight from driver updates, but given that AMD already managed 15-20% from 2017 to 2018, there's not much more juice left to squeeze out.

Attached: 5% slower at 1440p.png (1920x1080, 546K)
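For anyone who wants the napkin math behind the "raw horsepower is there" claim, here's a quick sketch using public spec-sheet boost clocks (sustained clocks will be lower, so treat these as ceilings, not real-world numbers):

```python
# Rough FP32 throughput from spec-sheet numbers: 2 FLOPs per shader
# per clock (one FMA). Boost clocks, so sustained throughput is lower.

def fp32_tflops(shaders: int, boost_ghz: float) -> float:
    return 2 * shaders * boost_ghz / 1000

radeon_vii = fp32_tflops(3840, 1.750)  # 60 CUs x 64 SPs, ~1750 MHz boost
rtx_2080ti = fp32_tflops(4352, 1.545)  # reference boost clock

print(f"Radeon VII:  {radeon_vii:.1f} TFLOPS")   # ~13.4
print(f"RTX 2080 Ti: {rtx_2080ti:.1f} TFLOPS")   # ~13.4
```

Basically identical on paper, which is the whole point: the gap in games comes from how that compute gets fed, not from the ALU count.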

Attached: 0a1.jpg (643x820, 80K)

This graph is not relevant

yeah, because it's a generation-old card and not the new one it's actually supposed to compete with

>fortnite -25%
muh gayme

Attached: nividiot kid.jpg (512x800, 218K)

Attached: amdfag.jpg (8624x3896, 2.57M)

wow, amazing!
i can't believe these arbitrary measurements, way to go [company]!
it's totally different than [last year's model], that's for sure.
please, take all my money, you deserve it more than me.
i love [company], i just don't understand how anyone could love [other company] more.
i wish i had children to share this with...

>housefire temps
>WHUIRRRRRRR
>destroyed
ok bro

Attached: 1549831423662.png (1920x1080, 494K)

...

Yup, unfortunately MSAA/FXAA will always weigh Vega down, on top of the inefficient use of FP32 compute. The former can easily be fixed by using better AA methods like CMAA, or no AA at all, but the latter is trickier to fix without game dev support for additional DX12/Vulkan optimizations.

I think someone should make a fair 4K benchmark where AA is disabled on both a 2080 Ti and a Vega 7 to better assess what the FP32 compute deficiency in Vega actually is. Also, like many anons noted here, it doesn't even make sense to use AA on a 4K monitor that surpasses the 150 PPI at 2 feet away limit most people have for discerning individual pixels on a screen.
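If you want to sanity-check that 150 PPI / 2 feet figure, it falls out of the standard ~1 arcminute visual acuity rule. Rough sketch (the 1-arcminute figure is the usual textbook assumption for 20/20 vision, real eyes vary):

```python
import math

def ppi_acuity_limit(distance_in: float, arcmin: float = 1.0) -> float:
    """PPI above which a ~1 arcminute (20/20) eye can no longer
    separate individual pixels at the given viewing distance."""
    pixel_pitch = distance_in * math.tan(math.radians(arcmin / 60))
    return 1 / pixel_pitch

def panel_ppi(w_px: int, h_px: int, diag_in: float) -> float:
    return math.hypot(w_px, h_px) / diag_in

print(f"acuity limit @ 24 in: {ppi_acuity_limit(24):.0f} PPI")      # ~143
print(f"27in 4K panel:        {panel_ppi(3840, 2160, 27):.0f} PPI") # ~163
```

So a 27" 4K panel at two feet does sit just past the 20/20 cutoff, for whatever that's worth.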

>please tune the benchmarks in a way that makes me regret my purchase less

Overall worse than 2080, but funny how it beats it on Battlefield V and Shadow of the Tomb Raider, both of the big ticket RTX/DLSS games.

It's amazing how visible jagged edges and shimmers are, even on a 1440p phone with AA off.

DLSS is disabled in the bench; Vega could be worse

t. Battlefield V playing AMDrone

>he sits 2 feet or more from his 4k monitor
How are you supposed to see every blemish on your waifu's skin?

I didn't buy a Vega 7 and still currently have a GTX 980 in my rig. I'm just saying that such a benchmark would show how bad the FP32 compute inefficiency of Vega 7 really is compared to the 2080 Ti, since BOTH have similar FP32 compute performance.

Going back to Polaris for a bit, AMD cutting the ROPs in half from Hawaii did long-lasting damage to performance when AA is on, even at just 2X MSAA. Vega added the missing ROPs back, but it looks like that was not enough.

I'm talking about 4K, NOT 1440p. Unless you're referring to 32"+ "monitors", you don't need AA for 4K displays unless you put your face right up against them and burn your retinas.
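Quick sketch of what that ROP cut means in peak numbers: pixel fillrate is just ROPs x clock, and MSAA multiplies the samples those ROPs have to blend and resolve, so a ROP-starved card falls off a cliff faster as the sample count rises. Spec-sheet boost clocks again, so ballpark only:

```python
# Peak pixel fillrate = ROPs x clock (Gpixels/s). MSAA multiplies the
# ROP traffic, which is why cutting ROPs hurts most with AA enabled.

def fillrate_gpix(rops: int, boost_ghz: float) -> float:
    return rops * boost_ghz

cards = {
    "R9 290X (Hawaii)": (64, 1.000),
    "RX 480 (Polaris)": (32, 1.266),
    "Vega 64":          (64, 1.546),
    "Radeon VII":       (64, 1.750),
    "RTX 2080 Ti":      (88, 1.545),
}
for name, (rops, clk) in cards.items():
    print(f"{name:18s} {fillrate_gpix(rops, clk):6.1f} Gpix/s")
```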

tl;dr
>vega is borked
>fixing it is hard but doable
>muh DX12/vulkan

>TNshit
lol dumb kid

Why are people obsessed with this abortion of a product?
It's slow, hot, and unavailable.

Go back to the "Wait for Navi" refrain. Something might actually come from that. This is a done deal. You can't buy one and you wouldn't want to. Move on.

Attached: trash.jpg (595x842, 465K)

POO IN LOO

Because it's an interesting GPU architecture that ALMOST made it but just didn't have enough steam in the end. Maybe we appreciate things like that because they remind us of ourselves in a way.

A lot of us ALMOST made it as programmers, admins, security analysts, techs, etc., but we just ran out of gas or something and fell through the cracks in the floor, and now here we are, almost like we're in technological purgatory. But deep inside we know this isn't over until we literally die.

That's how my autistic demotivated ass sees it anyway. Zen was a breath of fresh air when you consider that had Intel NOT released 5 GHz nuclear reactors, they would have bled even more than they already did. AMD did good.

>released 5 GHz nuclear reactors
how ironic considering radeon vii sounds like a fucking airport under load

GCN's inability to scale was known from the very beginning, and AMD literally said it would take a lot of work to fix, implying nothing short of overhauling the architecture would do.

Who cares? AMDRONES and NVIDIOTS are going to be RAPED to DEATH by the summer once Intel release their new, stand alone GPU with 32gb of GDDR... 9 memory and crypto miners are being press ganged into subsidising its r&d costs so it's going to retail at about 3.50.

Screencap this.

So does the 2080/2080 Ti (when the tensor cores are used), you retard. ~300W of thermal dissipation is not easy to handle, especially if you want to do it with relatively little noise output.

>DIRT4
>this image
AMD fanboi here but I kek'd

>CMAA
>better than MSAA
CMAA is post AA, slightly better than FXAA but worse than post-only SMAA. MSAA is spatial and is superior overall to any post method. The only post methods that get near spatial AA have temporal components (SMAA T2x, TAA, TXAA, DLSS), and those often still have an MSAA 2x or 4x component.

How does FXAA weigh Vega or any modern GPU down? Even the 1.3 TFLOPS Xbone can perform FXAA or MLAA on a 4k image in a few ms. Post AA takes less than 1 ms on modern GPUs; its FPS cost is negligible. MSAA on the other hand has a significant effect, though if AMD had anywhere near the compression tech of Nvidia, the raw memory bandwidth advantage from HBM would let it significantly outperform Turing or even Pascal. AMD is still probably behind Maxwell when it comes to compression; hopefully Navi will improve things.

>4k does not need AA
This is ignorant bullshit. As someone that uses both a 5K iMac and a PC with a 4k 27" monitor, I can tell you that AA is still needed. The 5K screen gets close to the point where jaggies are not noticeable most of the time, but they can still be seen in high-contrast scenes. Shading artifacts, especially things like HDR highlights, are still noticeable. At 5K one can probably get away with a good post or temporal method; at 4k a spatial method is needed in the AA pipeline.
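To make the post vs spatial distinction concrete, here's a toy sketch of what a post filter in the FXAA/CMAA family fundamentally does: find luma edges in the finished frame and blend across them. Real FXAA is much smarter about blend direction, but this shows both why post AA is cheap and why it blurs - it only ever sees final pixels and has no sub-pixel coverage info, unlike MSAA which resolves extra geometry samples:

```python
import numpy as np

def toy_post_aa(rgb: np.ndarray, threshold: float = 0.1) -> np.ndarray:
    """Crude FXAA-flavoured post filter: detect luma discontinuities
    and box-blend across them. Operates on the finished frame only,
    which is the fundamental limitation of all post-process AA."""
    luma = rgb @ np.array([0.299, 0.587, 0.114])
    # Edge strength from horizontal/vertical luma gradients.
    gx = np.abs(np.diff(luma, axis=1, prepend=luma[:, :1]))
    gy = np.abs(np.diff(luma, axis=0, prepend=luma[:1, :]))
    edge = (np.maximum(gx, gy) > threshold)[..., None]
    # 3x3 box blur as a stand-in for FXAA's directional blend.
    padded = np.pad(rgb, ((1, 1), (1, 1), (0, 0)), mode="edge")
    blurred = sum(padded[y:y + rgb.shape[0], x:x + rgb.shape[1]]
                  for y in range(3) for x in range(3)) / 9.0
    return np.where(edge, blurred, rgb)
```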

Who cares, did they take SR-IOV out or what?

Attached: 1549733344495.jpg (1280x960, 62K)

Nigger retard kill yourself

Attached: Gorrila.jpg (1294x478, 69K)

I am quite perplexed as to why only 1440p results are being posted on Jow Forums - surely it is logical to also display 4k results, given that the horsepower of the 2080 and Vega 7 makes both acceptable cards to use at 4k.

Attached: hmmm.png (470x454, 11K)

She's got really big boobs for a kid

why would you even use AA

WWWWWWWWWWWWWZZZZZZZZZZZZZZZ
youtube.com/watch?v=BcL5kbLFBdA&feature=youtu.be&t=46

gcn is always a fucking force to reckon with when compute shaders are in play because they can utilise the sram properly

He's probably one of those retarded faggots that believe gold plating on audio cables makes the sound quality better and swear they can tell the difference.

Yeah, where the fuck are the 4k results?

How the fuck can you not use AA? Even at 1440p you can see plenty of shimmering, and modern shader techniques make shit way worse.

NNNOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOO STOP KVETCHING AND BUY RTX YOU STUPID GOY

Attached: 1436734391281.jpg (202x249, 30K)

not at native res

>pls make a benchmark that plays to AMD's strengths instead of how you would actually use the game in reality pls sirs do the needful
Why the fuck would anyone do this? People want to know what performance they actually get; they don't give a fuck if it could perhaps maybe perform much better in some stupidly specific setup they will never actually use to play the games.

>7nm
>16GB HBM2
>trailing last gen (and current gen) 14nm Nvidia cards
How finished is AMD when Nvidia adopts 7nm?

What's up with all the shit AA slowing AMD cards down, man?

Doesn't look like a faggot.
Just a regular american kid
And judging from the amount of fat on him, he's one of the healthier kids.

>request the resolution/settings the majority of people who would buy a card at that price would actually be playing at
>hurdur that's not the performance you would get from it
Try and convince me that people buying a $700 graphics card don't plan to play on a 4k screen, and if you can't, try and explain to me how hitting the pixel density limit of your eye doesn't nullify the need for AA.

>most people don't use AA at 4k
Keep dreaming, AMDrone, maybe you just have shit eyes. In any case:
youtu.be/n9rhtSwAVfI
Have fun, the 4k benchmarks are in there; aside from some pretty good 1% framerates, it doesn't impress anyone at that price.

I'd sure hope most people aren't stupid enough to spend $700 on a card to play at 60Hz.

>$700 graphics card don't plan to play on a 4k screen
People who aren't complete cattle normalfags and who understand that 1440p@120Hz >> 4k@60Hz?
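Funnily enough the two options push almost the same pixels per second, so it genuinely comes down to temporal vs spatial resolution rather than raw load. Quick sketch:

```python
# Pixel throughput is nearly identical between the two options, so the
# tradeoff is smoothness vs spatial detail, not rendering cost.
qhd_120 = 2560 * 1440 * 120   # 1440p at 120 Hz
uhd_60  = 3840 * 2160 * 60    # 4k at 60 Hz

print(f"1440p@120Hz: {qhd_120 / 1e9:.2f} Gpix/s")  # ~0.44
print(f"4k@60Hz:     {uhd_60 / 1e9:.2f} Gpix/s")   # ~0.50
```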

>I have that exact same card
What a chad

Attached: 1543434568623.jpg (324x419, 41K)

Not the guy you were talking to, but you are one blind dude.

Aaaah it's such a flawed card... but I still want one. I don't even play modern games or do anything compute intensive. Am I the AMDrone everyone keeps talking about?

and i thought i was fat as a kid ... most probably shit, utter shit diet :/

Attached: 1548833945764.png (750x780, 77K)

team red? more like team fat

Attached: 1547906253954.png (420x420, 380K)

and you're a pretentious faggot

you mean team fed?

This. Just wanted to note, bigger than mine.

Performance of the jet engine housefire is basically irrelevant.

Sounds like any air-cooled ~300W video card (2080/2080Ti qualify in RTX mode).

my accelero extreme (which is probably cheaper to manufacture than that FE vapor chamber abomination with its 9999999 screws) would like to have a word with you

So finished that they're still raking in more money than in the last ~10 years

Serious question. Which reviewers do you trust? Looking and comparing benchmarks from different sites just shows discrepancies where there shouldn't be any.

>vega 64 has better minimums than 2080ti
nvidiots will defend this

Techpowerup is definitely the best.

Modern games, relative performance across various resolutions, and overall overclocking performance. One of their developers made Afterburner, so they provide the most realistic scenario for overclocking performance.

I also like how they take pictures of the unboxing, various pictures of the card, and of disassembling it.

Stutters don't matter

Wait for better drivers and for more btfo.

>muh 20 fps ray tracing experience

Bro nobody plays Dirt, it's literally a benchmark app like 3D Mark

I play Dirt 4

Reminder that buying AMD contributes to pushing tech forward.

Yeah. There is a difference especially if you have a 144hz monitor. Also it just feels nice to have the best GPU out there.

is there a charity organisation for AMD?
I'd like to contribute the margin of one card, but i don't want to waste the world's resources on actually making the card.
Also I can contribute even more if I know I can write part of it off on tax

>got popular

Go back to twitch, zoomer.

>msaa
How many modern games even use it? Most games opt for blurry garbage nowadays.

So, like what is the point of these ultra high end cards when video games are lagging behind by about ten years?

Race to 90+ fps 4K VR. The biggest problem in VR right now is that textures are too low-res and pop in and out of environments as you move through them, so 16GB of 1TB/s VRAM is a good start.

Because some games are unoptimised pieces of shit that barely run at medium settings on a 5 year old high end card.

But doesn't VR need like double the power, because 2 eyes = 2 different cameras?

Like, if your game runs at 90 fps, does this framerate transfer equally to VR?

>fortnite
xDDDDDDDDDDDDDDDDDDDD

>Going to 8X MSAA essentially cuts performance in half in some cases.
TAA exists.

So finished they'll post record Q3 and Q4 this year.

Blurry shit

Some Kickstarter VR has 3840x2160 at 120Hz

Refresh rate matters more than fps in that regard

Cool. Now do the drivers.

Vertex processing is doubled (or even less with some cleverness), but pixel shading still just depends on the total number of pixels rendered. With foveated rendering it's even less.
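Napkin math for that, assuming Rift CV1/Vive-class panels (1080x1200 per eye) and a 1.4x supersample factor, which is a typical ballpark for what VR runtimes render at to fight lens distortion, not a spec:

```python
# Stereo roughly doubles draw calls / vertex work, but pixel shading
# scales with total shaded pixels. Per-eye targets are supersampled
# (the 1.4 factor is an assumed ballpark, not from any spec sheet).

def shaded_pixels(w: int, h: int, eyes: int = 2, ss: float = 1.4) -> int:
    return int(w * ss) * int(h * ss) * eyes

vr_frame = shaded_pixels(1080, 1200)  # Rift CV1 / Vive class panels
flat_4k  = 3840 * 2160                # single 4K monitor, no SS

print(f"VR @ 90 Hz: {vr_frame:,} px/frame, {vr_frame * 90 / 1e9:.2f} Gpix/s")
print(f"4K @ 60 Hz: {flat_4k:,} px/frame, {flat_4k * 60 / 1e9:.2f} Gpix/s")
print(f"frame budget @ 90 Hz: {1000 / 90:.1f} ms")
```

So per-frame pixel load is actually comparable to flat 4K; the killer is the 11 ms frame budget at 90 Hz and the doubled geometry work.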

The correct choice is FXAA. Best performance-to-looks ratio.

>even blurrier shit
I'd rather have no AA at all; it performs better too.

It's a Navi stopgap compute card. Not really a reliable option for gaming (and it doesn't even exist)

It's not even Navi.

I know lol

>"$499 product sold as $699"
- Guru3D.

What is GCN and why is that bad?

FXAA is the antialiasing version of me taking off my prescription glasses for myopia.