Muh 1 fps advantage

>muh 1 fps advantage

Attached: RadeonVII-Gaming-1.jpg (1200x675, 98K)


AMD produce real quality images

>no 2080 ti

Yeah, that was retarded, but they still destroyed Novideo in Vulkan

>4k doesn't matter
>fps doesn't always matter
>visual quality doesn't matter
>vram doesn't matter
>only aliasing memes matter

>no doom or wolf 2 vulkan benchmarks
>literal who game
kek
Can't wait for 3rd party benchmarks.

>2080 ti

Attached: XD.webm (564x398, 852K)

>visual quality
AMDshills are now picking up audiophool memes, lmao.

at least they can do bar graphs right..

instead of the usual bullshit of one bar being 50% larger than the other bar with 2 fps difference.

Just because you're too blind of a faggot to notice the washed-out mess that is DLSS doesn't mean other people can't see the difference. A scene rendered at 4K with no aliasing looks objectively better than a meme-AI render with a buttload of aliasing.

delit nao!

Attached: nvidiot.jpg (796x805, 119K)

Try having a non-shit display.

>says the nvidiafag that cannot tell the difference between (((upscaled))) washed out images and properly rendered images
You're also probably the type of autist that boasts about playing @ 1440p 120fps on a fucking 1080p 60hz monitor.

>OVERSAMPLING BAD

>2080ti
>it costs $1200-$1500
time to crossfire two Radeon VII

*housefire

ICONIC 2018 footage

>implying turing isnt a housefire

Not his fault that Nvidia GPUs can't dither to lower color depths properly.

OK

Turing isn't 300W you dumb ass

Not the other user, but nobody talks about the 2080 Ti needing a triple-slot cooler because the chip has a TDP of around 350W or more under load; a double-slot cooler would set the card on fire.
The power of 14nm, everyone.

novideots on suicide watch XD

but nvidia shills said only amd cards are housefires

>300W Radeon 7 can't even beat 210W 2080
Keep coping though

Not with the triple cooler. You got Goy'd. The 9900K is a 95W TDP chip; TDP can mean anything.

it's still better than 2080, what's your point?

>fake news I pulled out of my ass means AMD is beating Nvidia
How typical of AMDrones

That's Vega 64, and it's only confirmed at 300 watts when overvolted and overclocked, you tardmonkey.

>NUMBERS DOESNT MATTER
New AMDrone taking point everyone! Literally ignoring that Radeon VII burns 550W under load.

No it's not. AMD's own benchmark shows that it's no better than 2080.

>2x8 power connector
>t-totally not 300W guys!!!!!
You red commie teams are getting pathetic now.

Imagine giving a shit about pc gaming in 2k19

Attached: 1547202881978.gif (200x200, 1.33M)

Attached: kek.png (1302x1301, 94K)

>caring about wolf 2 and doom when serious sam vulkan exists

Fucking normalfags I swear

>NUMBERS DOESNT MATTER
just like the 9900k is a 95W part, right? top kek shill

you do know a 6 pin is rated at 75W?

6+8 pin = 225W
8+8 pin = 300W
so it can be anywhere between 226 and 300W

Attached: intlel-95W.png (1112x833, 74K)
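For anyone who wants to check the connector math above, here's a quick sketch. The `aux_capacity` helper is made up for illustration; the wattage ratings are the standard PCIe ones quoted in the thread (6-pin = 75W, 8-pin = 150W; the slot itself can supply another 75W on top, which the 225W/300W figures above don't include):

```python
# Rated capacity of PCIe auxiliary power connectors, per the PCIe spec.
AUX_WATTS = {"6-pin": 75, "8-pin": 150}
PCIE_SLOT_WATTS = 75  # the motherboard slot can deliver up to another 75W

def aux_capacity(*connectors: str) -> int:
    """Total rated capacity of the auxiliary power connectors alone."""
    return sum(AUX_WATTS[c] for c in connectors)

print(aux_capacity("6-pin", "8-pin"))  # 225
print(aux_capacity("8-pin", "8-pin"))  # 300
```

So a card with 8+8 was presumably built for more than the 225W a 6+8 layout could feed it, which is the whole argument being made here.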

literally what? user, I think you need to take a break from the shilling, it's starting to get to you

You're truly retarded if you believe performance numbers presented by a company at a keynote without context or any other information.

I'll wait for the actual reviews before I believe any of this shit.

Cope more, shill
Pooga consumes more than a 2080Ti while having 70% worse perf lmao

Attached: power_peak.png (500x970, 53K)

Yeah, can't wait until AMD garbage gets BTFO once again for delivering 30% less performance than announced in the keynote.

>303W
Holy fuck

Attached: 1305371550291.jpg (450x450, 28K)

t. mad nvidiot

Why was it retarded? Lisa Su said on stage it MATCHED, not that it was ahead. Did you not watch the video? If their tests gave them a 1 FPS advantage, do you expect them to lie about it?

But that's a different card with a different chip and a different node size.

1TB/s of memory bandwidth card can only compete with a 448GB/s GDDR6 card

TOP KEK, AYYMDPOORFAGS

Attached: 1502751985071.png (920x900, 524K)

By the way, what is it with AMD products being so memory starved? They really can't into memory controllers, is that it? Ryzen is starved, GPUs are starved...

old GCN shit arch on GPU side.
Infinity Fabric on CPU side.

>strange brigade
literally who
>vulkan
lmoa

Look you can play Artifact for free

Attached: artifact.jpg (1200x630, 158K)

>its ok when nvidia does it

Attached: hw_4.gif (300x100, 296K)

At least nvidia cards have a decent hardware encoder that can be used for streaming. AMD's encoder is horrible. The encoder on the RTX 2000 series is actually on par with x264 fast.

LEATHERMAN BAD
ASIAN MILF GOOD

Attached: 1497642804318.png (802x799, 49K)

Welcome to the AMD CPU/NVidia GPU year.

At least when Nvidia makes questionable marketing benchmarks they show big improvement.

What's the actual performance of this thing going to be when the marketing benchmarks just show it being equal to a 2080?

These benchmarks are utter bullshit. Reviewers set power target to 150% and the card just sucks up the watts.
I have mine overclocked to 1700MHz core and 1050MHz HBM2 at 115% power target and peak usage is 255W, none of that 300W shit. With more conservative clocks, or stock clocks, it peaks at 220W.

>What's the actual performance of this thing going to be when the marketing benchmarks just show it being equal to a 2080?

That it's 100 bucks cheaper?
That AMD is very good at the "fine wine" approach and will only get better?
That it doesn't have AI hardware running on it you can't access or see what it's doing?

>what is silicon lottery
>That it's 100 bucks cheaper?
2080 is $700
>That AMD is very good at the "fine wine" approach and will only get better?
Nice meme
>That it doesn't have AI hardware running on it you can't access or see what it's doing?
Only autists care about such things

No, it's not silicon lottery. All reviewers say it, they set the power target to 150% in their reviews. You can get the exact same clocks at even 110% power target.
You don't need to draw 300W on a V64 at any time for any reason.

>2080 is $700
newegg.com/Product/ProductList.aspx?Submit=ENE&IsNodeId=1&N=100007709 601321570
According to Newegg out of the 12 best selling cards, only 3 are at or "close" to 700 bucks, the rest are 800+
Though AMD might be in a similar spot with how their partners price their cards, so this is TBD.

>That AMD is very good at the "fine wine" approach and will only get better?
Nice meme
Fine wine is what AMD does, and given how they're committed to the Vega platform and finally have the resources and reason to really capitalize on it, this will come to be. The RTX platform is good, but I can see it being ditched once the raytracing meme dies out, or more likely reimplemented in software instead of Nvidia's proprietary hardware.

>That it doesn't have AI hardware running on it you cant access or see what it's doing?
Only autists care about such things

Still another bad point. Given how security-focused the world has become after the shock of all the hacks, Spectre, etc., it's yet another problem that exists purely because of Nvidia's refusal to open source anything, and their paranoia.

See

>being this assmad

>shoe is on the other foot
Are you learning how fucking annoying it is, oven food?

imagine

>1 fps advantage
>100$ less
well, seems like amd won this generation

explain slowly what am I looking at

Wow. You're fucking bafflingly retarded. Please sterilize yourself.

It's not too far from that for the 2080 Ti either; it's more power-hungry than the 1080 Ti was.

You just don't get the level of detail you're witnessing. That's the new exclusive ray tracing desktop experience; stop talking shit about stuff you just don't understand.