AAAAAAAAAHAHAHAHAHAHAHAHAHAHAHAHAHAHХAХAХAХA*inhales painfully*ХAХAХAХAХJAJAJAJAJAJAJAJAAAAAAHHHHHХХХХ~!!!!!111

Attached: GIMPWORSE VS FINEWINE.jpg (744x736, 165K)

Other urls found in this thread:

tomshardware.com/reviews/amd-nvidia-driver-updates-performance-tested,5707.html
hardocp.com/news/2018/11/14/evga_2080_ti_xc_catches_fire_in_spectacular_fashion

>AMD ages better

We've known this since the 6850 days.

go nvidia
>buying brand new current gen
go amd
>looking at second hand older gen hardware
AMD ages like fine wine, but they are almost useless while you wait for them to age.

wait how does this work

i thought vegas were more for industrial workloads, not gaming ones

Attached: 1542174962395.jpg (549x735, 78K)

D-Don't worry, drivers will fix it...!

Attached: Nvidia logo.gif (600x338, 3.41M)

Not really that surprising desu; Vega has been severely underutilized by devs for a while now. Dirt 4 really cranks out that FP32. The crux of the matter, however, is that Vega can't handle AA very well: 8x MSAA basically cuts frame rates in half. Whether that's due to dev incompetence or an architectural flaw is another matter.
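
Back-of-the-envelope on that "cuts frame rates in half" claim, in frame-time terms. The 90 fps starting point is a made-up illustrative number; only the halving relationship comes from the post:

```python
# Hypothetical no-AA frame rate; only the "8x MSAA halves it"
# relationship is taken from the post above.
def frametime_ms(fps):
    return 1000.0 / fps

fps_no_aa = 90.0
fps_8x_msaa = fps_no_aa / 2

extra_ms = frametime_ms(fps_8x_msaa) - frametime_ms(fps_no_aa)
print(f"{fps_no_aa:.0f} fps -> {fps_8x_msaa:.0f} fps (+{extra_ms:.1f} ms per frame)")
```

Halving the frame rate doubles the frame time, i.e. under this assumption the MSAA pass costs as much as rendering the entire rest of the frame.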

Hopefully navi gets all these things sorted out.

Attached: Comb23082017023226.jpg (1280x1440, 255K)

AMD doesn't really have a target workload. You just wait until the GPUs are a few years old and buy them second hand, instead of buying Nvidia second hand. Everyone knows this.

>wait how does this work
The usage of goytracing brought quite a lot of PP effects to the compute path, where Radeons literally have no competition.

Attached: Nvidia_logo.webm (600x338, 226K)

>Posting fake benchmarks again

Please, your AYYMD HOUSEFIRES GARBAGE is always last in performance & power efficiency

Attached: 1440.png (500x690, 49K)

NOW, FAST, ENABLE RAPID PACKED MATH AND ASYNC SHADERS TOO! SHOW 'EM, BOYZ!

>ALL ACCORDING TO KEIKAKU

Attached: PAJEET KNEW.jpg (1143x1600, 562K)

>OC3D
>Fake
You need to try way harder than just that, Huang.

>driver updates gimp older nvidiot's cards and finewine pushes fps further on AMD cards
No fucking way man.

This isn't because of any of that.
It's just that the new BFV is compute heavy.

Over time, games always become more and more compute heavy compared to raster heavy. Radeon gives you more compute for the money each generation.
Nvidia always builds their cards with barely sufficient, arguably "balanced", compute relative to raster for the current releases of games only.

>$400 Vega 64 matches the $600 2070
Your point is?
And don't try to say that the 2070 is $500. Those are the failed-yield ones which perform worse, and that isn't the one used there.
Don't try to say that Vega 64 costs more either. The AIB cards do, and they're better than the reference model used there. The reference Vega 64 has been on sale for $400 for a while.
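
Taking the poster's prices at face value, and assuming (as the post argues) roughly equal performance, the premium works out as:

```python
# Prices as claimed in the post above; the "matches the 2070"
# performance assumption is the poster's, not a measurement.
vega64_price = 400
rtx2070_price = 600

premium = rtx2070_price / vega64_price - 1
print(f"The 2070 costs {premium:.0%} more for roughly the same performance")
```

With these numbers the premium is 50%.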

Nvidia's cards have been gimped arch-wise for gaming since Fermi.
AMD played the long game and kept GCN for 7 years.
Basically, AMD cards are much better at DX11/12/Vulkan asynchronous compute (GPU multi-tasking/scheduling) than Nvidia's, due to their compute-oriented (workstation) design.
Also, devs code exclusively for GCN first (Xboner and PS4), so there is that.
Doom and Wolf 2 have a Vega 64 sitting near a 1080 Ti.

>vega 64
>$400
You have to keep your amd shilling on /pcbg/

AMD GPUs are compute heavy. Because Nvidia's new Turing architecture is also compute heavy, AMD's GPUs benefit from it as well. So well, in fact, that if a game makes heavy use of compute (async compute, primitive shaders, etc.) it thrashes Nvidia's Pascal GPUs (excepting the GTX 1080 Ti, but that's a special snowflake).

Most other benchmarks disagree.

Attached: BF5.jpg (2560x1440, 355K)

>dx11
oh boy. what do the DX12 benchmarks look like.
Nvidia is gonna be in tears.

ruh-roh

BRGWAAAAAJAJAJAJAJAJAJHAJAJAJAJAJAJHAJHAJHAJAJHAJHAJHAJHAJAJHAJHAJHAJHAJAJAHJAHJAHAAAAAHHH~!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!11111111111111111ONEONEONEONEONEONE

No it doesn't. They disproved fine wine. There are occasionally one or two games where AMD works better than usual, like Doom, but it still punches below its die size and power use.

Except the 1080 Ti, because the 1080 Ti is actually the same size as Vega and has more compute thanks to Pascal's boost clocks. Vega 10 is like 2x the size of GP104 or whatever the 1080 uses. It probably costs AMD 4x more per Vega die than a 1080 costs Nvidia.
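
A toy Poisson yield model illustrates why a bigger die costs disproportionately more per good chip. The defect density here is an arbitrary guess and the die areas are approximate public figures, so the ratio is a sketch, not AMD's actual cost:

```python
import math

# yield = exp(-D0 * area); D0 is an illustrative guess, not foundry data.
def cost_per_good_die(area_mm2, d0_per_mm2=0.002):
    die_yield = math.exp(-d0_per_mm2 * area_mm2)
    return area_mm2 / die_yield  # relative cost ~ wafer area per good die

gp104_mm2 = 314   # approximate GP104 die area
vega10_mm2 = 486  # approximate Vega 10 die area

ratio = cost_per_good_die(vega10_mm2) / cost_per_good_die(gp104_mm2)
print(f"Vega 10 ~{ratio:.1f}x the per-die cost of GP104")
```

With these numbers the ratio comes out around 2.2x; assuming a higher defect density pushes it further toward the 4x claimed above.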

>power draw meme
in 1000+1000+18

GTX 1080 is as fast as Pooga 64 in that benchmark, you know

Clearly OC3D is fake news: it shows the GTX 1080 slower than Pooga 56, when every other benchmark website shows Pooga 56 is slower than the GTX 1080

This. Also: tomshardware.com/reviews/amd-nvidia-driver-updates-performance-tested,5707.html

>price of compared components doesn't matter
ok, sweetie

>>/vg/

It's been known for a while now. Nvidia will eventually gimp the Pascal cards, unfortunately. Since I have a 1070 Ti, its performance will probably stagnate and underperform when benchmarked against the Vega cards, due to the drivers.

Game devs are now fully optimizing for DX12/Vulkan async compute features? Before, Nvidia had them by the balls and they couldn't do anything that would give an advantage to AMD.

Then they don't understand fine wine. BF5 and Dirt Rally prove this: more and more games are moving towards compute workloads, which AMD GPUs thrive on. AMD has been playing the long game. The trouble is not Nvidia but developers being lazy fat cunts about changing their game engines to use the newer hardware (more threads, making use of shaders, etc.).

> Vega 56 is tearing apart the 1070 Ti
> 580 is on 980 level
Nvijew is at it again.

I am sure they will use every dirty trick in the book (cheating on visuals in ways that are barely perceptible but make a big performance impact, like they have done before) in their next driver update to beat AMD.

So the "not-fake" benchmark shows Vega 56 beating 1070 Ti and being close to 1080?
I'm gonna buy me a Predator Helios 500

>Vega 56 costs $400, non-reference
>GTX 1080 Ti costs $770, non-reference
>non-Ti 1080 is $460 for the cheapest non-reference
>pic related
OH NONONONONONONONONO

Attached: 674858568.jpg (1265x1416, 655K)

Attached: 1080.png (500x1170, 85K)

HHHHHHHHHHHAHAHAHAHAHAHAHAHAHAHAHHAHAHAHAHAHAHHAHAHAHHAHHAHAHHAHHAHAHAHHAHHHHAHAHAHAHAHHA

Attached: 1529525393761.png (904x401, 646K)

that's really quite embarrassing for nvidia. that, and the fact that with goytracing off it's only a ~20fps difference between the 2080 Ti goy edition and the 1080 Ti, but a $600-$700 price difference between the two.
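
In dollars-per-frame terms, using the rough ~20 fps and $600-$700 gaps from the post (neither measured here):

```python
# Gaps as estimated in the post above, not benchmark data.
fps_gap = 20
for price_gap in (600, 700):
    print(f"${price_gap} premium / {fps_gap} fps = ${price_gap / fps_gap:.0f} per extra frame")
```

So under these assumptions, $30-$35 for every extra frame per second.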

whoever buys RTX is a fucking idiot, and the sole reason people want communism is idiots who blow money on worthless things like that.

>2080-FUCKING-Ti
>Losing to RX 570
Holy. Fucking. Shit.

>steve "amd unboxed" walton
>brb slapping a waterblock on the amd cards and OCing them to the limit while benching against budget-tier nvidia cards

Attached: 1539714671002.jpg (2620x3416, 1.74M)

>2070 barely faster than a 1080
Shamefu dishpray

and don't forget the debacle of the last 2 drivers for nvidia's 20xx series cards and their incompatibilities with BenQ monitors...

hell, i'm about to throw in the towel and return my 2070 for a Vega card... and i haven't had a single ATI card in any of my pcs in over 15 years.

Keep on posting that. I'm sure someone believes your autism.

But Battlefield 5 is shit. Why should I care about this?

Attached: RLTdgzN.png (989x882, 148K)

>They ALL lose to RX 570 when goytracing is on
SEPPUKU TIEM

Because it is relevant to the discussion of the technology being used and to future development.

Why does the 56 run better than the 64?

See , then.

If you like playing garbage shit then sure

His benches are the only ones where AMD consistently beats Intel and Nvidia; how could that not ring a bell for you?

Attached: untitled-22.png (711x668, 25K)

It's not? There's no 64 in that chart.

goddammit best buy why don't you have vega 64 cards?

Of course they'd need a golden-sample Vega 64 and high-end cooling as well for the bench not to seem off.

WHEEEZE

Attached: 1440.png (500x1170, 88K)

hardocp.com/news/2018/11/14/evga_2080_ti_xc_catches_fire_in_spectacular_fashion

AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA

Attached: 3e73Amx.png (642x480, 906K)

Holy fuck the nvidia housefire + no drivers meme came true

>Holy fuck the nvidia housefire + no drivers meme came true

KEK willed it

>nvidia housefire meme came true
It's been true since 4xx, kid.

>hardocp
aint clicking that shit nigga

heh nothin personell kid

www.wired.com/2007/11/case-mod-pc-as
Wired wasn't too bad in 2007.

Silky smooth 29.9fps

Attached: 2160.png (500x1170, 85K)

There's a very good reason why I've deliberately ignored and skipped ALL of nGreedia's GayPoos since after the GODLIKE PERFECTION that was MSi's GTX 285 SuperPipe (the VERY FIRST TwinFrozr; it also had a RED PCB... yes, an nVidia card with a RED PCB, lol), right until the 980 Ti came out. Been buying strictly Radeons (starting with the HD 6850 WindForce from Gigabyte, the BEST HD 6850 ever made) in that 285 ~ 980 Ti window, and never regretted it even once for a second. Fuck nGreedia.

How embarrassing.

A Helios 500 with a Ryzen 2700 and Vega 56 is less expensive than one with a 1070. Cheaper and faster.

Too bad it's not an ultrabook. Will have to wait for Zen 2-based APUs to come out to get muh 8 cores and RX 580/Vega 56-tier integrated graphics, with only SSDs, 16GB of dual-channel RAM, and a 15"~17" IPS screen in a notebook. Cannot fucking wait. So close now.

No, it's an ugly, heavy, gamer-shit piece of plastic sadly. But superb performance, a good-ish keyboard, and a nice albeit low-res screen are enough for me.

>300W TDP
All GPUs on that list are absolute garbage tier, except maybe 1060.

time to buy turing goy

Attached: 1470457029422.jpg (263x383, 41K)

Vega 56 is 185W TDP.

/pcbg/ is the intel shill bunker

>His benches are the only ones where amd consistently beat intel and nvidia

Except his benches almost never show that, you retarded lying faggot.

Anything above 120W is unacceptable.

so pretty much all nvidia cards past the 1050 Ti?

Should I buy a vega 64 for $400? Or keep holding for a 1080 for $400?

Thanks

Attached: 1519102454830.jpg (905x881, 266K)

Maybe. I don't really care about nVidia since they're shit.

The GTX 1080 Ti is already selling out due to not being made anymore. The GTX 1080 will follow suit soon after I would imagine. Prices are only going UP for old Pascal. Crazy times.

A retard-a-you.

so is that a yes?

Attached: concernboris.jpg (693x663, 54K)

NOOOOOOOOOOOOOOOOOOOOOOO MUH GAYTRACING

Attached: 1515405625810.png (675x827, 35K)

HE FELL FOR THE

GAYTRACING
>GAYTRACING
GAYTRACING
>GAYTRACING
GAYTRACING

MEME LMOA

NONONO BROS AHAAHAHAAHHAHAHAAHAAHAHAHAHAHAHAHHAHAHAAAAHAHAAAHAHHAHAHAH

>2070 barely beating a Vega 64

OH NONONONO

ALWAYS BET ON AMD BABY

CORES FOR DAYS

Attached: 892883044883.gif (205x223, 1.1M)

What's the deal with Turing? I never bothered to read any in-depth analysis when the cards launched. I thought they added a bunch of quarter-precision ALUs specifically to handle shit like ray tracing and meme learning without impacting the rest of the GPU as it crunched gaymen graphics.
Why is ray tracing causing such a massive hit to performance? Is it the memory being taxed?

"Today, Battlefield V enabled support for DirectX Raytracing with the latest patch. We tested this update using the RTX 2080 Ti, RTX 2080, and RTX 2070, with shocking results. Even though only reflections are raytraced, the performance hit for DXR is more than 50%."
Congrats to early adopters
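
Translating that ">50% hit" into what it demands of the card (the 60 fps target is an arbitrary example, not from the article):

```python
# A fractional hit of `hit` (0.5 = 50%) means the RT-off frame rate
# must be 1 / (1 - hit) times the target to hold it with RT on.
def rt_off_fps_needed(target_fps, hit):
    return target_fps / (1 - hit)

target = 60
for hit in (0.5, 0.6):
    print(f"{hit:.0%} hit: need {rt_off_fps_needed(target, hit):.0f} fps "
          f"with DXR off to hold {target} fps with it on")
```

So a card that can't already push 120+ fps with DXR off won't hold 60 with it on, under these assumptions.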

you can't overclock non-K

Just look here , record it in your mind, then sit and wait for the 7nm Radeons to come out next year. That is all.

It doesn't, see .

>Whats the deal with Turing?
Literal SCAM, a FAD.
Read ~ ~ ~ . Then consider that 2080 Ti for 1200+ Illuminati tickets. Then look at OP pic, here , here , and here . I hope you're able to count 2+2 in your head.

My Vega 56 @ 112W power limit is about halfway between RX 580 and 1070 performance. And that's the limit, so it runs at even lower power in less demanding games.

Not good price:performance, but good perf:watt.
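
Only the 112 W limit comes from the post; the relative performance scores below are illustrative guesses, just to show the shape of the perf:watt argument:

```python
# Relative performance (RX 580 = 1.00) is assumed, not benchmarked;
# the Vega 56 entry uses the poster's 112 W limit.
cards = {
    "RX 580 @ 185 W":   (1.00, 185),
    "Vega 56 @ 112 W":  (1.15, 112),  # "halfway between RX580 and 1070"
    "GTX 1070 @ 150 W": (1.30, 150),
}
for name, (perf, watts) in cards.items():
    print(f"{name}: {100 * perf / watts:.2f} perf per 100 W")
```

Under these assumptions the power-limited Vega 56 leads on perf per watt even though it trails the 1070 on raw performance.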

Why would you wait to get a worse GPU instead of getting the better GPU now ???

You can find my posts calling this the day that RTX was announced:
RTX is a literal scam to pawn off failed Quadro yields onto gamers and increase their margins with Quadro cards (which are actually good).

You can also find posts where I calculated the true ray tracing performance increase, and as you quote, it's not sufficient.
Nvidia invented a new stat, similar to quoting "millions of unshaded polygons per second" when drawing unshaded polygons isn't the bottleneck... the shading is. But they compared that stat against the SHADED rays per second of other cards. In reality, RTX is about 2.5-3.5x faster at ray tracing than Pascal, which is not sufficient. It's not the 10x or whatever they claimed.

Also worth noting when comparing Pascal to Turing RT performance is that Turing has more CUDA cores.
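
That normalization can be written out. The core counts are the public 1080 Ti / 2080 Ti specs; the 3x ray rate is just an assumed midpoint of the 2.5-3.5x estimate above, not an Nvidia figure:

```python
# Shaded-ray rates are assumed relative units, not Nvidia figures;
# CUDA core counts are the public 1080 Ti and 2080 Ti specs.
pascal_cores, turing_cores = 3584, 4352
pascal_rate, turing_rate = 1.0, 3.0  # assumed relative ray-tracing throughput

raw_speedup = turing_rate / pascal_rate
per_core = (turing_rate / turing_cores) / (pascal_rate / pascal_cores)
print(f"raw speedup: {raw_speedup:.1f}x, per CUDA core: {per_core:.2f}x")
```

Under these assumptions, part of the raw speedup is simply a bigger chip with more cores rather than faster ray tracing per core.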

The saddest part? The cope of RTX buyers trying to defend this. It's too late for them to send their cards back, and all they can do is try to convince others that they weren't retards who let themselves get scammed despite the warnings from others.

>more CUDA cores
.