THE DRIVERS WILL FIX THE GAP

Wait for navi right guys?

Attached: cucked.png (917x519, 74K)

just wait for Navi bro

Price difference?

>average of 12-25% maximum performance difference
>average of 200% price difference
I'm pretty happy with those drivers.

The MSRP of the Vega 64 was $499, while the GTX 1080 Ti had an MSRP of $699.
But that doesn't matter because of crypto: the Vega was actually selling for more than the 1080 Ti because it seems to be better at the mining algorithms. No one can find either card at its MSRP

Navi is already confirmed poorfag only

What are you smoking? The avg difference is over 40% for about the same price. 1080 Tis went for $700s, Vega 64s went for $500s. If you want to compare current prices, since mining made prices explode the Vega 64 costs about as much. The Vega is hot trash

Still not buying your GPU, Nvidiot.

Same difference in price*

150 vs 200 fps.
There's no need for this.

Even if there was a clear cut difference I would still use the company without underhanded business tactics, and the one not funding the Israelis

2 cheapest on scan.co.uk

Attached: meh2.png (1202x247, 70K)

>fudzilla
>confirming anything
damn are you retarded

>tfw paid $729 AUD (~550 USD at the time) for a Vega 64 last year
Vega has been pretty good so far and it's nice being free of nvidia's botnet cancer

Attached: aKCAW5x.jpg (600x315, 26K)

>Cherry picked Gimpworks game
ROTT has been one of those games Nvidia shills just love. It used to be The Witcher 3, but AMD BTFO of that one, so ROTT became the next title to lean heavily towards Nvidia. DX12 helps AMD a little but it's still a badly optimized Gimpworks game. Today it's games like the new Final Fantasy that are Nvidia shills' favorite Gimpworks titles to wave as a banner.

>What are you smoking? The avg difference is over 40%
But OP (you?) posted that the difference is only 12%-25%.

>for about the same in price.
Just checked local stores, the Vega 64 is 700€ while the 1080 Ti is 1100€.

dude vega just isn't that good.

They are the same price in the UK so your shop is probably full of retards
>€
oh i'm so sorry

Anyways my local store doesn't even stock Vega because it's pretty much garbage compared to the Nvidia counterparts.

Attached: aHR0cDovL21lZGlhLmJlc3RvZm1pY3JvLmNvbS9PL0MvNzYxNjI4L29yaWdpbmFsL0Zhci1DcnktNS1GUFMtMTkyMHgxMDgwLVVs (711x533, 66K)

I got a Vega 56 for £380 delivered. Flashed the 64 BIOS. Tweaked it and get stock 1080 performance easily.

Attached: aHR0cDovL21lZGlhLmJlc3RvZm1pY3JvLmNvbS9SL0MvNzYxNzM2L29yaWdpbmFsL0Zhci1DcnktNS1GUFMtVWx0cmEtVEFBLVZl (711x533, 59K)

More like it doesn't stock vega because the larger retailers get them since they get sold out in seconds to cryptoniggers.

>Buy Vega 56 for 400€ MSRP
>Watercool it
>Clocks like liquid cooled 64 edition
>Can sell it for more than original price to cryptominers any time

Attached: 218c62b9afde62aad3de264c6e61b537.jpg (1000x1100, 123K)

Where?
When?

Context please

Currently you won't get it for anywhere close to that unless you mean 2nd hand aka buying ex miner cards

also

>i have to jump through hoops just to get the performance of a card released May 27th, 2016

Attached: btfo.png (398x198, 14K)

For the price I paid my Vega 56 is plenty good enough.

Attached: Result_-_2018-04-12_18.33.27.png (980x909, 82K)

tomshardware.com/reviews/far-cry-5-performance-benchmark-ultra,5552-4.html

Sorry thought you meant the source. I got it on launch day. Tough titty if you didn't.

BTW my clock speed is 1480 MHz @ 820 mV. 3DMark always reports the incorrect clock rates.

The benchmarks were performed at stock. They did not 'jump through hoops' to get those results. I do it because I can get an extra boost. Same as Nvidia users can OC their GPU's if they so choose. Stop trying to throw bullshit in peoples way.

Shills love to cherry pick. We can all do that. AMD sponsored game. RX 580 vs GTX 1060

digitalfoundry.net/2018-04-11-far-cry-5-vs-far-cry-2-engine-analysis-a-decade-of-tech-evolution

If it was not for crypto niggers the RX Vega 56 would be a better buy than the GTX 1070 Ti based purely on RRP.

techadvisor.co.uk/test-centre/pc-peripheral/best-graphics-cards-for-gaming-2018-3217721/

'So, which is better? Neither

There’s so much to love, and in some instances “dis-love,” about both Nvidia and AMD graphics. In the end, both of these companies rely on competition with each other to thrive. Suffice to say, the Nvidia vs AMD debate requires that you understand there’s a reason Radeon and GeForce GPUs are so similar in performance right now.

Each company is doing its best to keep up with the mindshare of the other, and that’s good for us. They’re basically fighting for our money, learning from each other’s mistakes and legislating marked improvements along the way.

It’s up to you who wins the fiery contest of Nvidia vs AMD, although we will say this: Nvidia is unmatched in the 4K market right now. If it helps any, the GTX 1080 Ti is your best bet if you want your PC to keep up with the likes of your Ultra HD display. Otherwise, Nvidia and AMD graphics cards are about the same, at least for the time being.'

techradar.com/news/computing-components/graphics-cards/amd-vs-nvidia-who-makes-the-best-graphics-cards-699480

You shills need to fuck off back to /v/

nvidia doesn't have proper drivers on Linux. Only proprietary cancer that runs kernelside.

>Radeon and GeForce GPUs are so similar in performance right now.
This isn't even remotely true. The 1080 Ti fucking destroys the Vega 64 at every level as seen in OP, and the Vega 64's die is notably bigger than the 1080 Ti's. It's not a competition at all. AMD is two generations behind. They have a ~486mm^2 die, while the GTX 1080's GP104 is only ~314mm^2.

user they are not the same cards

Vega 64 was never aimed at the GTX 1080 Ti but at the 1080. It matches that in newer games. Older, less optimized games that rely more on single-core performance and DX11 draw calls (especially Nvidia's Gimpworks and heavy use of tessellation) still suffer on AMD GPUs. Games like Doom, Wolfenstein II, Far Cry 5, Ni No Kuni 2 etc. show where gaming should be heading.

>I got a Vega 56 for £380 delivered. Flashed the 64 BIOS. Tweaked it and get stock 1080 performance easily.
Hi are you brain damaged?

Err...your point being?

BTW
guru3d.com/news-story/asrock-first-phantom-gaming-graphics-cards-will-be-released-on-19th.html

No. But you must be. I am referring to stock GTX 1080s with no OC, of course.

This is a Vega 64 vs 1080 Ti thread

also polaris is a pretty good arch compared to vega.

you tried to put words in my mouth

i was clearly talking about this guys post

nothing to do with what you posted

actually you posted that after i left the post

ABSOLUTELY SEETHING

u wut m8?

Attached: wut.jpg (2560x1440, 469K)

???????????

RX 580 isn't vega

this thread is about vega 64 vs 1080ti

what is hard to understand?

1080 die size: ~314mm^2
Vega 64 die size: ~486mm^2

The Vega die probably costs 2 to 3 times as much to manufacture.
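For a rough sense of why the bigger die costs disproportionately more: yield falls off roughly exponentially with die area, so cost per good die grows faster than area. A back-of-envelope Python sketch, where the Poisson yield model and the defect density are illustrative assumptions, not foundry numbers:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Approximate gross dies per wafer (standard circle-packing estimate)."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def yield_rate(die_area_mm2, defects_per_mm2=0.001):
    """Poisson yield model: probability a die has zero defects."""
    return math.exp(-defects_per_mm2 * die_area_mm2)

def relative_cost(die_area_mm2):
    """Cost per *good* die, up to a constant wafer price."""
    return 1 / (dies_per_wafer(die_area_mm2) * yield_rate(die_area_mm2))

# Vega 10 (~486mm^2) vs GP104 (~314mm^2)
print(relative_cost(486) / relative_cost(314))  # roughly 1.9x at this defect density
```

At 0.001 defects/mm^2 the model gives roughly a 1.9x cost gap; push the assumed defect density higher and the ratio climbs towards the 2-3x guessed at above.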

Nah. We derailed that long ago m8. When you start a thread it always wanders into a X vs Y shitposting competition. Or are you new?

>checks prices
>RX 580 = £359.99
>GTX 1060 = £265.49

Is it really worth £100 more for 5 fps?
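Put in price-per-frame terms (prices from the post above; the 5 fps gap is the poster's own figure, used here just for the arithmetic):

```python
# Worked example: cost of each extra frame at the quoted UK prices
rx580_price, gtx1060_price = 359.99, 265.49  # GBP, from the post above
fps_gap = 5                                  # poster's claimed FPS difference
print((rx580_price - gtx1060_price) / fps_gap)  # ~18.9 GBP per extra frame
```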

Well, all things being equal and crypto niggers not existing, then no. The only reason the GTX 1060 6GB is better value now is because of that very fact. Nvidia got lucky in that regard. AMD got lucky in the sense that they are selling shitloads of GPUs, but marketshare-wise they are fucked for gaming and could face a flood of used GPUs if the crypto market crashes hard enough.

Anyhow, to get back to the OP image, it sure looks like Vega 64 is performing great with those minimums. But Nvidia shills will insist the average is the best e-peen.

>average
Keep going I can shill for both sides all night.

depends really

the minimums are more important imo.

example

GPU1 = 112 FPS avg / 53 FPS 1% low / 37 FPS 0.1% low

GPU2 = 96 FPS avg / 57 FPS 1% low / 44 FPS 0.1% low

I know what i'd choose.
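For anyone unclear on what those 1%/0.1% figures mean: they're typically the average FPS over the slowest 1% (or 0.1%) of frames in a frame-time log. A minimal Python sketch of one common convention (reviewers' exact methods vary, so this is illustrative):

```python
def low_percentile_fps(frame_times_ms, pct):
    """Average FPS over the slowest pct% of frames."""
    worst = sorted(frame_times_ms, reverse=True)  # slowest frames first
    n = max(1, int(len(worst) * pct / 100))       # how many frames count as the 'lows'
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms

# 990 smooth frames at ~120 FPS plus 10 stutter spikes
frames = [8.3] * 990 + [25.0] * 10
print(low_percentile_fps(frames, 1))    # 1% low: 40.0 FPS
print(low_percentile_fps(frames, 100))  # overall average: ~118 FPS
```

This is why a card with a lower average but fatter lows (GPU2 above) can feel smoother in practice.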

Yes but Vega 64 was going squarely up against the GTX 1080. A $500 GPU. Not the $700 1080 Ti. The fact that it is squaring up to the 1080 Ti in minimums is pretty damned good.

Can confirm. Back when Vega64 was launched it was priced comparable to the 1080 but locally in Sweden it's priced like the 1080TI. The RX580 is also screwed, it competes against the 1060 in performance but it's priced like a 1070.

That's the situation here too, Vega64 vs 1080TI.

That's what makes it a bad buy. I do hope the soon-to-be-released Vega refresh ("VEGA12") actually brings cards to the market at MSRP. I use Linux so I don't do novideo cards (I call them that because they have a history of not being able to do the simplest things like playing videos without the closed source binary blob driver).

I don't know.. I was hoping the newly released ETH ASICs would bring GPU prices down, but no. Well, it helped: Vega 64s and RX 580s are in stock now, so there's that. But they are so overpriced they might as well not be.

The main thing better drivers were supposed to do was enable the new uarch functionality, but having the driver apply primitive shaders to everything never panned out, and tile-based rasterization seems half implemented at best. There's a lot in Vega that should help with bottleneck alleviation, but most of it never materialized.

As it stands right now Vega 64 is fucking worthless; however, at around MSRP the Vega 56 is more than worth it, as it's a $400 card that plays ball with a 1080 and, properly undervolted, draws the same or less power while overclocking higher.
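The undervolting claim is at least physically plausible: dynamic power scales roughly as V²·f, so dropping core voltage cuts power quadratically even at the same or higher clock. A toy sketch, where the voltage/clock figures are made up for illustration, not measured Vega values:

```python
def relative_dynamic_power(v, f, v_stock, f_stock):
    """Dynamic power ~ C * V^2 * f, expressed relative to stock."""
    return (v / v_stock) ** 2 * (f / f_stock)

# Hypothetical Vega 56: 1.20 V @ 1.47 GHz stock, undervolted to 0.95 V @ 1.55 GHz
print(relative_dynamic_power(0.95, 1.55, 1.20, 1.47))  # ~0.66, i.e. ~34% less dynamic power
```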

Needless to say it's a bit disappointing and I don't know which side to blame it on the hardware side or the software side. I really wish AMD would be more open with this.

The managerial side is to blame. Raja fucked off because Intel were obviously tempting him with more cash from the get-go. The sabbatical etc was a ruse. Fucking traitorous pajeet.

BTW the magic drivers were never promised, only hinted at. I believe there may be an issue with Vega v1 that meant they just could not get it working, or they were too incompetent as driver developers. That, or they saw that it simply was not worth it considering the crypto niggers were buying all the cards anyhow.

>Fucking traitorous pajeet
>how dare he betray the AMD empire! REEEEEE

I don't even have words for how disgusting you corporate cock suckers are

Honestly all we have to go by here is rumors, and that one claim from the HardOCP guy. If Raja had actually fucked AMD he'd be in court; with how long he's been gone, they could easily go through his managerial decisions and figure out whether him going to Intel while fucking AMD over was the cause.

What I believe happened is AMD hamstrung him: they forced him to work with GCN instead of building something new, or at least to work heavily within the GCN architecture so they wouldn't have to completely rewrite the driver. Unless something else comes out, this sounds like the most likely reason he would move to Intel after AMD, and why AMD has had the same problems since Fiji. In every interview he's said the same thing: he wants to build the next great GPU, but I don't think he was allowed to at AMD.

Now, whether what I said is correct or not, it's hard to judge him based solely on that; none of us are familiar enough with AMD's driver stack and hardware side to really say who was in the right with that decision. If the bottlenecks can be alleviated while keeping GCN, GCN is still the most powerful GPU architecture you can get for general compute, at least per die size, and everything AMD does to advance GPU graphics relies on general compute, so it makes sense why they want to stick with it. But if there is a hard limit to how much the GPU can actually put out, they have to move on. Vega sounded like it was focused on bottleneck alleviation, and we have no real proof they alleviated any bottlenecks at all, but Raja wasn't sued.

He finished up on the project and left. Pretty much he played his role but he smelled the money. Can't blame him really. But it would be nice to see some people have the integrity to see it past launch and have some faith in their work. But money talks and bullshit walks I guess.

He's thinking that from the time he came back until he left he actively sabotaged AMD hardware. Which if that's true I hope he dies, he actively fucked us. But I honestly don't believe this narrative.

No, he sold out. He was pretty much finished on Vega and could have focused on making Navi his own. So why leave? Maybe he just didn't get on with mommy Su.

Management wouldn't allow him to move off GCN, at least that's what I believe; he was not allowed to make his own GPU, so he was stuck trying to fix the problems GCN ran into.

Going to Intel means he has no prior work he has to use or work within, while having access to all Nvidia patents from 2017 back.

I expect the conversation went like this
>Intel
'Hey Raja, wanna take over our GPU department? You'll get full control. No strings attached, and a raise as well as stock options to boot.'
>Raja
'Do I get my own street to shit in?'
'Sure'
'I'm sold!'

The software team should be able to carry the project through launch and beyond without needing the highest-level manager to babysit them. He was VP and chief architect of the entire RTG division; there were probably several layers of management between him and the engineers. At best he'd be acting as a CTO, laying the groundwork for the high-level approach they'd take on Vega, not the details: picking toolchains and the engineers to lead the projects, not doing any actual engineering himself. And as a businessman he did his job well; he can take credit for getting Vega into Intel's new SoCs, a nice little boost for RTG.

>What I believe happened is AMD hamstrung him: they forced him to work with GCN instead of building something new
They're saying brand new architecture in 2020 IIRC. Raja's almost guaranteed to be the mastermind behind it considering he left just months ago. Under 2 years for a new architecture is nonsense; these new guys are just picking up his legacy. Hopefully they don't dumpster it like they did with the primitive shader drivers.

>focus on navi
>focus on an architecture that at this point in time should already be getting test samples back to fix bugs
You brainlet, Navi's already done. You don't redesign it over and over until the day it launches. You design it for the first 18 to 24 months and then start getting real silicon back to find and fix bugs.

Most likely Raja finished overseeing the early stages of another project slated for a minimum of 3 to 4 years down the line, launched Vega, and then left cleanly so his reputation doesn't get dumpstered.

Lol, retard trying to insult.

Damn pajeets.

>GCN bad
Absolute baseless nonsense.
Do also keep in mind GCN is an ISA, not an implementation of it.

>THE DRIVERS WILL FIX THE GAP

this is unironically at least conceptually possible, if not at all likely to actually happen.

Vega was designed with the premise that they could get away with a geometry/FLOPS ratio of less than half that of their previous processors if they could discard more triangles before in-memory descriptors were generated from them.

This stems from console devs figuring out several years ago that they could write compute shaders to eliminate tons of triangles before they were passed to the PS4's/Xbone's actual HW geometry units. Vega's new primitive shaders are basically this early culling optimization put into HW, with the goal (utterly missed) of having the driver generate the right invocations even for existing games using legacy APIs.
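The console trick described above boils down to a cheap per-triangle test run before the fixed-function geometry hardware ever sees the triangle: compute the screen-space signed area, then throw away back-facing and sub-pixel triangles. A toy CPU-side Python sketch of that test (the function name, winding convention, and area threshold are illustrative; real versions run in a compute shader and also cull against the viewport):

```python
def survives_culling(p0, p1, p2, min_area_px=0.5):
    """Keep a triangle only if it's front-facing (CCW) and covers useful screen area.
    p0, p1, p2 are (x, y) screen-space vertex positions."""
    # Twice the signed area; the sign encodes winding order (front vs back facing)
    area2 = ((p1[0] - p0[0]) * (p2[1] - p0[1])
             - (p2[0] - p0[0]) * (p1[1] - p0[1]))
    if area2 <= 0:                   # back-facing or exactly edge-on: cull
        return False
    return area2 / 2 >= min_area_px  # also cull sub-pixel slivers

triangles = [
    ((0, 0), (10, 0), (0, 10)),    # large, front-facing -> kept
    ((0, 0), (0, 10), (10, 0)),    # same triangle wound backwards -> culled
    ((0, 0), (0.5, 0), (0, 0.5)),  # sub-pixel sliver -> culled
]
print([survives_culling(*t) for t in triangles])  # [True, False, False]
```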

As such, they kept the same 4 geometry units for 4 primitive shards of up to 8x8px each and thought this would be more than enough if it could get the same throughput as an equivalent non-discarding GPU with ~12 tri/clock.

If they somehow got this working, Vega would catch up to, if not surpass, the 1080. The fact that this still hasn't happened even to a limited degree, despite being desperately needed, implies that there are some nigh-insurmountable challenges impeding it.

Are you retarded? GCN is an ISA and the architecture. AMD literally refers to GCN as a microarchitecture. It's just a name for both

It means either admitting that Vega has a hardware bug they missed (which they would obviously not like to admit) or that their devs are shit. Either way I doubt they will include it in later versions of Vega, because that would reveal that there is indeed a problem with the first iteration. It's probably why they left it to game devs to work around (perhaps with a non-disclosure agreement included).

The Pro Vega cards have utterly absurd geometry throughput in the handful of apps written with non-public APIs to use the primitive culling, so it's almost certainly not any form of pure silicon bug.

more likely it's a deep API to hardware design mismatch, or maybe a fundamental driver architecture and implementation problem.

>REDDIT SPACING