AMD fans sure got told

AMD fans sure got told.

Attached: Screenshot_2019-02-12_21-15-08.png (627x578, 128K)


Just wait

Will my GTX 1070 be obsolete by then?

Fuck I've been waiting forever

The GTX 1070 is still king for 1080p; maybe Navi will be better for 1440p. Unless you fell for the 4K meme, in which case yes, that 1070 is obsolete.

You're either impressively dense, or a really bad shill.
The 1070 is a two-year-old card and the performance uplift in the next gen has been less than 30%, even worse if you factor in the price.
And Nvidia won't release a new lineup any time soon, so your precious 1070 won't be obsolete for at least another year.

>waiting forever
>my 1070
and I'm still here with my hd 5750

Attached: amdfanboy1235.png (653x726, 91K)

Yeah... with how slowly things have been progressing on the GPU side, you'll have to hold on to your 1070 for quite a while... unless it dies. Nvidia wasting die space on RTX memery isn't helping things.

Is this where I just buy a Vega 56? I don't think I can wait another whole year.

8 months. Think of it as being pregnant, with AMD as the father.

What do you have now?

I'm starting to think I'd rather just lose 300 bucks now and enjoy some decent performance for 8 months than suffer.

A 1060 6GB, and it's shit. I'm planning on selling it.

>bought a 580 as a stopgap until navi came out
>should have bought a vega 56
Kill me.

Hasn't it been Q4 2019 for a while now?

1070 went obsolete when Vega 56 came out

Attached: 1549794138621.png (900x930, 301K)

Used Vega 56s are a good choice right now; they're being sold cheap. Just be sure to check that it works before you buy and give it a proper service: clean the dust and apply fresh thermal paste. If you're in the US, the warranty-void sticker doesn't apply there; in the rest of the world the warranty is voided if the sticker is damaged or removed.

I WANNA UPGRADE MY VEGA 64 ALREADY AYYMD COME ON GIMME SOMETHING GOOD FUCK

I just replaced my 970 with a 2070.

Attached: terminator.jpg (800x450, 63K)

So you replaced your 970 with a 1080 only 3 years later. Grats.

You're probably gonna have to wait like 2-3 years until AMD finally ditches GCN, unless you'll be satisfied with the few-percent improvements the GCN releases will bring in the meantime.

To be fair, the problems plaguing Vega, like someone else stated a few threads ago, are a fucking bitch to fix. We won't see performance like pic related unless AMD optimizes the ever-living fuck out of the next GCN gen and does tighter voltage binning.

We know from pic related that Vega DOES have the horsepower to make Nvidia bleed out like a butchered pig, but only because the devs worked themselves half to death optimizing for Vega's shader-heavy architecture.

Anyways fingers crossed. RTX 2060 performance for around $200 is my greatest hope rn.

Attached: radeon_VII.jpg (1920x1080, 186K)

>Minimum framerates are 60fps at 1440p, at 1080p maybe double.
>AMD sponsored title, still pretty much playable.

Sooooo, what's that picture for again? That doesn't prove it's obsolete.

Attached: e5upSHu.png (512x380, 216K)

Learn to read, you massive wanker. Basically Vega is fucked but the performance is there. If you pay close attention, CMAA is used instead of the standard FXAA/MSAA, meaning Vega also sucks balls at anti-aliasing.

Navi is taking so long because unfucking Vega was harder than expected.

>what is 4K

>Basically vega is fucked but the performance is there
Only if you go really far out of your way to optimize shit specifically for vega. Basically AMD has to sponsor any game they don't want to run like shit on vega.

Is getting an AMD cpu and a Nvidia gpu a good combo?

No shit, which is why Navi is taking eons to finalize and release. The good thing about AMD is that because they're the underdog, they can't afford to rip people off on the massive scale that Intel/Nvidia have been able to. See Intel bribing OEMs to use their chips, and Nvidia adding absurdly over-tessellated wooden planks to Crysis 2 that only tanked performance on AMD GPUs.

I apologize, I just answered as soon as I could. I really hope AMD can make another arch. Even if they don't release anything for a year or two they still have enterprise and console deals, so I'd rather wait for a good mainstream consumer product than buy converted enterprise cards that are mediocre at gaming. I mean sure, not everybody buys GPUs to game, but they mostly market themselves as a gamer company.

Attached: 8ac.jpg (680x680, 57K)

When will Radeon be sold to Intel? 1 year, maybe 2?

It's a niche resolution, at least for gaming. When midrange GPUs handle 4K at 60fps ultra and enthusiast-grade GPUs handle it at 150+ fps ultra, then I'll move on. For now my GTX 1070 gives me really good performance on a 1080p 60Hz monitor, and I'd rather move to 144Hz at the same res than to a 4K monitor.

Attached: 1531379630446.png (719x768, 753K)

Depends what you're aiming for. If you want budget midrange, the 580 can be had for ~$200 and Nvidia can't compete with it on price/perf. On the high end it's a little tricky, because the RTX 2060 IS ~10% faster than a Vega 56 for the same price BUT has 25% less vRAM.

So unless you're willing to pay an arm and a leg for the 2070/2080, AMD is still favorable, especially with undervolting applied. The GTX 1660 Ti from Nvidia will probably flop, especially since it will retail for around $300 and have RX 590 performance.

AMD more like GAYMD amiright fellow N-Chads?

Lol amd can't stop shipping with dogshit drivers, who cares if a card is theoretically 20% more powerful if you can never actually get that performance out of it.

Except now you can. From 2016 to 2018, Polaris drivers saw a ~20% uplift in performance. Before, the 580 couldn't even compete with a 3GB GTX 1060, but now it's crushing the 6GB 1060.

>Just buy an AMD card and then wait two years for its performance to not be dogshit.

Unfortunately true, if you want max possible performance day 1 nvidia is as good as it gets. But if you're okay with 20% lower performance than nvidia day 1 and slowly claim that performance gap back with driver updates then AMD is a good long term value.

Whether this applies to navi is unknown but not much has changed for a solid decade so I wouldn't be surprised if navi got better performance with driver updates over 2 years.

And then you give birth and the kid turns out to be a huge disappointment.

However, if you do optimize for Vega it will
>Look great
>Play great
>Even the competition's cards will play great and you will be happy with your purchase
Meanwhile, if you optimize for nvidia, your game will
>Look great
>Be unplayable and heaven forbid you didn't buy nVidia, because then you're fucked even worse.

wtf, I thought this was just a driver problem with nvidia.

Well, I would say a middle kinda budget; the most I ever paid was for a *70 GeForce back in 2013, for I think around 350? I wouldn't want to go beyond that kind of price range for a card. So I guess my budget would be around midrange. I do some games but only have 1080p, and I wanna be using Linux. CPU-wise I was thinking the Ryzens looked like a good value, so I'm leaning towards those, but I've never had anything AMD before.

Too bad amd only has 15% of market share.

Leather Jacket Man sure knows how to sell his shit.

Start up another AOTS benchmark, make sure to let us know when it starts being fun.

If that's the case, then an RX 580 with undervolting is as good as it gets, especially if you're on 1080p and want something that works well with Linux. Once you get into Vega 56/RTX 2060 territory it's better to have a 144Hz 1440p monitor to take advantage of the performance, else that 1080p monitor becomes a huge bottleneck for the GPU.

As for the CPU, I would strongly advise against Intel, especially if you're gonna use Linux, since the security-vulnerability mitigations on Intel cause a significant performance reduction in certain tasks. Currently Ryzen 2000-series processors have about 5% higher IPC than Intel Coffee Lake processors before any mitigations are applied, so the single-threaded difference between, say, a Ryzen 2600X and an i7-9700K is about 10%, and that can affect vidya anywhere from 5-10% at 1080p.

HOWEVER, the catch is that Ryzen processors were optimized for servers, and in ways even I can't fully explain, the Infinity Fabric, which affects core-to-core latency and other important aspects of the processor, is tied to system RAM speed and CAS latency. Going from a 3000MHz CL14 2x8GB kit to a cheap 2400MHz CL16 2x8GB kit can mean anywhere from a 10-15% performance drop in gaymes, for example. Going faster than 3000 CL14 only gives single-digit gains, though, and isn't worth it imho as the price tends to skyrocket.
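Rough back-of-the-napkin sketch if you want to see why the RAM kit matters: first-word CAS latency in nanoseconds is just CL divided by half the transfer rate. The numbers below are only the example kits from this post, not benchmarks of anything.

# effective CAS latency in ns: CL cycles at the memory clock (half the MT/s data rate)
def cas_latency_ns(cl, data_rate_mts):
    return cl / (data_rate_mts / 2) * 1000

print(round(cas_latency_ns(14, 3000), 1))  # ~9.3 ns for the 3000 CL14 kit
print(round(cas_latency_ns(16, 2400), 1))  # ~13.3 ns for the 2400 CL16 kit
# the Infinity Fabric clock also tracks the memory clock (1500MHz vs 1200MHz here),
# which is the other reason slow RAM hurts Ryzen more than it hurts Intel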

Navi won't be significantly more powerful than a V64 for a long time. Your options are either buy a V7, wait till 2021, or go Nvidia when they go 7nm.

Then why does nvidia have to sabotage games in order to "win"?

youtube.com/watch?v=IYL07c74Jr4

You sound fanny flustered.

Attached: 1540212554893.jpg (657x527, 49K)

I hate you Lisa!!!

Attached: 1538929233291.jpg (633x758, 156K)

Blame the guy they kicked out; the Vega launch bombed because of him.

Why can't amd stop losing?

You'd say that Vega is the Bombay?

I fell for the Jow Forums PC building guide meme as well when they said the 1060 6GB is plenty. Can we hug, breh?

Actually 4 years; got my 970 a few months after launch (December 2014).

>forced to render 3 trillion polygon planks of wood because of nvidia "optimizations"
>losing
The only loser here is the consumer who gets fucked by big corps ripping them off.

I've got an RX 580; if I undervolt by, say, 50-60mV, is it even worth bothering with? What are we talking about in temps? 1, maybe 2 deg?

Mommy Lisa needed to clean up after the shit FX fiasco; she can't clean up the GPU side at the same time. Give her time.

You know how this shit will go.
By mid-year, AMD will completely destroy Intel in all metrics, but they won't release any sort of video card that can compete with Nvidia, so we'll all go back to 1999, when an AMD CPU and Nvidia GPU combo was the bomb, given the 3DNow! optimizations in the Nvidia driver.
(Alternatively, you had to combine a 3dfx card with an Intel CPU, since the 3dfx drivers were optimized to take advantage of the superior Intel FPU.)

>Nvidia is a monopoly and charging hundreds of dollars more than their GPUs are worth
>AMD keeps fucking around and fucking up their new GPUs
>AMD keeps using GCN for no fucking reason other than they're cheap as hell and want to continue recycling that garbage design to save cash rather than giving consumers a reason to buy their GPUs
>Navi was supposed to launch in 2018, now could conceivably not show up until 2020 when their horseshit "advantage" at 7nm becomes worthless
>somehow a GPU that is re-using GCN gets delayed a year and is still probably going to disappoint

these past 2 years have been so shit for PC hardware, my god

Attached: 0l.jpg (807x659, 42K)

It's mainly meant to increase performance, as a lower power draw lets the card stay at turbo clocks longer or indefinitely, depending on how many mV you can crank it down by. As a side effect it will also lower temps by 5-10C, with some claiming deltas above 10C.

I strongly recommend it. You never know who gets the ~1,000mV @ ~1,340MHz silicon-lottery win, but unless you're super unlucky you should be able to get ~1,100-1,200mV at the same frequency, since AMD overvolted most Polaris chips on purpose to improve yields.
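If you want a ballpark for what those millivolts buy you before touching Wattman: dynamic power scales roughly with voltage squared at a fixed clock. Very rough sketch, ignores leakage, and the voltages are made-up example values, not anyone's actual card.

# dynamic power roughly follows P ~ C * V^2 * f, so at the same clock P2/P1 ~ (V2/V1)^2
def power_ratio(stock_mv, undervolt_mv):
    return (undervolt_mv / stock_mv) ** 2

stock_mv, undervolt_mv = 1150, 1090  # hypothetical RX 580 numbers; yours will differ
saving = (1 - power_ratio(stock_mv, undervolt_mv)) * 100
print(f"~{saving:.0f}% less core power at the same clock")  # ~10% here
# in practice that headroom is what keeps the card on its boost clock and a few C cooler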

How does having a lower-res screen bottleneck the GPU? Wouldn't having 4K or something bottleneck it?

Thanks for that. I'll give it a go through Wattman.

Not really. Regardless of the Nvidia-vs-AMD malarkey, we've seen PCs in general become more and more affordable as they race to compete with consoles. RX 570s, which can do 50-60 fps at 1080p max, can be had for under $150 brand new these days, and that value wasn't possible 2 years ago, especially because of the buttcoin mining craze.

Bottlenecking is when something is prevented from operating at its maximum performance. At 4K the monitor is being bottlenecked by the GPU if it can't maintain 60 fps, which, unless you play games on the lowest settings, isn't possible for cards like the 580.

In this case, if a 1080p monitor were used with a Vega 56/RTX 2060, they would be rendering 150+ FPS, so even a 144Hz 1080p monitor would either have massive screen tearing as the monitor fails to keep up with the frames rendered, or GPU utilization drops to like 60% as the GPU waits for the monitor to keep up.
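To put rough numbers on that: if vsync caps the card at the monitor's refresh rate, utilization is roughly the refresh rate divided by what the card could render uncapped. Quick sketch with made-up fps figures, just to illustrate the idea.

# with vsync on, the GPU idles between refreshes, so utilization ~ refresh_hz / uncapped_fps
def vsync_utilization(uncapped_fps, refresh_hz):
    return min(1.0, refresh_hz / uncapped_fps)

print(vsync_utilization(150, 144))  # ~0.96 on a 144Hz panel
print(vsync_utilization(150, 60))   # 0.40 on a 60Hz panel: the card mostly sits waiting
# the exact figure depends on how far above the refresh rate the card could actually render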

Why would games optimize for 15% of the market that bought into the amd meme?

That's not the point here, you donkey. These "optimizations" for AMD weren't just devs being lazy cunts as usual; they were deliberately malicious, designed to artificially throttle performance on AMD graphics cards to make them look bad. Nvidia in one way or another paid the devs to do this.

Because the consoles use AMD stuff.

Couldn't I just set limits on fps like vsync or something so it doesn't go too high? Then when I want a better screen in the future turn the limits off?

Yeah yeah, we get it: you get a psychological boost from supporting the underdog, which, combined with post-purchase rationalization, makes you want AMD to come out on top. It's okay to make mistakes, user.
And consoles have been holding back PC games for years due to publishers pushing parity requirements; not sure that's the hill you want to die on.

Sure, why not. As long as the ~60% overall GPU utilization doesn't bother you like it does me, then no harm done.

Consoles are not holding back PC; Autodesk is holding back PC.
Well, that and the general cost of the insane R&D needed for better graphics.
We're not even nearly close to using the full potential of the Xbone, or even the Switch, because it got too expensive.

And no one, NO ONE will spend 100 fucking million dollars on a PC exclusive.

It's not even about supporting the underdog, why would you support companies lying to consumers and sabotaging competing products from other companies just to keep profits high? That's like being a proud cuckold.

>It's not supporting the underdog.
>It's just (10 reasons why amd is the underdog).

I didn't mention AMD or support therein. I'm questioning the undying loyalty people have to companies that fuck them over. It's this kind of loyalty that has allowed such companies to keep making mad dosh because people are too stupid to realize they've been had.

Other companies such as Apple, Theranos, Facebook, Amazon, Google, and really any insurance company on earth come to mind.

>Comparing being an amd fanboy to Theranos.
Lol dude, you dumb.

Well, I mean, there's a budget when building, and why waste a perfectly fine part of my current stuff, at least initially. But yeah, I guess it bothers you more if you're used to higher than 1080p.

I'm not comparing that at all. The people and companies who swore undying loyalty to some GMO part-vampire, extra-pasty white girl made her ascend into a fucking billionaire before getting absolutely fucked when the whole sham fell apart. The moral of the story is that if people had more brains than money, shit like that would never have happened, and that's what I'm complaining about.

Vega is Bombayymd

You do understand that you are just posting more and more justifications for why AMD is the underdog right?

I just want it for the name

Attached: lain 57.jpg (480x360, 15K)

because with jews you lose

I meant that you could have just bought a 1080 3 years ago for the same price and same performance as a 2070, dumbass.

The RX 570 and 580 are the only saving grace of this last generation, but for someone who wants more performance to push their 144Hz monitor or their VR headset, this gen is absolute dogshit.

Have you noticed how AMDrones are being silent over this, and slowly trying to damage control?

What about the vega 56?

Prices were way too high for way too long. It's a decent option now that they're dropping to around 300, but it's still not a great card for pushing the latest games at 144+ fps at max settings. Also, I tried the card with VR and it still struggles.

It seems to be the best option at the moment, but that's only because every other option is even more garbage.

Ok, so a few things. I'm not part of this argument, but I can't ignore how confident you are despite your ignorance of what you're talking about. FXAA and CMAA are both forms of morphological anti-aliasing and are post-processing techniques. CMAA is less blurry and better at reducing edge aliasing than FXAA. MSAA performance isn't really important anymore, as it can't easily be used with the deferred rendering that nearly all modern games use. Finally, Vega's problems have more to do with the difficulty of balancing shader performance (ALU cores) against pixel and geometry throughput. Do a quick Google of things before you act as if you know how something works; it'll save you this embarrassed feeling you're currently experiencing.