Is an RX580 good enough for 1440p?

I honestly can't tell the difference between 30fps and 60fps, so yeah.

Attached: csm_AMD_Radeon_graphics_logo_2014.svg1630_b5cb6c22f9.png (240x203, 35K)

If you can't tell the difference, then yeah, it's fine for you.

Personally, at 1440p my 1080 Ti can't always keep above 60fps, depending on the game. I play at actual max settings, but for Warhammer 2, for example, I have to lower a couple of settings, and the same goes for Kingdom Come, which is not very optimized.

bs, I bought a 144hz screen and can't go back

this desu

I mean for lighter games, probably (CS, Overwatch), but for more demanding / less optimized games (PUBG, Kingdom Come) I'd say you'd get a pretty mediocre experience.

No. My overclocked 1080 can barely do 1440p 60, and it's twice as powerful.
Wait for 7nm.

My 390X does 1440p; it's just a question of what you want to do at 1440p.

How would two rx580s in xfire compare to a single 1080?

I'm thinking I could put one in now and another when I get more money. I was thinking that would handle anything I throw at it for a few years to come, or if I were to buy a 4k monitor. Would it? Now I have my doubts. I'd rather not wait, my current card is beyond terrible.

You can do 1440p 60 in plenty games with a 1070. What unoptimized mess are you playing, Witcher 3?

AAA gaming. Won't bother with 144hz or anything for now.

NEVER CROSSFIRE OR SLI
IT'S A BAD GIMMICK.

The way the drivers are written, it's alternate frame rendering: one frame is rendered by one card, and then the next is rendered by the other. If you think about this you'll realise it makes no sense. The GPUs aren't rendering the frames twice as fast, they're just making twice as many. FPS goes up, but half of your frames are just what the predictive algorithms think the next frame will be at best, and it just waits at worst.
In reality you're getting the same frames twice, and your FPS counter goes up so you think it's smoother. It's a complete scam. Get the more powerful card, every time.

Should be right if you sit at 60fps. Worst case, you might have to play on very high instead of ultra.

Alright, well that's not good news at all, but it's good to know.

So I guess I'll have to buy an Nvidia 1080. The price difference between that and the 1070 is ridiculously small anyway.

Yeah the 1070 was a bargain at $380, but at current price just get a 1080.

An RX 580 is slower than a 780 Ti. Buy Nvidia.

PUBG works fine. Most people play it on low settings anyway since it looks shit either way. You get 70-100 fps at 1440p with a 580. I would call that very playable.

+0.10₹ :)

You could get a single RX580 if you don't mind turning settings down. The important thing is just not to waste money on multicard.

>twice as powerful, does 60fps
>he's happy with 30fps
So you just answered that he would be fine.

I had a 1600p screen for a little while with an R9 390 (basically the same as an RX 580, or maybe slightly slower) and yes, it played fine; just play on medium-high settings, maybe fully high settings.

My R9 290 does just fine at 1440p and the RX580 is like 25% stronger than that.

The 1080 Ti is shit; that's why the only one I'd get is the Pascal Titan, and OC that.
In games that supported it, it worked a little better, but DX12 and Vulkan support for mGPU is nonexistent.
No, unironically it's only Ubisoft games.
And in 1+ year that silky smooth 30-50fps will turn into 25 and below as games get more and more complex
Bad advice. The Polaris cards are 1080p through and through. Older games that are well optimised, like Doom 2016, and other async-compute-heavy games like Forza, GoW4 and Wolfenstein 2 will work well, but most games favor Nvidia's approach.
That said, depending on the game, my 390X was just as fast with newer APIs as my 1080, especially anything on Vulkan.
DX9-11 performance was so bad I had to get rid of it though, and the drivers were horrid, especially compatibility-wise with anything older than 2009.
And the thermals were insane; easily the hottest card I ever owned, even with a massive tri-fan cooler and the bigger 2x12cm high-CFM fans I put on it.
My GTX 1080, OC'd to 2.1GHz, runs cooler.

If you want to play at bare minimum
>get a 1050 Ti instead, it's better

>can't tell a difference between 30 and 60fps

Is this what being an AMD buying Pajeet and living in a country where people shit on the street does to you?

It's one of those things where if you look reeeeaaally closely, you can notice a slight difference, but only when you look for it.

There is also no reason to go 4k because of this.

If you have money to blow on useless shit, do it, but you should also accept the fact that what you're doing is pointless.

>And in 1+ year that silky smooth 30-50fps will turn into 25 and below as games get more and more complex

Then you just turn one or two more settings down to medium instead of high. Or at that point buy a new midrange GPU, which will then keep him playing at his acceptable framerate with high settings for another 2-3 years.
Not everyone needs the absolute best, just something good.
Also, this is coming from someone with an ultrawide monitor and a GTX 1080, but I get where he's coming from.

It will be 30-60fps at medium-high settings, so for you it will work.

Get a freesync monitor with your 580 and you're good with pretty much anything

Pajeet, you're mistaking motion for resolution. Get the fuck out.

>Is an RX580 good enough for 1440p?
Yes.
I do 2560x1600 or higher in most games and get 60+. Medium on a few settings, but high textures. It's what winds up looking best and running best in most cases with deferred renderers.

Ultra settings are a meme just made to make benchmarks run poorly. They don't make the game look better.

I used to have a 960 and was happy with it; it ran everything on medium, sometimes high, with some settings like shadows turned off.
OP said himself that he's fine with 30-60 fps, so he can just tweak settings and enjoy his card too. The RX 580 is not even that low-end compared to current stuff, unless he wanted to do things in Vega 64 and 1080/1080 Ti territory.

>I honestly can't tell the difference between 30fps and 60fps, so yeah.

wtf is wrong with your retina?

Attached: 20952464_1437357693020592_4776014581023637504_n.gif (400x412, 859K)

>Is an RX580 good enough for 1440p?
Yes. Or rather: it depends. I built a gaming PC for my nephew with an RX 580 and he's using a dual-monitor 1440p + 1080p setup. All the games he plays run just fine at 1440p at medium settings. If you insist on running games on ultra high then some games will give you problems even at 1080p.

The difference between 1080p and 1440p is actually quite small compared to 1080p vs 4K. In most games you'll see a bigger fps difference between medium and high settings than between 1080p and 1440p.

>Then you just turn one or two more settings down to medium instead of high
I honestly wonder if benchmarks that show only high and ultra settings are doing it to intentionally mislead. If there are graphs at ultra to highlight some difference, and at medium too, then it's more realistic.

The difference between medium and ultra in most games seems to be silly things like more hair that you don't really notice. The RX 580 can't do acceptable 1440p framerates at ultra settings, but so what; it'll do 1440p 60+fps just fine with sane settings.

I don't think it's necessarily done to mislead; I think it's a way of showing how the card handles very specific heavy workloads, based on how the developers coded them.

True, and it's nice to know how a card handles heavy loads. But it also gives you the impression that you need a 1080 Ti to play games at 1080p, thus the "misleading" part. Just running two benchmarks, one at ultra and one at medium, would illustrate that you don't need to buy a $1000 GPU to play CS:GO.

Addendum: if you're running multi-monitor setups, you can have multiple GPUs without crossfire, and both displays will work normally; they just won't have to share GPU resources.

>My 1080 can't do 1440 60, a card made for 1440p.
>Buy vega instead
I wonder

>Says the 1080 Ti is shit
>My 390x was almost as fast as my 1080
Makes you think

If you can't tell the difference, it's fine yeah. Also depends on the kinds of games you play. If you only play LoL, you could even get away with a 560. If you play Warhammer, not so much.

Honestly though, if you're content with what you have now, don't upgrade. GPU prices are absurd at the moment because of miners, and the ultra-high-end bracket is lacking competition (literally just the 1080Ti) so prices are getting driven up from the top down.

Give it a few months. Miners are fucking off to ASICs, and the crypto hype has calmed, but the GPU price inflation is still there for the time being. AMD will also have a new architecture out, which hopefully will drive down prices in the high/ultra-high end brackets.

That's entirely dependent on what you want to play. It's good enough for 4K in some games.

>AMD will also have a new architecture out

no
still vega architecture

I'm right u kno

Forget about CF/SLI; neither company has given a shit about it for like two generations now, and support is cargo-cult tier at this point.

>can't tell the difference between 30 and 60
Well then maybe stop comparing those by looking at wallpapers

A 580 will give you solid 60fps at 1440p.

Holy fuck learn English before posting. ESL street shitters have no right being here. Go back to your call center job.

W/e fag
I'm comfy shit posting from me bed

Attached: 1528405058647.jpg (750x725, 61K)

Wrong, Navi will not be based on Vega

It's hard to trust AMD Radeon team after the Vega fiasco. Hopefully it can compete with the coming 1180.

Vega was Raja's doing; since he left, they've brought on a better team to replace him.
This isn't the AMD of old, user. Rory's gone.

They're already getting ready to sell Vega 7nm by the end of the year, but only for machine learning and I think enterprise.

Attached: gpu_to_2020.jpg (2618x917, 254K)

Yes, machine learning and workstation cards, not consumer, not Navi

It's very concerning when AMD says that the Navi design is still "on track"; it leads me to believe we won't see 7nm Navi until mid to late 2019. Whatever the new "next-gen" non-GCN architecture is will probably fill the high end, if the rumors that Navi is a midrange Polaris-type card are true.

Yeah but Raja was working on Navi right until he left, and not a whole lot has changed since he left.

t. AMD engineer working on Navi

Horseshit. I had an EVGA GTX 1060 Superclocked, and BF4 reached 115+ fps with everything on low at 3440x1440 (higher than "1440p", which usually means 2560x1440), and 50-60fps with everything on ultra, so a GTX 1080 most certainly can run 2560x1440 well.

>you can have multiple GPUs without crossfire
Tell me more, I am curious about this. Short story: I used to run a two-monitor setup with two graphics cards back... in the last century. Or early this century; long, long ago anyway. I remember that I could not drag windows between monitors unless I used an X extension called Xinerama, and that was really, really slow.

How does this work today? Can I have one screen attached to one GPU and one or two attached to the second, and drag windows between them, or does X still have this kind of limitation? I am sort of guessing it still does, since the window on GPU1 would be in the memory of that GPU and dragging it to GPU2 would require some code to shift that over.
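Not the user you're replying to, but for what it's worth: modern X handles this through RandR 1.4 "providers" rather than Xinerama, so dragging windows between monitors works normally. A rough sketch of the commands involved (the provider numbers here are just an example; check the `--listproviders` output on your own box):

```shell
# List the GPUs X knows about; each one shows up as a "provider".
xrandr --listproviders

# Tell X to display the outputs of provider 1 (second GPU) using
# provider 0 (first GPU) as the render source, then enable them.
xrandr --setprovideroutputsource 1 0
xrandr --auto
```

With that set up, the desktop behaves as one normal screen; the second GPU's outputs scan out buffers copied over from the first GPU, so you don't hit the old Xinerama problem of everything becoming unaccelerated.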

Yeah, and the next Nintendo Console will destroy everything
t. Nintendo Engineer

I am absolutely sure that the next line of GPUs out of AMD will be a series of die-shrink Vega GPUs. The kernel code and Mesa refer to current Vega as VEGA10, and AMD has slowly put VEGA12 and VEGA20 support in place over the last few months. VEGA20 looks like some kind of non-graphics-focused compute/machine learning card. VEGA12, on the other hand, looks like the same kind of small adjustments to an otherwise identical GPU that were made between the RX 4XX and RX 5XX GPUs. There have been zero press articles and zero AMD statements as to what VEGA12 really is, but the driver code strongly hints that it's a replacement for the RX 560/570/580 and/or Vega 56 and Vega 64.

It's actually odd that new Vega-based GPUs haven't been released or even talked about, given how short the gap was between the RX 5XX kernel driver code appearing and the cards being available.

Lower your settings a little. I play at 1440p on medium settings at 60fps on a fucking r9 270x

Is this the fabled 580X/590 that's been rumored? I'm gonna laugh if it's another cut-down Vega, like how the 7870 XT was a cut-down 7950, but I'd be OK with it.

t. former 7870XT user

Yeah, except I'm not lying

Attached: 1522921835749.jpg (3792x5056, 2.26M)

There is no polaris refresh, 12nm vega, or 7nm vega coming for consumers

The only GPU AMD has plans to bring for us anytime soon is Navi sometime in 2019

It's not 12nm, Lisa already said they skipped 12nm in favor of jumping straight to 7nm
And they don't have any consumer samples of 7nm Vega ready yet
If it's anything it's probably Polaris with more CUs

Ok then, what card is this user talking about?

#about #to #get #fired

user is wrong about it being a compute card, though we do have some stuff like that in the pipe.

Vega20 is consumer focused. I've literally never heard anyone talk about Vega12, but it's definitely consumer focused as well since that's not the name of any of the professional lines we are working on

I don't say anything that isn't already leaked or public info.

Then what was the name for the 7nm Vega workstation card just announced at Computex?

If you read AMDs press release on the matter (amd.com/en-us/press-releases/Pages/pushing-boundaries-for-2018jun05.aspx), it sounds like it's Vega20 for workstations.

There is no Polaris refresh coming, none

It's not necessarily a refresh if it's adding CUs; it would be a bigger Polaris card.
Are the rumors true that Navi is essentially scalable Polaris?

It's not.

it doesn't exist user, these are baseless speculations

nobody knows anything about Navi except that it is 7nm

Completely new arch? I thought it was just next gen GCN

It is the next gen of GCN. I don't work on the graphics core so I can't say much in that regard, but it's closer to Vega than Polaris.

So nothing before 2019 for consumers, correct?

Can't say anything that isn't public knowledge, but the two main things to look forward to are Vega20 and Navi10. Go ahead and check those release predictions

>The codenames for the upcoming Vega 20 and Vega 12 GPUs were mentioned in the latest LLVM and Clang compiler patches. Codenamed GFX906, the Vega 20 will support native AI instructions and the patch information reveals that it will benefit from the 7 nm tech, while the Vega 12 codenamed GFX904 is rumored to be replacing the Polaris series (Radeon RX 580 and co.) later this year.
Sound about right? Vega20 for workstations and AI/machine learning and Vega12 for consumers?
notebookcheck.net/Mysterious-AMD-Vega-12-GPU-info-leaks-out-plus-Vega-20-AI-features.302274.0.html

I thought Vega20 was for consumers, but I don't follow what other teams are doing.

definitely sounds like Vega 12 is gonna be cut down Vega if it's going to replace Polaris, or Navi is just that much better than Vega and Vega 12 will surpass current Vega
>I know you can't talk about anything that isn't public, just speculating

That's not really how it works. It renders alternate frames, yeah, but they're not guessed. You're getting real, new frames, but with an extra frame of latency (per additional GPU). Also, any optimizations that reuse anything from older frames will either be disabled or will reduce mGPU scaling.
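To illustrate the throughput-vs-latency point: here's a toy scheduler in Python. It's a made-up model (the function name and the 20 ms figure are purely illustrative), not anything resembling actual driver code:

```python
def afr_schedule(num_frames, num_gpus, render_time):
    """Toy model of alternate-frame rendering: round-robin frames
    across GPUs and return (frame, gpu, start, finish) tuples."""
    gpu_free = [0.0] * num_gpus   # time each GPU becomes idle
    schedule = []
    for frame in range(num_frames):
        gpu = frame % num_gpus    # alternate frames across GPUs
        start = gpu_free[gpu]
        finish = start + render_time
        gpu_free[gpu] = finish
        schedule.append((frame, gpu, start, finish))
    return schedule

# 8 frames at 20 ms each: one GPU takes 160 ms total, two GPUs 80 ms,
# but any single frame still takes the full 20 ms to render.
print(afr_schedule(8, 1, 20.0)[-1][3])  # 160.0
print(afr_schedule(8, 2, 20.0)[-1][3])  # 80.0
```

The point: two GPUs double how often frames complete (real, distinct frames, not duplicates), but no individual frame is produced any faster, which is exactly where the extra latency and microstutter complaints come from.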

Navi is going to be better than Vega. That much is certain. They haven't released any internal memos saying where it will compete with Nvidia though. Take that info however you choose

Doesn't matter to me, I don't like Nvidia's business model or practices, so I'll be buying AMD unless Intel comes up with something worthwhile for the money, but with Raja running it, I doubt it
I like the Radeon software and the drivers have done nothing but improve since I got into building in '12, so I'll probably end up sticking with them

Kingdom Come runs beautifully on my R9 280X at 1080p. So a 1080 Ti should be able to run it at 120fps at 1440p? I wonder what's going on there.

okay. Thanks for paying my salary.

I feel the same way, but ironically I've never bought an AMD card before. We have an employee discount, so I'll probably buy Navi when it eventually comes out

Well I might have contributed to your earnings when I bought the 580, but I doubt I did when I bought my 7870s lol
Honestly, there really isn't much of a difference between the Nvidia cards and the AMD cards when it comes to performance
Besides, I don't run everything at fucking max because I like smoother framerates over minute details that YouTubers act like are the be-all and end-all, and the drivers haven't ever crashed on me in a Windows environment, unlike Nvidia's.

rx580 user.

Zero issues, stable fps at 1440p, but I prefer to play most games at 1080p, since Nvidia-shilled game companies have messed with AMD performance since forever.

Using an all-in-one PC.

Ryzen 7 1700 / 32GB DDR$ @ 2666MHz / RX 580; replaced the SSD with a 1TB SSD, and it came with a 2TB HDD @ 7200.

Performs better than my old i5-3330 / 8GB / 2TB HDD @ 7200 with a GTX 600Ti (because I thought I should give Nvidia a chance. Never again.)

&

My i7-6700k / 16GB / 2TB HDD @ 7200 with a RX 470.

Nice to have an all-in-one PC for once. Always loved the looks, hated the weak-ass GPUs.

Well, I can't say anything about that. The only AMD card I have is an engineering-sample 560 that isn't recognized by Catalyst's installer. It seems to work on Linux though.

DDR4* 660ti... before lrn2spel spergs show up. Dumb ass reddit bots.

What's your rig setup like?

██████████████████ ████████ [REDACTED]@Chisa
██████████████████ ████████ OS: Manjaro 17.1.10 Hakoila
██████████████████ ████████ Kernel: x86_64 Linux 4.17.0-1-MANJARO
██████████████████ ████████ Uptime: 1h 58m
████████ ████████ Packages: 1637
████████ ████████ ████████ Shell: zsh 5.5.1
████████ ████████ ████████ Resolution: 1920x1080
████████ ████████ ████████ DE: Xfce4
████████ ████████ ████████ WM: Xfwm4
████████ ████████ ████████ WM Theme: Redmond
████████ ████████ ████████ GTK Theme: Redmond [GTK2]
████████ ████████ ████████ Icon Theme: Chicago95
████████ ████████ ████████ Font: Sans 10
████████ ████████ ████████ CPU: Intel Core i7-5930K @ 12x 3.7GHz [46.0°C]
GPU: GeForce GTX 660 Ti
RAM: 4039MiB / 24006MiB

Vega 20 is the radeon instinct card, not for consumers

There is no vega12, it was removed from the roadmap at CES

660 Ti? Bro, come on now, say you need to borrow a 580 for "research purposes".
You'd kind of be helping me out anyway, because I'd love to know why Manjaro freaks out on my Raven Ridge + 580 Nitro+ build.

So nothing until Navi? That sucks

No

I'm not sure if I can hold out much longer, guys. I bought a 1600 last year but couldn't afford a new GPU, and I'm really tempted to pick up a 1070 in the next week because there's a sale in my fav shop.

Seeing as I play at 1080p, it should be more than enough.

I do presilicon

A 580 would be more than enough, but if the 1070 is a similar/better price, get the 1070

Absolutely no problem on high settings. Don't fall for the GPU marketers trying to upsell you.

t. using WQHD 144Hz monitor with a R9 390

SR-IOV on gaymanning cards when cunt?
seriously I'd take SR-IOV limited to 2-3 instances even

I've said it about 5 times ITT, are you retarded?

>I honestly can't tell the difference between 30fps and 60fps

nigger what