5700xt vs 2070 Super

Why does everyone act like the 5700xt is comparable to the 2060 super, when it benches about equal with the 2070 super?

Attached: image.jpg (1854x1281, 661K)

Because buying a used 1080ti is significantly better than dealing with garbage ass drivers and housefire tiers of power consumption.

Anyone who buys an amd GPU is actually a fucking moron.

Used 1080tis go for over $600, which is more than either of those options costs.

Seems like the idiot would be buying an old used GPU for more than a new card.

1080ti is literal boomer tier in the marketplace.
>I know what I have
>No tire kickers
>Runs and drives but no test drive

And every card on ebay that's “not used for mining” is absolutely some boomer selling off their stock after the mining bust

becuz userbench is retard

Because they both MSRP for $400?

it's 2060 vs 5700, 2060S vs 5700XT, 2070S vs 5700XT 50th, 2080S vs VII, and 2080ti has no competitor

I ended up saving on my build and am considering getting the 2070 Super over the 5700 XT.
Am I really missing out by not waiting for the custom 5700 XT cards to come out?

>housefire tiers of power consumption
LOL

Attached: Power-1.png (1335x1212, 72K)

I'm in the same boat, but it seems that buying a 2070 Super over a 5700 XT is just spending more money for the same performance

The XT can outperform the 2070 Super while using less power, although Nvidia is still probably a gen or two ahead in power efficiency on the same manufacturing node.

Attached: FC.png (1335x1212, 74K)

I don't know if I'd call that a fair comparison since you're leaving out the 2070S' ability to do hardware raytracing, something the 5700XT cannot do.

Sure, if you don't plan on doing raytracing then the 5700XT might be the better pick, but it's still a feature in favor of the 2070S

I'm more worried about thermals
I'm using a mini itx build and that shit reference blower is trash

A hundred more dollars for a feature the card can't run properly isn't a "feature". If you care about raytracing, wait a year or two for AMD's implementation.

Also, his post was about performance and wattage, not fucking raytracing.

>Used 1080tis go for over $600, which costs more than both options.
What country is this?
MFW bought two 1080tis used for $600.

>check local listings for 1080ti's
>600-700€
>when new 5700xt's are in the low 400€, and 2070S high 400's

Attached: 1401493628684.jpg (440x348, 25K)

Spotted the AMD fanboy

>and 2070S high 400's
it's 500+ user.

would like to say: fuck nvidia
recently replayed Arkham Knight, and when some objects aren't in strict focus they lose all their shadows. I should've installed recording tools just for that
why am I bitching about it? because when I played it on a 390 in 2015 it didn't have that problem. I had my doubts about other games, thinking I was imagining it; apparently not, now I see it everywhere
here is your answer to "5 gens ahead in performance"

Attached: blast from the past.jpg (600x480, 28K)

lol GOTTEM amirite nvidiabro?

Attached: 1550499897961.png (650x650, 13K)

Bought a used 1080Ti for $400 a year and a half ago. Still under warranty for another two years.
The real idiocy is buying either of the ~$500 cards, both of which still lag slightly behind a 2.5-year-old $700 card, and thinking you're somehow getting a killer deal when in reality you're getting ripped off.

but it needs a power plant to run

If 650W is a power plant then maybe.

>it's 500+ user.
Disregard me, was looking at regular 2070's
Still, much lower than what people ask for 1080tis over here

it's more like between the 2060S and 2070S. the only way it can beat the 2070S is by overclocking it so hard that it will probably break in a few months.
personally I'm waiting for the 5800 to upgrade from my 580

You realise that's like, less than $15 a year in electricity? Do you also autistically compare power usage of different monitors?
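To sanity-check that electricity figure, here's a rough back-of-the-envelope calculation in Python. The power difference, daily hours, and price per kWh are all assumptions you'd swap for your own numbers, not measured values:

# Rough annual electricity cost of a GPU power-draw difference.
# All inputs are illustrative assumptions, not benchmark data.
watt_difference = 50     # extra watts one card draws over the other
hours_per_day = 3        # assumed gaming time per day
price_per_kwh = 0.15     # assumed electricity price in USD

kwh_per_year = watt_difference / 1000 * hours_per_day * 365
cost_per_year = kwh_per_year * price_per_kwh
print(f"{kwh_per_year:.0f} kWh/year, about ${cost_per_year:.2f}/year")

With those placeholder numbers it comes out to roughly $8 a year, so "less than $15" holds unless you game far more hours or pay a lot more per kWh.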

>Paying a $100+ premium to lose 30% of your framerate only on reflections that you never look at.
Performance impact is drastically higher when games start using raytracing on more things and by that point (2022+), the 2070 Super will be a low end card if you want to use raytracing.

It's better where I live. 1080ti was $930 USD here on release with the exchange rate that it was at the time, the exchange rate is worse now but the 5700XT sells for $430 USD here (tax included for both). Sure it's slightly behind a 1080ti, but it's also under half the price it was.

>buy these used mining rigs, goy
sure thing, you fucking kike.

You don't think people realise this? 1080tis SECOND HAND are literally the same price as, or more expensive than, the RTX 2080.

Built 2 AMD systems and the third is on the way, but sure, I am the fanboy here.

Even if those results are correct, please don't use that website as a reference anymore.

>overclock card
>it consumes more power
how could amd do this

>mining cards are bad
>cards used by retard gaymurs are good
stop this meme

The 2070S costs 100+€ more than the 5700 XT; they are not the same price anywhere in the world

Anyone who buys an nvidia GPU is actually a fucking moron. And buying a used novideo GPU at the price of an equal new GPU is even more moronic.

Wasn't Far Cry 5 an nvidia game? How come New Dawn performs better on Radeons?

>Cards pegged at 100% for YEARS are the same as cards that are used for a few hours a day to play games.

Yeah, I'm sure that will have no effect at all on lifespan.

Cards in mining rigs generally run underclocked, while gaymurs generally overclock them (and most gaymur editions are OC'd at stock to begin with).
Running a card for two years at 50°C is less harmful than running it for two days at 70°C. The only thing the former can realistically wear out is the fans.
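For anyone who wants to reason about the temperature-vs-lifespan trade-off more rigorously, electronics aging is commonly ballparked with an Arrhenius acceleration factor. A minimal Python sketch follows; the activation energy is a generic assumption (values vary by failure mechanism), not a measured figure for any GPU:

import math

# Arrhenius acceleration factor between two operating temperatures.
# ea_ev is a generic assumption; real values depend on the failure mode.
def acceleration_factor(temp_low_c, temp_high_c, ea_ev=0.7):
    k = 8.617e-5                 # Boltzmann constant in eV/K
    t_low = temp_low_c + 273.15  # convert to kelvin
    t_high = temp_high_c + 273.15
    return math.exp(ea_ev / k * (1 / t_low - 1 / t_high))

# How much faster silicon ages at 70°C than at 50°C under this model
print(acceleration_factor(50, 70))   # roughly 4-5x per hour of operation

Multiply that factor by the hours spent at each temperature if you want to compare total wear between two usage patterns.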

Remember how nvidia forced GPU-based PhysX and now everyone runs it on the CPU because of how many unused CPU cores there are? It's going to be the same with ray tracing. Nvidia can't even handle real raytracing; instead they implement an approximate reflections shader that utilizes their AI bullshit that has no use in gaming. I know that AMD patented a hybrid ray tracing solution that they will use in consoles, but I don't think there is even a need to bring it to PCs. Once AMD brings 4-way SMT to desktops, their 16-core CPUs will crunch through software-based DXR with little to no performance impact. The housefires it will cause will be amazing, but I don't think that will stop AMD from doing it.

cheapest 2070S is 150€ more than the cheapest 5700 XT here

>look at me, I am completely retarded and know jack shit about how wear and tear of electronic components works


as long as you don't drive those cards at near max temp, the transistors etc. don't give a fuck; mining cards especially are normally cool as ice.

Cards that come from gaymurs are about 100x more busted and broken, since the constant load and idle cycles actually wear down the card: you constantly go from cold to hot and hot to cold, up and down all the time, and this hurts cards. Cards that just run at a stable warm temp like 60/70°C don't give a fuck because they are not stressed by temperature changes.

Attached: 1563278811491.png (645x729, 81K)

Moving parts (cooling) fail faster when mining, everything else is pretty much irrelevant.

Would be great if Nvidia let the AIB makers use the old stock of RTX 2060 to create a 12gb model.

I would buy one.

>Nvidia cannot even handle real raytracing

HURR DURR

youtube.com/watch?v=WxVrS5AHEIA

Like I said, RTX is not ray tracing.

My understanding is that the 2070S and 5700 XT are similar at stock, with more games being optimized for the 2070S, but the 2070S overclocks much better. Right now I'm leaning towards the 2070S for that reason.
Will AIB 5700xt's have much of a difference for overclocking potential or will they just fix the weak cooling?

It seems that you are completely disregarding that the 2070S costs 1.5x as much as the 5700 XT.
Anyway, looking at the power consumption graphs, there is not much to gain from overclocking. 2 extra frames for 100W more power consumption isn't that great. But $400 stock non-reference 5700 XTs will offer great performance regardless. A $600 2070S can't compete.
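To put the "2 extra frames for 100W" point in perspective, here's a tiny Python sketch of the perf-per-watt comparison. The FPS and wattage figures are placeholders chosen to match that description, not real benchmark results:

# Perf-per-watt check of the overclock trade-off described above.
# All numbers are illustrative placeholders, not measurements.
stock_fps, stock_watts = 95, 220
oc_fps, oc_watts = 97, 320    # "2 extra frames for 100W more"

print(f"stock: {stock_fps / stock_watts:.3f} FPS/W")
print(f"OC:    {oc_fps / oc_watts:.3f} FPS/W")
gain = 100 * (oc_fps / stock_fps - 1)
extra = 100 * (oc_watts / stock_watts - 1)
print(f"{gain:.1f}% more performance for {extra:.1f}% more power")

With placeholders like these the overclock buys about 2% more performance for roughly 45% more power, which is why the stock sweet spot looks so much better.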

>2080 Ti
>52 FPS in the first room

RTX ON KEK

>1.5
I thought the 2070s was marketed at 500-520, which is in the 25-30% range, and the current market prices were adjusting for demand or accommodating the regular 2070 as it was phased out.

>[moving goalposts, the post]

>it uses so much more power
>*shown that it uses less than the competition*
>yeah, but so what if it uses less power?!

My bad, I thought you were saying the 5700xt is a housefire.
Either way, power usage in GPUs is irrelevant since most of them use so little.

I dunno how it was marketed, but the cheapest one on newegg was $585 last time I checked.

What's the fucking verdict?

Wait for custom fan 5700xt or go for 2070 Super?

Attached: 1559171597969.png (600x600, 464K)

Miners undervolt their cards, whereas cards used for gaming are overclocked and run for 10+ hours a day.

>when some objects not in strict focus they lose all shadows, I should've installed recording tools just for that
Yeah you should.

Nvidia is well known for doing scams like this, could be they are at it again.

Because it has the performance of the 2070 for the price of the 2060

5700XT AIBs are a week or two away. I'd at least wait until then to see how they do

If you want the best value right now, then custom fan 5700 / 5700 XT.
If you need to meet some performance goal that's higher than 5700 XT, then wait for big navi.
If you need to meet some performance goal that's higher than 5700 XT right now, then 2070S/2080S.

Most miners are not as smart as you seem to think of them and they generally don't care about such things. And most gamers don't overclock hardware either.

>Most miners are not as smart as you seem to think of them and they generally don't care about such things. And most gamers don't overclock hardware either.

I'm pretty sure most of the "miner cards are perfectly fine, goy! even better than cards used for gaming!" stuff is unironic shilling from buttcoin miners on Jow Forums trying to offload their shit

AIBs would bring it on par with or slightly above the 2070 Super, but I reckon it'd guzzle more power. A little disappointing considering the superior process, but it means that the 5800/5900 will actually go toe to toe with nVidia's offerings. The 5700/XT have a great VCE block and I think they can still be HPC monsters despite being midrange, so it'll be a more interesting argument once the drivers shape up.

Because this world is full of nvidia shills who are apple fanboi tier gay, and they gladly love getting fucked in the ass and paying $100 more for only 10% better performance.

There's a never-ending stream of disinformation about AMD GPUs, and people compare the 5700 XT to the 2060S on purpose to make people think they're comparable cards even though the 5700 XT is much faster.

it's weird how almost nobody compares numbers across multiple testers
heck, people make purchase decisions based on average "of x number of games" graphs instead of doing proper research

AMD has just a better offering suck it intel fanboy faggot

> benches
In userbenchmark, lol?
It should be a bit above the 2070, but knowing how abysmal the drivers are, I won't expect much. In Hashcat GPGPU tests, the 2060S is ahead of the 5700 XT.

Selling raytracing as a feature is bullshit. They did manage to speed it up 2x (according to Octane pathtracing benchmarks), but that's still 1/10th of what is required for real-time lighting. It's enough for limited reflections and fuzzy shadows, still worse quality and slower than approximate techniques.

The real technology to focus on is machine learning upscaling. It's the next pre-baked lighting. Record the game at 8K, scale it down to 2K, train a neural net to recognize the difference, and then when someone plays at 2K it understands what the frame is supposed to look like at full res and upscales it seamlessly. Nvidia's current implementation is still kind of blurry, but they claim they get a 50% performance improvement at 4K through it.

However AMD's Contrast Adaptive Sharpening seems to do a better job using a general algorithm, so I could be completely wrong.
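To make the train-a-network-to-upscale idea described above concrete, here's a toy sketch in PyTorch. It is not Nvidia's DLSS or AMD's CAS; random tensors stand in for captured high-res/low-res frame pairs, and the tiny network is only an illustration of the learn-to-reconstruct-the-downscaled-frame loop:

import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyUpscaler(nn.Module):
    # Learns to map a low-res frame to a 2x-larger one.
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3 * 4, 3, padding=1),  # 4 = 2x2 upscale factor
            nn.PixelShuffle(2),                  # rearrange channels into 2x resolution
        )

    def forward(self, x):
        return self.body(x)

model = TinyUpscaler()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(100):
    hi_res = torch.rand(4, 3, 128, 128)                # stand-in for captured high-res frames
    low_res = F.interpolate(hi_res, scale_factor=0.5)  # downscaled input the game would render
    pred = model(low_res)                              # network's guess at the full-res frame
    loss = F.mse_loss(pred, hi_res)                    # penalize what it failed to recover
    opt.zero_grad()
    loss.backward()
    opt.step()

Real systems differ in network architecture and in how frames are captured and paired, but the core idea described above (render low, learn to reconstruct high) is what this loop does.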

Who's making those AIB 5700 XTs?

everyone

>memetracing
>worth spending extra hundreds
no one that owns a meme tracing card will even turn it on to sacrifice fps for a token gimmick that he won't see in-game unless he specifically focuses on it

DELIT THIS. FAKE REVIEWS

$100 for ray tracing is too much since there's hardly any use for it in games. Meanwhile AMD's RIS is noticeably better than the DLSS garbage.

True, at best raytracing is just going to be something people enable for a couple seconds so they can take screenshots for upboats and reddit gold, then go back to playing the game at bearable framerates.
Sort of like back in the day, when people would mod Skyrim to the brim with a fuckton of texture packs and ENB shit that would make their game chug, but they'd only enable any of that just to take some screenies.

>Hashcat GPGPU
OpenCL hardly works on RDNA series at the moment; whoever is on driver detail for compute is AFK as far as we know. At the moment it's a gaymen and creator card and more of a paperweight for HPC.

>userbenchmark
Make up your mind, is userbench good or bad?

Show proof

Novideo shills don't even know shit about radeon experience so I'm not surprised they are unaware how shitty novideo drivers are.

Usually it's Sapphire, MSI, ASUS, Gigabyte, PowerColor, HIS, XFX, and I think ASRock also joined the GPU market. XFX has had the best track record in recent years.

As far as features are concerned, Radeon Image Sharpening is much more practical than the RTX meme.

DLSS is pure crap.

>clock the GPU beyond its viable peak perf/watt ratio, knowing it's intentionally bottlenecked by its limited CUs and ROPs
>anons, it uses so much power at this OC setting on a tiny mid-tier 251mm² die
>actually paying more than $300 for a die that should have been intended for the $200-300 market
Avoiding the stupidity of expecting more from these cards, can we all acknowledge how stupid you have to be to buy these fuckin cards? AMD is clearly not caring to compete against Nvidia anymore and wants to inflate the prices in the GPU market.

Fuck all of you, fuck AMD, and Fuckin Nvidia, also fuck intel, because kikes and shit.

Attached: 1359157709071s.jpg (114x124, 4K)

I will never understand how an AMDrone can be mad at Nvidia.

If it wasn't for Nvidia and their Super lineup, you would be paying $30-50 more for the 5700 and 5700 XT respectively. If it wasn't for Nvidia utterly dominating AMD at the top end of the market, AMD wouldn't be forced to carve out their niche in the low-end and mid-range market with the RX 570 and 580 at affordable prices. AMD's biggest customers are poorfags, and Nvidia forces AMD to be the perfect company for poorfags.

What the fuck are you even complaining about? First, the graphs clearly show that the card is clocked at the sweet spot, where dumping more power doesn't increase performance by a reasonable margin. Second, the card totally obliterates its competitor, which is the RTX 2060 Super, so it's nvidia who doesn't want to compete. You can't blame the winner for not competing, that's fucking retarded. Blame yourself and your stupid novideo mindshare. This is the future you've chosen. AMD will never lower prices anymore, because you and your fellow novidiots refused to buy AMD cards when they were super cheap and superior to novideo's garbage Fermi and Kepler, which almost led to AMD's bankruptcy. And now you blame AMD for not competing when they offer more for a fraction of the competitor's price yet again. Fuck nvidiots and fuck you in particular.

NVIDIA is the only brand that has literally set off a housefire.

Still no official release date?

what the actual fuck?
>housefire
my RX 570 runs about 5-10°C cooler than a GTX 1060 under normal loads
>garbage ass drivers
that's right, nvidia drivers are utter shit on linux, while amd's linux ones work better than the windows ones
this
only if you game less than 18hr a day (unironically some people do it, that's how power consumption charts get inflated)

I just switched from AMD to Nvidia (R9 290 to GTX 1070), and I'd say AMD has better drivers, straight up. Only because AMD has that Chill mode FPS capper while Nvidia has no equivalent. AMD's control panel is a fucking decade ahead of nvidia's. My GTX 1070 has the same ugly, dumb control panel as my old GeForce3 Ti 200.

Never had any compatibility problems or ran into games that worked better or only on one or the other. Both just work at their respective performance level without any problems.

There's a separate thread for it. Nvidia's counterpart is Fast Sync, and the control panel is irrelevant. The 1070 is shit though. I got the Micron memory version, which makes it even less stable; every driver update is a clusterfuck, even the driver installer crashes (that's why I don't update at all).
>Never had any compatibility problems
Try running something old like Arcanum, lol. Chances are your nvidia system will shit the bed, and you can forget about letterboxing; nvidia just can't do that. Radeon just worked for me though.