Is this for real

I am seriously thinking of buying this shit.
I only use Linux, so I can't and will never get a meme RTX Nvidia GPU.
How the fuck is this so fucking cheap?
Anyone have experience with MSI before?
I only buy ASRock, XFX, and sometimes Sapphire.

Attached: sales.png (1214x704, 247K)

Other urls found in this thread:

youtube.com/watch?v=uYOXk_znRhw
babeltechreviews.com/nvidia-forgotten-kepler-gtx-780-ti-vs-290x-revisited/4/
gamersnexus.net/guides/3446-nvidia-gtx-780-ti-revisit-in-2019-benchmarks
gamersnexus.net/guides/3430-amd-r9-290x-revisit-in-2019-vs-rx-590-rtx-gtx-more
youtube.com/watch?v=IYL07c74Jr4
youtu.be/H0L3OTZ13Os
youtube.com/watch?v=nIoZB-cnjc0

If you have an aircraft headset, sure.

enjoy your jet engine

I don't fucking care about it being loud as long as it's a decent card.

go for it then

Undervolt as heavily as possible. Aim for ~950 mV; it's possible to hold a steady 1,200-1,400 MHz at that voltage because of how badly overvolted Vega is from the factory. YMMV, of course.

Power consumption will also drop to close to 150 W.
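
For the Linux anons: below is a minimal sketch of doing that undervolt through amdgpu's sysfs overdrive interface. Treat everything in it as an assumption to check against your own setup: it presumes a Vega 56/64 showing up as card0, the OverDrive bit set in the amdgpu.ppfeaturemask kernel parameter, and root. The 1400 MHz / 950 mV / 945 MHz figures are just the ballpark targets from this thread, not values guaranteed stable on your silicon.

```python
#!/usr/bin/env python3
"""Minimal Vega undervolt sketch via amdgpu's overdrive sysfs file.

Assumptions: Vega 56/64 on the amdgpu driver as card0, OverDrive
enabled via amdgpu.ppfeaturemask, run as root. Values are examples."""

SYSFS = "/sys/class/drm/card0/device/pp_od_clk_voltage"

def od(cmd: str) -> None:
    # Each write stages one overdrive command; "c" commits the table.
    with open(SYSFS, "w") as f:
        f.write(cmd + "\n")

# Vega 10 exposes SCLK p-states 0-7; state 7 is the overvolted boost state.
od("s 7 1400 950")   # p-state 7 -> 1400 MHz core at 950 mV (example target)

# The top HBM2 p-state is 3; dropping its voltage helps power draw too.
od("m 3 945 950")    # 945 MHz HBM2 at 950 mV (example)

od("c")              # commit the staged values to the driver

# Dump the table back so you can see what the driver actually accepted.
with open(SYSFS) as f:
    print(f.read())
```

If it hangs under load, write "r" to the same file (or reboot) to restore the stock table, then walk the voltage back up in small steps.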

Undervolt, flash the Vega 64 BIOS, and you have a 1080 killer for under 300 euros.
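
For anyone who hasn't done the flash before, the flow is roughly the sketch below using amdvbflash (ATIFlash). The flag syntax here is an assumption from memory and varies between amdvbflash/atiflash builds, so check the tool's own help and its adapter listing first; the ROM filenames are hypothetical, and the 64 ROM has to match your card's HBM2 vendor (Samsung vs Hynix) or you'll get a black screen. Keep the backup.

```python
#!/usr/bin/env python3
"""Rough sketch of a Vega 56 -> 64 BIOS flash using amdvbflash.

Flag syntax is an assumption from memory and differs between
amdvbflash/atiflash versions; verify against the tool's own help.
Filenames are hypothetical. Flashing the wrong ROM can brick the card."""
import subprocess

ADAPTER = "0"                    # adapter index, check `amdvbflash -i`
BACKUP  = "vega56_stock.rom"     # hypothetical backup filename
NEW_ROM = "vega64_reference.rom" # must match your HBM2 memory vendor!

# 1. Save the stock 56 BIOS so you can always flash back.
subprocess.run(["amdvbflash", "-s", ADAPTER, BACKUP], check=True)

# 2. Program the 64 ROM onto the adapter.
subprocess.run(["amdvbflash", "-p", ADAPTER, NEW_ROM], check=True)

print("Flashed. Reboot for the new BIOS to take effect.")
```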

The V64 BIOS only OCs the HBM; the card is still shit.

Shit how?
With a little low-effort adjustment it matches the 2060 and costs less.

Nah, undervolting increases performance 10-20% by sustaining higher clocks for longer (or indefinitely), and driver updates have the STOCK Vega 56 at about 90% of a 2070 now.

OH NONONONONONO
AHAHAHAHAHAHAHA

Attached: Screenshot_2019-04-04-14-49-51.png (1280x720, 321K)

Best GPU for immersive flight simulation

Not too shabby in cowadoody either

Attached: Screenshot_2019-04-04-15-03-25.png (1280x720, 360K)

wtf

Attached: 1501764121976.jpg (640x640, 29K)

Where are the 2080 Ti and Titan RTX on this? Are they omitted because they utterly BTFO everything from AMD?

O "the non-faggot" P here,
you just cost me 262 jewros+ 3.90 jewros shipping cost.
I'm gonna call them tomorrow, which is less than 10 hours, in order to ask about the shipping date, because it says "order only", due to the Orthodox Easter going on here in 3 weeks.
No I won't ask them if the price is real, because the retail jews will get angry and will UP the price.
as for the noise, I only care about the original cooler. I honor those designs more than any aftermarket.
I have the 1900 xtx and the HD 4870 with the original coolers.

>MSRP $1200
Yeah man, because everyone is willing to fork over more than a grand just to play fart night and call of doody.

As long as you undervolt to some extent you'll be fine. The biggest gripe with blower coolers is that they tend to throttle peak card performance by about 30%. Undervolting recoups that loss, is all.

*Battlefield V at 1440p on ultra, at over 100 fps. Those just wanting to play Fortnite would get an RX 560 or whatever.

based

You can already get that with a Vega 56. With AA turned off you're in the 144 fps range.

t. v56 owner

Attached: 5654665564565.jpg (711x457, 52K)

Attached: 565464565.jpg (711x457, 51K)

>26 thousand euros for a graphics card
Are AMDrones really this delusional?

Attached: 56546655645615.jpg (711x457, 51K)

It truly is amazing how much driver updates can change. Vega was a fucking joke at launch; AMD may never recover from that.

Attached: radeon_VII.jpg (1920x1080, 186K)

Attached: 565466556456115.jpg (711x457, 47K)

Attached: 5654665564561157.jpg (711x457, 48K)

Attached: 56546655645611572.jpg (711x457, 49K)

Attached: 565466556456115723.jpg (711x457, 48K)

Attached: 5654665564561157234.jpg (711x457, 54K)

Attached: FC5_1080p.png (1328x1222, 57K)

Attached: ACO_1440p.png (1328x1223, 59K)

I don't get it. How can AMD perform so poorly in all these games but curb-stomp Nvidia in these:

Attached: 1554100085448.png (552x688, 358K)

Not an AMD fanboy, but all those games are shit anyway.

Fucking FUD.
Even my 390 was good enough for The Witcher 3 back then. Of course, if you're dumb enough to set
>HairWorks = anything
like Tom's Hardware does, you deserve to pay double.
Actually the bottleneck was the Sandy Bridge CPU at 4.5 GHz, and only in cities.

One thing people haven't realized yet is that Nvidia added support for running compute and graphics in parallel with the Volta architecture, and Turing continues that trend. That's why RTX cards now do so well compared to Pascal- and Maxwell-generation cards in titles with async compute support.

Attached: SB_1440p.png (1328x1223, 61K)

Because Tom's Hardware benches
>Ultra presets
and those are usually biased as fuck, especially in GimpWorks titles.

Yeah, it is weird. Some games AMD just curb-stomps, but then in a plethora of other games AMD gets curb-stomped right back. Especially indie titles. I play a lot of indie games and AMD is just garbage in them, especially ones built on UE4, like Conan Exiles. I couldn't even play Space Engineers anymore when I had my Vega 56; I had nothing but crashes. Since I moved to my 2080 it's been a night-and-day experience.

But hey, Nvidia has money to pay for driver developers and for game-developer support; AMD doesn't. I guess that's probably why.

Attached: MHW_1440pK.png (1328x1223, 57K)

>developers
inb4 Microsoft

Attached: Power.png (1328x1222, 62K)

Please never speak again. You make AMD fans look like clowns, and it isn't fair to them that raging zealots like you go out there and represent them.

Attached: Vega64.png (1324x1664, 72K)

Attached: 101923.png (678x400, 31K)

Attached: 101920.png (678x400, 31K)

Attached: 101914.png (678x400, 31K)

You've got a folder full of GPU bench results to dump in any thread, yet you call other people autistic.
Yikes.

What should my voltages and clocks look like on a Sapphire Nitro+ 580?

Also, MSI a shit, and AMD is better bang for your buck than Nvidia when controlling for ethics.

Alright OP, I know I posted benchmarks of the 2070, but the majority of them also showed the Vega 56. I wanted to make it obvious that, yes, there are some outliers where AMD just destroys, but overall Nvidia cards are usually going to be faster in the same tier. More importantly, the power draw is going to be far less. This alone should tell you why you should pick something else over the Vega 56. Personally I would just grab a 1660 Ti. The 1660 Ti and Vega 56 are pretty much on par with one another, but the 1660 Ti uses so little power; IIRC it uses less than the 1060.

If you want, I can start spamming 1660 Ti benchmarks.

Attached: 101911.png (678x400, 30K)

Like I said, please stop posting. You really do make AMD fans look even worse than they already are. You are a toxic parasite in their community that's killing AMD and its fan base.

Yeah keep spamming

Unlike this guy, I'm willing to admit when I've made a mistake. I confused the 1660 Ti with the Vega 56 when I should have said RX 590. The 2060 would be the one to grab over the 56.

Attached: 590.png (1324x1444, 92K)

And here is the overall 56 vs 1660 Ti comparison.

Attached: Vega56.png (1324x1444, 91K)

And here is the overall 2060 vs Vega 56.

Attached: Vega56-2060.png (1323x1665, 104K)

Protip: drivers affect performance, and most of the benchmarks spammed here were run on year-old drivers. Not that AMD isn't shit, but it's not that bad once you take updated drivers into account. Also, like the anon at the beginning said, Vega suffers from overvolting, which costs 10-20% performance that you gain back through undervolting.

youtube.com/watch?v=uYOXk_znRhw

Attached: 1539660713958.png (1329x1231, 706K)

power consumption of the 2060 vs vega 56

Attached: Power-2060.png (1327x1225, 74K)

>having to undervolt
>so heavily driver dependent
Why would you want a card whose performance breaks or improves with every driver update, and whose voltage you have to alter manually just to fix its stock behavior? Even after undervolting you still have a massive amount of power consumption.

And "year-old drivers"? RTX cards came out during the 2018 holiday season, the 2060 after that; five to six months tops. Vega has been out since 2017. 20-17. TWENTY-SEVENTEEN. If it takes AMD over two years to make stable drivers for their cards, then they are absolute horseshit.

You forgot one thing:
Novidia does not work on Linux.
No one guarantees that after any update Novidia will still run X.
I get nightmares every time I shut down my workstation at work after 3-4 weeks of uptime just to boot a fresh kernel, only to watch the fucking Quadro fall back to tty1.
Novidia has the shittiest ecosystem ever. If you run games tailored to their stack, it runs fine. If one bit goes wrong, everything is fucked.

>things that no one cares about
wew nice argument

housefire

this x4

>comes to the most autistic tech board on the internet
>posting benchmarks is a common practice here
>whyAreYouPostingBenchmarks.jpg

You're on Jow Forums calling tech autists autistic. Maybe you want to re-evaluate the situation.

t. AMD fanboi.

>Maybe you want to re-evaluate the situation.

>linux
>i'll take things no one cares about for 1000
Wow, you have to wait maybe a week for updated Nvidia drivers to support X, and that's on a bad week for Nvidia. The majority of the time Nvidia supports the latest X before the new X11 API even ships. On top of that, even though AMD's open-source Linux support is fantastic, Nvidia is ironically still faster. And G-Sync works splendidly well, if we're going to interject for a moment with things nobody cares about for 1000; far better than the open-source FreeSync support, which I think is still nonexistent unless you're running a release-candidate kernel, and which still lacks the finer touches of the Nvidia drivers, like an overlay in OpenGL games telling you when G-Sync is active.

With modern drivers you can do one-click undervolting, and in most cases that drops power consumption by ~20%, but manual undervolting can squeeze out even more efficiency.
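
If you want to verify that ~20% claim on your own card instead of taking anyone's word for it, here's a small sketch that samples the board power amdgpu reports through hwmon. Assumptions: Linux, the card as card0 on amdgpu, and a hwmon node exposing power1_average in microwatts. Run it once at stock and once after the undervolt, under the same steady load.

```python
#!/usr/bin/env python3
"""Sample GPU board power from amdgpu's hwmon to compare stock vs undervolt.

Assumes the card is card0 on the amdgpu driver; power1_average is
reported in microwatts. Keep the load identical between the two runs."""
import glob
import time

# amdgpu registers a hwmon device under the card; grab its power sensor.
matches = glob.glob("/sys/class/drm/card0/device/hwmon/hwmon*/power1_average")
if not matches:
    raise SystemExit("no power1_average sensor found; not amdgpu/card0?")
sensor = matches[0]

samples = []
for _ in range(30):                                 # ~30 s at 1 Hz
    with open(sensor) as f:
        samples.append(int(f.read()) / 1_000_000)   # microwatts -> watts
    time.sleep(1)

print(f"average board power: {sum(samples) / len(samples):.1f} W")
```

Two runs and a division gives you your actual efficiency gain, no benchmark-chart arguing required.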

My take on all this: AMD cards suck at launch but get better over time. AMD has maybe 10% of Nvidia's R&D budget, so of course the drivers are going to be shit on release. I personally use a GTX 1070, though, and don't consider upgrading to Vega worthwhile.

Attached: s080.png (371x353, 148K)

Agreed. I also flashed my Red Dragon 56 to a 64 BIOS to push about 10% faster memory clocks while otherwise running stock settings. I haven't even tuned the voltage or found out how far I can push it.
He's just pointing out that these sites don't go back and re-benchmark those older games, which could be running a touch better than two years ago when they were first benchmarked. It's not that the drivers were unstable. The 2060 could also be running better two years from now.
Also
>massive amount of power consumption
You don't know what massive power consumption is. I've had two-card setups pull more than 400 W by themselves in years past. Besides, even running stock, the system referenced in that chart is only using 100 W more than the 2060's. That's not completely insignificant as far as heat goes, but it's also not the end of the world.

>better over time
>two years later
Oh boy, after a two-year wait they finally match Nvidia cards from previous generations and lesser tiers of Nvidia's newer generation. Wow, great job AMD! Really makes my investment worthwhile!!!
>but but my Forza Horizon 4 results
But but my Nvidia results in 98% of the rest of the game market.
>undervolting
Wow, take a stock card that should run just fine out of the box, but nope, AMD has to fuck that up too.
>20% reduction!
Wow, so now it's 30% more power consumption than Nvidia instead of 50%. Give yourself a pat on the back, m8.

>don't listen to the benchmarks
>just wait two years™
>even though AMD is incompetent, trust us, after two years they get better!
>just undervolt! What, you wanted a card that didn't draw more voltage than needed? What, you think we're Nvidia?

Attached: 1548041743181.gif (124x128, 3K)

M8, I'm not praising AMD, but they're not as bad as you make them out to be. You're like the guy raging that somebody else took a job he never wanted in the first place.

And you're like the guy confronted with Nvidia's 3.5 GB scandal who denies it and says "no problem here, move along."

>don't listen to the benchmarks
Unless they are brand-new games, then yes, that's what I'm saying.

>brand-new games!?
>don't pay attention to the benchmarks!
>just be ignorant and live in self-induced delusions!
>it's all lies!
>just wait two years, then pay attention to the benchmark for that game, since it will take AMD two years to make stable drivers for it, which will break again in the next update a month later!
>also undervolt xDDDDD

Attached: 1521754598852.gif (358x328, 926K)

Go and stay gone.

Not really. Pic related was done in 2016, but its point still stands today. Taken from: babeltechreviews.com/nvidia-forgotten-kepler-gtx-780-ti-vs-290x-revisited/4/

From 2013 to 2016, a three-year span, the 290X and 780 Ti stayed neck and neck with each other. The only "FineWine" AMD produced was fixing broken things in their drivers in a few select games, plus the extra 1 GB of VRAM. More importantly, the driver "FineWine" just shows AMD is incompetent and needs to figure out how to work with developers; the VRAM should be the biggest FineWine factor of anything.

Attached: Big-Pic-fix.jpg (1746x2203, 1.2M)

Here's also Gamers Nexus looking at the 780 Ti and 290X in 2019 in modern games:
gamersnexus.net/guides/3446-nvidia-gtx-780-ti-revisit-in-2019-benchmarks

and

gamersnexus.net/guides/3430-amd-r9-290x-revisit-in-2019-vs-rx-590-rtx-gtx-more

And again, it really shows that the VRAM affected performance more than the drivers did.

>AA turned off
I sure love me some jaggies

Attached: QaYqtd7_d.jpg (640x723, 30K)

Does this also apply to the RX 580?
I was thinking about underclocking it.

>turn off AA
>turn off tessellation!
>it's Nvidia's fault for supporting developers and getting them to optimize for their cards, not AMD's for being incompetent and not supporting developers too!
>undervolt!
>don't pay attention to benchmarks, just censor yourself!
>wait two years™!
>wait for drivers, so after two years it can be fixed and then broken again a month later!
>wait for the 300 series!
>wait for Fury!
>wait for Polaris!
>wait for Vega!
>WAIT FOR NAVI!
>poor Volta XDDDDD
>just wait!™


>Tom's "just buy it" Hardware
fuck you and fuck your fake benchmarks

Fuck me
I meant undervolt

Yup, both Polaris and Vega were rushed, hot trash heaps on release. By the time people figured that out, all the 570/580 and 56/64 stocks ran dry from all the asscoin miners realizing how much money AMD cards made them at 1 V.

To be fair, the tessellation thing was true.

youtube.com/watch?v=IYL07c74Jr4

>buy 1080 Ti
>turn most of the settings all the way up
>over 100 fps
>praise team green

>get your performance decreased by 20% in driver updates
>praise team green

things that didn't happen for 1000

seething

Not Nvidia's fault that AMD didn't take tessellation seriously, even though they, as ATI, were the first to do anything with it. Nvidia didn't stop AMD from improving their geometry engine. It wasn't Nvidia that made AMD take until Polaris to vastly improve their geometry engine, and until Vega to get it to near parity with Nvidia.

The one time that did happen, it was a bug and it got fixed in the next driver version.

There was nothing wrong with AMD's tessellation tech; it performed optimally when implemented right. The fact that enabling it gave flat planks of wood absurd polygon counts is really suspicious when the same hit did not show up on Nvidia cards that also had tessellation. It was done on purpose to gimp AMD cards.

Imagine being this cucked by AMD.

Already bought one last week. I'm going to watercool it, of course.

>There was nothing wrong with AMD's tessellation tech; it performed optimally when implemented right.
>implemented right
Yeah, running tessellation at very low levels, because AMD didn't take it seriously and never thought developers would use it moderately, let alone heavily.
>The fact that enabling it gave flat planks of wood absurd polygon counts is really suspicious when the same hit did not show up on Nvidia cards that also had tessellation.
Again, not Nvidia's fault that AMD was incompetent enough to assume developers would share AMD's belief and never push it moderately or heavily. Not Nvidia's fault AMD was incompetent.
>It was done on purpose to gimp AMD cards.
You truly are delusional.

As I said before, and I'll say it again:
>it wasn't Nvidia that made AMD take until Polaris to vastly improve their geometry engine, and until Vega to get it to near parity with Nvidia.
Nvidia didn't make AMD decide to take five years to get serious about tessellation.

How much are they paying you anyway?

Attached: 1550425654772.png (1113x887, 528K)

>stocks ran dry from all the asscoin miners
/v/ is still ass-damaged about that, even though the sky-high prices were really RAM price-fixing by the memory suppliers. It honestly makes my day thinking about the butthurt that caused.

>It was done on purpose to gimp AMD cards.
I just want to focus on this bit from your rant. Really, dude, be honest with yourself: can you fault Nvidia for promoting a feature their cards run at fantastic performance? You know for a fact that not only would AMD do the same, THEY HAVE done it. Look at the recent async compute debacle: AMD marketed that hard and got the developers they were working with to adopt it, to rub it in Nvidia's face and use it against them. Back when AMD's GPU division was ATI, they were doing it in the mid-2000s, ironically with ATI's third-party tessellation API in games like Tomb Raider. Really, dude, give the conspiracy bullshit a break. AMD got caught off guard, AMD bet on the wrong path for whatever reason, and Nvidia was able to capitalize on it. Honestly, you can toss FreeSync in there too, since it made Nvidia throw in the towel and support adaptive sync, and thus FreeSync. Which ironically hurts AMD, as it's now one less reason to go to AMD; that one kind of backfired. But before Nvidia caved, AMD was marketing hard why it's better to go FreeSync over G-Sync.

How much is AMD paying you to shill? You are truly deluded if you think some mass conspiracy by Nvidia caused AMD not to adopt and focus on tessellation even at moderate levels. Only AMD can be blamed for their own fuckup, and to add insult to injury, it took them five years to even reach a near-level playing field with Nvidia in geometry performance.

Red pill on Nvidia: youtu.be/H0L3OTZ13Os

>jet plane tier noise
>actually using blower
why

>Cheap card
>Okay performance
>it vents heat out the back
Other than the noise, what's not to like?

And another: youtube.com/watch?v=nIoZB-cnjc0

Because
>literal-who benchmarks
and
>Dirt Rally, Dirt 4, Dirt Rally 2.0
which all favor AMD, and an exception is not something you buy a GPU over.

What I mostly hate is how they sell an inferior product and then it's up to the consumer to undervolt and fuck around with the GPU to get decent performance out of it while keeping it from running too hot.
And if it doesn't work, then it's "hey, you're taking the card out of spec, so we're not responsible for any performance you thought you'd get but didn't."
It's just so fucking annoying to sell lottery tickets in GPU form and have people hope their chip is good, and that after all the manual work it actually performs well.

>less efficient at cooling and runs hotter
>is much, MUCH louder, to the point of being inhumane
>what's the problem
I wouldn't wish a blower card on my worst enemy.