Who here uses both an AMD CPU and GPU?

Ryzen 7 1700 + RX 570 here.

Attached: amd-logo-59E15B5EB9-seeklogo.com.png (265x300, 12K)

Attached: Screenshot from 2018-05-16 10-45-37.png (1444x952, 137K)

I want a Vega 56 to replace the 1050 Ti in my 1500X system (I hear async compute really helps in certain titles, though I'll miss Nvidia's multithreaded OpenGL drivers), but prices have stayed stupidly high for what I think is the best card in AMD's entire product stack. 90% of the performance of the Vega 64 (or more with an undervolt, an overclock, and some luck) at 70% of the power draw? Sign me the fuck up, but not for more than 450 eurodollars.

>manjina

Why
>update a package
>the new dependency update gets forgotten, and the package isn't compatible with the old one
>breaks

never happened to me. what package?

Ryzen 5 1600 and RX 480 here.
Used a Phenom II X4 955 and a Radeon 4850 for ages before that.

2700X + RX 480
The ultimate Linux combo

Why would you do that?

Attached: Piriform Speccy.png (718x652, 117K)

1600x and R9 290

My god, this GPU is still relevant. How does AMD do it?

r5 1600 + r9 290

Attached: 980x.jpg (420x420, 31K)

Phenom II X6 1090T + RX 560 here

1700 + r9 290

Hawaii is eternal. And hot, very hot. I need to sort my cooling out since I moved into a mATX case.

The fun will start in late 2018. If Navi is as good as Vega at mining, prices will explode from day one.

1800X + Dual 290X

R5 1400 + RX560 here

This might be the best GPU ever made in terms of lifespan. I can run a ton of games at 4K, I play racing sims at 3x1080p, and the frame rates are damn fine to boot once you turn off GameWorks and the other useless shit.

Phenom II X4 955 + RX 460

FX-6300 and R9 270X. It does OK and keeps my feet warm.

FX-8350
HD 7850
feels bad to be poor

FX-8350 and HD7750 reporting.

Currently saving for a 7970 to achieve the ultimate 2012 AMD gaming PC.

Attached: IMG_0215.jpg (4032x3024, 2.31M)

2700x and a 580.

Attached: IMG_0218.jpg (4032x3024, 2.07M)

2700X and V56

Attached: 65464582345.png (1920x1080, 1.28M)

1600X + R9 290X (MLU BIOS)

Attached: DSC_0023.jpg (2992x1683, 1.39M)

1400 & 270x. Soon to be 290.

Currently have a 290X with an R5 1600. I'm planning on upgrading to a 1070 or 1070 Ti though, as I've been quite disappointed with my AMD GPU.

>Amd + amd
Hahahaha no thx

i'd get an AMD GPU if the prices weren't inflated so much just because they mine well

R3 1300x + RX550 4GB

It runs almost all games on highest settings at 1080p. Too bad Asus fucks up their budget cards; unless I replace the cooler, it runs like a leaf blower from boot until shutdown.

That's a terrible combo. Go with AMD + Nvidia and get the best overall performance per dollar on both sides.
No one should buy a fucking Vega for 3x the original MSRP, or go full retard on an Intel CPU to gain 5% more performance while spending double the CPU money.
Stop being fanboys and reach real enlightenment.

The 290X is still a very good card, what's wrong with it?

Not hating on it, but the TDP is kinda high (coincidentally 290 W). Still considering a 290X, 7970 GHz Edition, or a GTX 970 for my PC, depending on the money.

Attached: 1523433903969.jpg (1038x1000, 140K)

2400G + 580 8GB Nitro+ here
It..it feels right guise

Attached: 3.jpg (410x226, 34K)

Fury X (lol, I know) and 4790K here

2500K @ 4.7 GHz and R9 390

High power draw, loud, high temperatures, and I've had some issues with GPU drivers crashing my computer.

You sure it isn't your paste? I had that crashing issue with my 390; might want to check that out. Once I replaced my paste, no more crashing.

Never go full AMD. Bought a 2700X and now I'm waiting for the 1180. For my last build I got an FX-8320 and a 280X, and I'm never going full AMD fucking ever again.

Yeah, I'm pretty sure it was drivers. A couple of months ago I updated my drivers and was trying to use Eyefinity to spread a split-screen game over two monitors, but as soon as I hit apply my PC would crash. Had to boot into safe mode and install an older version of the drivers.

How well does the RX 400/500 series run on Linux compared to Windows? I was thinking of buying one of those, but from what I've experienced the R7 300 series runs like utter trash, performing 80% worse on any driver. Nvidia cards, on the other hand, run equally well on both systems. Did AMD fix their shit?

>AMD GPU
It's not 2012 anymore, OP.
I'm thinking of upgrading my 4770K to a 2700X, because I use a 1600X with a 1050 Ti at work and it's awesome.

mah niggas

Attached: speccy ryzen.png (879x604, 43K)

Ryzen 7 1700 + RX 470 here

Phenom II X4 955 BE with an HD 5770 here

What should I upgrade to?

Just get a Ryzen 6-core with a 1060.

Had a Phenom II X4 955 and HD 5770, then upgraded to an HD 7870 and FX-8350, and briefly replaced the HD 7870 with an R9 285.
Currently on a 1600X + RX 480. I regret nothing, apart from having had a motherboard that couldn't handle the FX-8350 properly when fully loaded.

A8-6600K + HD 6670

>62C

Attached: 1525270869545.jpg (2048x1536, 97K)

Huh, I have exactly the same.

Feels great and works great.

2700X + 2 x Fury X

Using an i7-3770K (running at 4.1 GHz, watercooled) and an R9 390. They actually go together quite well.

Cheaped out and got an i5-4590 when I could have gotten a 4690K. The R9 390 is great, and I'm glad I didn't go for the 970. I'll wait till next year to upgrade to Zen 2, but that 2700X is really tempting right now.

1700+580

Waiting on the 2400GE to hit, but I might grab (((intel's))) 8400T if it doesn't launch soon.

the GTX 1150 might be another option for dropping my temps further.

I've got a Ryzen 1600X with an RX 470 8GB connected to 3x 4K monitors on my desktop, and I've also got an HTPC with a 2400G and an RX 560. They seem to run just fine with the free drivers (haven't tried AMD's closed-source binary blob). I've never had Wintendo installed on either system, so I've got no idea how the free driver compares to their binary blob running on that.
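If you ever want to double-check which kernel driver is actually bound to each card (the free amdgpu/radeon drivers versus something else), a rough sketch like this works on a typical setup; the /sys layout and card numbering here are assumptions about a standard Linux box, not anything specific to my machines:

```python
#!/usr/bin/env python3
"""Rough sketch: report which kernel driver is bound to each GPU.

Assumes the usual Linux /sys/class/drm layout; card numbering and
paths vary per system, so treat this as illustrative only.
"""
import glob
import os

for card in sorted(glob.glob("/sys/class/drm/card[0-9]")):
    device = os.path.join(card, "device")         # symlink to the PCI device
    driver_link = os.path.join(device, "driver")  # symlink to the bound driver
    if os.path.islink(driver_link):
        driver = os.path.basename(os.path.realpath(driver_link))
    else:
        driver = "none"
    print(f"{os.path.basename(card)}: {driver}")  # e.g. "card0: amdgpu"
```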

1600X+7970
Still better performance than a new 1050Ti, 7 years later.

Will buy 7nm Vega if it comes to consumers with a 1/2 FP64 rate.
Bit upset at myself for not buying an RX 470 when you could get them for $100.

AMD only just this month finally fixed the 390 bugs in the Linux drivers.
Tahiti, Polaris, etc. were all fine. Vega, including Raven Ridge, seems to be fine now as well. Not sure why Hawaii had such poor Linux support for years.

Post processor charts. Going to build a new PC soon and I wanna see what I can work with. I thought there was some craze going on earlier about Ryzen beating Intel. Anyone know anything about that?

Please tell me how you were able to run a live USB on your 2400G box, nothing has been working.
2400G and 580 Nitro+ here

what's with all the x70s?
They're usually what, $20 cheaper than x80s, the same size, and the same temps?

Ryzen 5 1600 and HD 5870 here.

How hot does your 2400G get under load, and what's the cooler rated at?

Assuming the iGPU is disabled when the 580 is in.
There are absolutely no benchmarks online showing how it does as just a CPU by itself.

Huh?

Normally 60 to 70 percent of the cost for 80 to 90 percent of the performance. It's always been the case: Pro compared to XT, xx50 compared to xx70, non-X compared to X models, and now x70 compared to x80.
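As a back-of-the-envelope illustration of what that ratio means for value, here's a quick sketch that just reuses the rough percentages above; the baseline price is made up, only the ratios matter:

```python
# Back-of-the-envelope value comparison using the rough percentages above.
# The baseline price is a made-up example; only the ratios matter here.
x80_price, x80_perf = 100.0, 100.0   # baseline card, normalized
x70_price = x80_price * 0.65         # ~60-70% of the cost
x70_perf = x80_perf * 0.85           # ~80-90% of the performance

value_x80 = x80_perf / x80_price
value_x70 = x70_perf / x70_price
print(f"x70 perf per dollar is {value_x70 / value_x80:.2f}x the x80's")  # ~1.31x
```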

I haven't tested it like that in Windows yet. I'm assuming the iGPU is off in Win 10 because I have the Adrenalin drivers for the 580 installed.
I have a Corsair H80i V1 whose intake is right in front of an A/C vent, so thermals shouldn't be an issue.
My problem is that I haven't been able to get a live USB of KDE Neon, MX Linux, or PCLOS to work, and I'm about to try Tumbleweed Live, Salix, and Clover OS to see if any of them do.
My guess is that when they boot they see the iGPU and freak the fuck out if they're not on at least Linux kernel 4.16(?).

*haven't gotten a live usb to work yet

>the top RX 580 8GB should only cost 259 max
>can't even buy any 4GB model for that much

Attached: 1506180747698.png (1000x1000, 47K)

Usually the BIOS will automatically disable your iGPU if it detects a dedicated GPU.

Then what's causing it to freak out? Pic related is from when I tried to run a live version of Mint.

Attached: IMG_20180515_101105.jpg (4096x2304, 1.51M)

2700X and RX 580

Attached: amada12.jpg (512x483, 222K)

Attached: file.png (588x417, 41K)

I bought a Vega 64 thinking HBM2 would have lower power draw than GDDR5. It's a 300-watt card, what is AMD doing?

The Vega 64 is the black sheep of the current generation.

Main HTPC:
Athlon X4 845 + RX 460

Dev desktop:
Phenom X4 9650 + HD 7750

Seedbox:
A8-3800 (do APUs count?)

Before I get called a fanboy: I don't have anything against Intel from a tech perspective, it's just their pricing, even used. I had a number of bad Nvidia cards in the past, so no thanks.

R5 1600x + RX 580

R5 1600X and Vega 56 here.

I stick with Speccy prior to v1.3 because of pic related.
>AMD Processor AMD Processor AMD Processor AMD Processor AMD Processor AMD Processor AMD Processor ...

Attached: AMD Processor...............jpg (752x706, 114K)

FX-6300 and R7 260X forever

AMD FX forever desu.

Attached: 8350.png (454x443, 49K)

First of all, don't disable the internal graphics on the 2400G in the BIOS. I have no idea why, but with my ASUS motherboard the IOMMU on the CPU gets disabled, which results in problems booting up. Secondly, you're right about the 2400G needing a pretty darn new kernel. Fedora 28 has one that's new enough on its live CD. I haven't tried installing anything else since I use Fedora and that... worked.

Which Asus board do you have? Are you suggesting I take out my 580 and try to get it to boot that way? If you read the pic, I've got the TUF mATX board. Where in the UEFI do I enable/disable the iGPU, and how do you make sure IOMMU isn't disabled?

B350M Prime, it's essentially the exact same board you have minus some LED lights and a fan header. Uses the same BIOS. I can tell you with absolute certainty that kernel 4.10 (your screenshot) won't work with the 2400G no matter what you do. The first kernel that somewhat works is 4.15. The first kernel that doesn't have real issues (as in, actually works) is 4.16. I realize this is pretty darn limiting; your options would be to install on some other box and upgrade the distro (hopefully to 4.16), or use a live CD with a new enough kernel.
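If you want to sanity-check a live image before fighting with it, a quick sketch like this reports whether the running kernel is at least 4.16; that cutoff is just the figure from this post, treated as a rule of thumb rather than an official requirement:

```python
#!/usr/bin/env python3
"""Quick check: is the running kernel new enough for Raven Ridge graphics?

4.16 is the minimum suggested above for the 2400G's iGPU; treat it as a
rule of thumb, not an official requirement.
"""
import platform

MIN_KERNEL = (4, 16)

release = platform.release()  # e.g. "4.15.0-20-generic"
major, minor = (int(x) for x in release.split(".")[:2])

if (major, minor) >= MIN_KERNEL:
    print(f"Kernel {release}: should be OK for the 2400G's iGPU")
else:
    print(f"Kernel {release}: too old, expect amdgpu problems on the 2400G")
```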

I'm thinking it's gonna come down to using a live distro with a new enough kernel.
It's been fucking crazy. I thought PCLOS was going to make it through because it got to the splash screen, but then it hung there. That shit with Mint was the only one that actually showed a kernel panic, and everything else was freaking out on something related, but I can't remember exactly what.

why would you have a better gpu on your HTPC compared to the main machine?

poor

Attached: 1491349230554.jpg (653x726, 115K)

>TFW this CPU goes for 150 € used
>Proper AM3 boards will soon be more expensive

Attached: 1700.jpg (611x606, 66K)

>this CPU
You talking about the 8350? Cause that's like a $109 CPU new in box. The 990FX board situation is indeed terrible; I got my 990FX board for a little over $100, but there are people wanting $300 for these things new. They need to have their asses kicked for that.

Attached: IMG_0253.jpg (4032x3024, 2.06M)

No sane person would buy an FX processor now.
I paid 250 € for a new 1700, which is a good deal, as it runs undervolted with higher clocks on a 60 € board.

Get some used Samsung RAM to OC, 20 € per 4 GB stick; I use my old Crucial kit here.
This is a monster HTPC that wrecks even Haswell-E HEDT.

When AMD GPUs work on Linux, I will use them.

i5-3570K @ 4.5 GHz + dual 1050 Tis here
how are you enjoying your subpar technology, computerlets?

they already do, you dumbfuck

not really.

I like it quite a lot, thanks for asking

are you literally me?

Attached: sbegy.png (688x544, 44K)

AMD, for the money, tends to have 25-50% greater multithreaded performance while being only 5-12% worse single-threaded.

And all their CPUs are unlocked, so you can get a $90 unlocked 4-core and overclock it instead of getting a locked 4-core for $110 from Intel.

And their platform support is better. You don't have to change motherboards just to upgrade the CPU every year.

Perf/watt of the Vega 64 is about the same as a GTX 1080 if you undervolt it.
The V64 cores draw more power because of half-precision support and some other things.
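Undervolting proper on the open driver usually goes through amdgpu's pp_od_clk_voltage OverDrive interface; as a simpler, rough illustration of power tuning, here's a sketch that just reads the board power cap amdgpu exposes through hwmon. The card0 path and single-hwmon assumption describe a typical one-GPU box, and actually lowering the cap needs root:

```python
#!/usr/bin/env python3
"""Rough sketch: read the amdgpu board power cap via hwmon.

Assumes a single GPU at card0 with the amdgpu driver loaded; the hwmon
index and values vary per card, so this is illustrative only.
"""
import glob

hwmon_dirs = glob.glob("/sys/class/drm/card0/device/hwmon/hwmon*")
if not hwmon_dirs:
    raise SystemExit("no hwmon directory found; is amdgpu loaded?")

hwmon = hwmon_dirs[0]

def read_watts(name):
    # amdgpu reports these power values in microwatts
    with open(f"{hwmon}/{name}") as f:
        return int(f.read()) / 1_000_000

print(f"current cap: {read_watts('power1_cap'):.0f} W "
      f"(max {read_watts('power1_cap_max'):.0f} W)")
# Lowering the cap (as root) would look like:
#   echo 180000000 > /sys/class/drm/card0/device/hwmon/hwmonN/power1_cap
```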

u r homosex

Attached: d.png (1920x1080, 773K)

FX 6100
HD 7950

It's set up as a remote desktop for doing computational science and also for CAD on the go; it dual-boots Gentoo and has Win10 and OpenBSD VMs.

Kind of want to upgrade to Zen 2 after finishing grad school, keeping my graphics card for the time being. Too bad GCN 1 has garbage OpenCL support on Linux.

>purchase Vega 56 for $400
>mining craze starts
>sell for $900
>no one figured out mining on Frontier Edition
>buy for $700
>prices rise to $1000 weeks later
>still cheaper than $1200 Vega 64

Attached: userbenchmark02.jpg (1086x1964, 456K)

you won the thread, I guess.