/pcbg/ - PC Building General

>Assemble a part list
pcpartpicker.com/
>How to assemble a PC
youtube.com/watch?v=69WFt6_dF8g

Want help?
>State budget & CURRENCY
>Post at least some attempt at a parts list
>List your uses, e.g. Gaming, Video Editing, VM Work
>For monitors, include purpose (e.g., photoediting, gaming) and graphics card pairing (if applicable)

CPUs based on current pricing:
>Athlon 200GE - HTPC, web browsing, bare minimum gaming
>R3 2200G - Light 30-60fps gaming (dGPU optional)
>R5 1600 - $80 at Microcenter
>R5 2600 - Good 60fps+ gaming CPU; great value
>For extreme performance in gaming, rendering, and/or productivity, wait for benchmarks on 7/7

>Intel CPUs are now defunct. Even used i7 workstations are no longer worthwhile due to vulnerabilities and related performance regression

RAM:
>Do not use a single DIMM; two sticks are the standard recommendation for a typical dual-channel CPU
>CPUs benefit from fast RAM; 3200CL16 minimum (quick latency math below)
>AMD B & X chipsets and Intel Z chipsets support XMP
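Quick latency math if you're comparing kits (simple first-word-latency formula; example numbers only):

# First-word latency in nanoseconds = CL * 2000 / transfer rate (MT/s)
def latency_ns(cl, mt_s):
    return cl * 2000 / mt_s

print(latency_ns(16, 3200))  # 10.0 ns
print(latency_ns(15, 3000))  # 10.0 ns -- roughly the same real latency
print(latency_ns(16, 2666))  # ~12.0 ns -- noticeably slower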

Graphics cards based on current pricing:
>Used cards can be had for a steal; inquire about warranty
1080p
>RX 570/580 8GB - Can be found on sale for cheap. Look for 570s which are >1240MHz boost
>GTX 1660 - higher fps or more demanding games
>Vega56 @ $300 / 2060 @ $320 or equivalent - high framerates (requires complementary CPU and monitor)
1440p
>Vega56 @ $300 / 2060 @ $320 or equivalent - 60-100fps
>Wait for benchmarks on 7/7
2160p (4K)
>Wait for benchmarks on 7/7
>RTX 2080Ti - best for 4K but expensive

General:
>PLAN YOUR BUILD AROUND YOUR MONITOR IF GAMING imgur.com/a/RTbKAxD
>Don't buy a new monitor for gaming unless it's high refresh with adaptive sync
>A 256GB or larger SSD is almost mandatory; consider m.2 form factor
>Bottleneck checkers are worthless
>Don't use Speccy

rentry.co/pcbg-more

Attached: AMD-Ryzen-3900X-streaming-performance.png (1904x1071, 1.19M)

is there any reason at all to get X570 over X470 if i don't give a shit about PCIe4?

is the 5700/XT worth it now

nope

what if i want to use PBO on my coming 3900X?

Just wondering, how loud are Ryzen stock coolers, and is there a point in buying an aftermarket cooler (even if just for noise) that uses the same form factor (like a puck) and is better than the stock one, instead of those bigger coolers like the CM 212 and similar?
I've owned a 212 EVO for 4-5 years now, swapped the single fan for 2 140mm case fans, and it's pretty fantastic for what I use it for, but it's a pain in the ass to remove and an even bigger pain to do anything inside the case while it's mounted, shit cuts like knives

any micro atx b450s that can bios flash without a cpu?

X570 makes sense if you were already paying a lot of money. The mid-to-high-end boards probably have the best build quality of any consumer motherboards because of the PCIe 4.0 signal integrity requirements.
X470 has a place, but it's pretty small when you exclude boards so expensive you should get X570 and boards that don't offer anything that the B450 line doesn't already.
PBO exists on the 400 series boards and should support the 3000 series CPUs. At best there will be some options that you couldn't set on X470, like extending frequency range. However, if you are buying X570 it is probably still much better to manually overclock.
The Stealth is noisy unless you have a minimal heat load. The better coolers are tolerable, even with some overclocking or PBO.

Attached: IMG_20190703_021646.jpg (706x1000, 74K)

youtu.be/ryTTiZLIkrk

I saw some forum posts with that same glitch when loading games, and they fixed it by installing an older video driver.
I just tried the first driver that I installed and it didn't fix it.
I also saw posts where one dude ended up ruling out the motherboard, and for another one the techs said the CPU had a bent pin. Could this be because of the thermal paste that fell on my CPU socket? At first it was only happening with an iGPU overclock and went away when I returned to stock; now it remains even at stock, and it requires more voltage to run at stock clocks. Also, at first Windows notifications would pop up over GTA looking blurry with some pink lines; now they look all fucked up with pink and white squares. All this on a computer that's not even a month old. Could I really have damaged my SoC VRM or the iGPU?

What about B450? I thought PBO was a CPU feature, not a motherboard feature so as long as the VRM doesn't heat up too quickly it can increase the voltages by itself.

well, at least your game works prieto-san

Will Navi support Windows 7 or do I need to go with Nvidia for that?

PBO relies on a lot of factors and is more complicated than it should be.
The motherboard has to support it, ruling out 300-series chipsets. From there, board designers set their own baked-in targets, leading some boards to perform better at stock than others, and when PBO is used you can get mixed results that make less sense than power capability or temperature alone would lead you to believe. I don't know specifically what X570 will offer, but I think there will probably be a few chipset changes that make it slightly better.
B450 is good, but not for 12 cores. I would run 8 cores at most on even the best ones, since that was all they could ever have been designed for.

my guess would be a windows bug

Not him, but reddit has had that VRM power tier list across existing x50/x70 boards, with estimated stock and OC support of 3000s.

Not to shit on you, but why is your word better?
If the power delivery is there, it's there, isn't it?

Is used optiplex +

Windows 7 had a good life, but it's time to find something else. Even if there is support, you will run into shitty API limitations
B450s do not have the phase count to support 12 core overclocking. PBO would be ineffective, and I'm starting to question if you have any reasonable use for a 3900X.

upgrading from a 2500k to a 3600 or 3600x most likely, is there any reason why I should bother with x570?

What the fuck

Attached: uY7bUtz.png (959x995, 105K)

>3.6ghz 9900k
>2666 ram
shittiest benchmark I have seen in a while

Because of dumb mATX shills in these threads my case doesn't have enough back plate slots to fit two double-wide graphics cards. Do they make mATX cases that could work for this or should I just get a full ATX case, and then should I just get an ATX mobo also?

Mad at you small form factor faggots for recommending me this limiting shit. Would I see any benefit from an X570 mobo if I have a B450M with a Ryzen 5 2600 now?

Is there any chance RTX 2060 is going to reduce in price with the SUPER making it useless? Aside from maybe a bunch of RTX 2060s showing up on used markets.

Mortar

Is there a way to actually test claims that X company does better drivers?

source faggot
also
>3200 MHz vs 2667 MHz
pure distilled faggotry.

retard, should've done your research
mATX will always be the best mobo form factor
why's the 9900 on 2666 RAM lol

So what is the minimum phase count for more than 8 cores?
5 phases? more than one power stage?
I'm trying to figure out if you have any real experience or knowledge with this.

The most you would want is a mid tier X470.
Can we please move past 1280x720? We get it, they are basically even.
It just means they didn't overclock, since turbo gets it to 5GHz anyway.

I know, why does Intel only support 2666MHz in this day and age? Faggotry indeed.

1070 will easily run 144 Hz if you're not autistic about "muh ultra" placebo settings.

I want to buy this monitor
newegg.com/p/N82E16824025938

It's 1440p 144Hz. Can my 1070 run that, and if not, what GPU will?

Thanks

>Needed 2 gpus but got mATX anyway
>Didn't do your own research when putting parts together
wtf are you doing.

Define Mini C has 5 expansion bay slots but it probably won't work for you because you didn't say jack fucking shit about anything else you require in your snowflake build.

>amd actually lowered the price of the cards
based

Attached: 1528995710560.png (527x531, 215K)

Intel is shit. Everyone knows this.

Yeah 720p is kinda retarded and not realistic, I was more curious about 1080p

Are any of the new AMD GPUs worth getting? I got a 1080ti.

Seems like the rx 5700 is getting a price cut so maybe

i.imgur.com/Els9aDX.jpg
Archived. Other krauts are posting more images and details in der förüms.

720p is the res that every intel and nvidia shill brings up when talking shit about pre-ryzen game engines.

How is it based? It means they would have literally scammed you if it weren't for Nvidia releasing the new cards

>3700x dabbing on 9900k in muh 720 benches
OOOOOF

Did it get deleted from their website? I just checked and couldn't find it. I want to see the individual benchmarks, especially for KCD.

Which intel CPU would you pair with a 2070 Super?

9600k/9700k/9900k are still better than the more expensive zen2 equivalent so going with any of these is fine.

Can't find the KCD one, just
>KC: Deliverance 9900K 9% slower than 3700X

>NEW!: 720p Gaming benchmarks don't matter!

big if true, post source plz. i will suck your dick for it.

Hello fellow gamers, I would like to know your average Vega 56 experience when used at medium settings for FPS games at 1080p on a high refresh monitor. If I want 144fps at almost all times and am willing to go to low settings for the sake of a stable framerate, would it be a worthwhile purchase? Overkill?
Ultra settings are notorious garbage. I usually play Path of Exile, Warframe, Minecraft, BF4, Empyrion, spengies and Rust.
My current rig with upgrade plans:
R5 1600 (upgrade to 3600X)
MSI RX 480 4GB (maybe Navi)
8GB 3200 RAM @ 3000 for maximum stability (haven't had a single crash in any game since this tweak, no perf regression so far, not upgrading ever)
I'm waiting for Navi, might get the normal 5700 or, if it's worth it, the XT. I won't buy an nshittya product after my 3 failed RMA 1060s ever again.
All I do is play games with this; I have a dual socket Xeon server I outsource compile work to.

Attached: 1501762949312.jpg (800x800, 47K)

forum-3dcenter.org/vbulletin/showpost.php?p=12039828&postcount=8680
This is the source of that comment.
Here's the dump of what was saved.
imgur.com/a/YkoOCgM#Els9aDX

It depends on how good the MOSFETs are. More phases is good, but having less is possible if you double up per phase or use really good parts.
None of this exists on B450, I think at best you could get a cheaply made 6-phase that isn't viable. The best bets for 12 cores are the doubled 5 and 6 phases on X470, unless you go X570.
Pentium G5400
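For rough scale on the phase count point, a purely illustrative estimate (assumed voltage, not a measurement):

# Per-power-stage current for an assumed 12-core load
ppt_w = 142      # package power target commonly quoted for 105W TDP Ryzen parts
vcore = 1.3      # assumed load voltage
total_amps = ppt_w / vcore           # ~109 A total
for stages in (4, 6, 12):            # bare 4-phase vs 6-phase vs doubled 6-phase
    print(stages, "stages:", round(total_amps / stages, 1), "A each")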

Is crossfiring two Radeon VII worth it?

>all 720p benchmarks
into the dumpster it goes. I don't give a shit about "muh pure CPU power by removing bottleneck", I just want to see 1440p benchmarks so that I can decide whether to go 9900K or 3900X, fuck these idiots. I will just wait two more days for professional reviews and not this fanboy-fueled 720p garbage.

>no image saved

Attached: 1562335168189.jpg (640x656, 43K)

are they still in production? I'm seeing that the gaming plus, mortar and bazooka plus all have bios flashback but I'm only seeing the gaming plus on merchant sites.
Is that good to go with or does it have some niggles?

Almost all game engines predate Ryzen. Cherrypick much? Anyway 720p helps a user to see the maximum framerates he can achieve if he turns down settings.
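Back-of-the-envelope model of why that works (simplified; the frame times are made up and real frame pacing is messier):

# Each frame roughly takes max(CPU time, GPU time); shrinking the GPU side
# by dropping resolution exposes the CPU-bound ceiling.
def fps(cpu_ms, gpu_ms):
    return 1000 / max(cpu_ms, gpu_ms)

print(fps(cpu_ms=4.0, gpu_ms=12.0))  # 1440p: ~83 fps, GPU-bound, CPUs look equal
print(fps(cpu_ms=4.0, gpu_ms=2.0))   # 720p: 250 fps, now the CPU sets the limit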

Not if you have a 1080Ti

Your 1070 can use Freesync, better not to get locked into the Gsync ecosystem. Also Freesync is way cheaper. Your 1070 is enough for a 1440p monitor but you might want to upgrade if you're not getting the performance you want

No, Crossfire is dead, and SLI is dead. My friend has two 1070s in SLI and performance is shit in all games he's played, so he had to take the other one out. I had two 480s and it was a stutterfest for anything.

Coming from a 1050ti, how good is the 5700 non XT?

Attached: ohyes.jpg (320x320, 23K)

Crossfire is good for benchmarks and not much else
That would probably show no difference between any CPUs. If you're using 1440p, you are mostly GPU limited. Neither core count or frequency would likely matter past a 3600.

At the time I just needed a Linux desktop for browsing and bullshitting, but I decided I'd like a Win10 VM with GPU passthrough. If only you had recommended something with some damn upgradeability instead of what's cheap and convenient and fits in your cuck corners.
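If you do end up doing passthrough, a quick sanity check of the IOMMU grouping (sketch only; assumes Linux with IOMMU enabled in the BIOS and kernel):

# List IOMMU groups and the PCI devices in each one;
# ideally the GPU sits in its own group for clean passthrough.
from pathlib import Path

groups = Path("/sys/kernel/iommu_groups")
for group in sorted(groups.iterdir(), key=lambda p: int(p.name)):
    devices = [d.name for d in (group / "devices").iterdir()]
    print(f"group {group.name}: {', '.join(devices)}")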

Sold out, still in production mostly. It's insane, at the local Microcenter I've seen literally all the MSI flashback+ boards disappear in a week.
Refer to the chart, you will have issues with more than 8-core CPUs. I wish this was in the OP.

Attached: dzbx9fdkxv731.png (1139x5016, 745K)

>Anyway 720p helps a user to see the maximum framerates he can achieve if he turns down settings.
Yeah at this point you get 304 fps instead of 302 fps wow

>not doing your own research of the recommendations

It isn't our fault that you're stupid
Stop asking for bad advice, there is no reason to buy x570 or probably anything else you're going to throw money at

>not giving due consideration to future needs when making recommendations
mATX is the worst shit in the world and I will make it my duty to fight it wherever it reveals its evil.

When talking about stuff like Dunia, the poorly optimized ancient tech that Ubisoft never bothers to update for newer hardware: it favors 14nm Intel processors because the same arch existed in 2012.

You would probably get 60fps at 1440p on everything; the 1050 Ti can't even get 60fps at 1080p, so it's a pretty big upgrade, but wait for benchmarks.

>blindly buying something because someone on a cambodian basket weaving forum tells you to

Attached: pwB7u2R.png (972x996, 108K)

You would probably also want to buy a new monitor

Attached: QRE4Otg.png (960x985, 101K)

Attached: U6dJxvm.png (947x975, 120K)

I don't get it, if you had the 3200 mhz ram kit right there why wouldn't you use it with both CPUs? why risk throwing all your work into the trash?

>3.8ghz
Into the trash it goes.

Officially supported settings presumably. Profiles exist for zen2 but not for coffee-lake.

ahem...you called?

Attached: 1559067841374.jpg (700x5000, 1.83M)

i really hope you're right user. I am suffering with my 4670k and it stutters like hell. I want something decent to drive my 165Hz monitor and Zen 2 is the promised boi.

>3900X competes with $1500+ CPUs

Attached: 1560227473323.jpg (725x350, 107K)

Attached: z42jIzc.png (956x990, 104K)

>3900X again beats 7980XE in single and multi
lol

Because they test with official specs. Both CPUs go well beyond what is officially supported; you should be running Zen 2 at 3733MHz for maximum performance, even more for Intel.
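The 3733 figure comes from the 1:1 fabric coupling (rough sketch based on AMD's launch guidance; exact headroom varies per chip):

# DDR4 transfer rate is double the actual memory clock; Zen 2's Infinity Fabric
# (FCLK) runs 1:1 with the memory clock up to roughly this point, then drops to 2:1.
ddr_mt_s = 3733
mclk_mhz = ddr_mt_s / 2    # ~1866 MHz memory clock
fclk_mhz = mclk_mhz        # 1:1 coupling target
print(mclk_mhz, fclk_mhz)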

Quad cores are known to have that issue
Going from 4/4 to 6/12 is a world of difference, but 6/12 to 12/24 doesn't mean as much, since no game can utilize all of that. Even if the code was written perfectly, you would need to run multiple video cards to appreciate it.
The most I would seriously recommend for gaming is 8 cores, but the extra cost isn't entirely justified.
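Amdahl's-law style illustration of the diminishing returns (the 60% parallel fraction is an assumption picked just for the example):

# Speedup = 1 / ((1 - p) + p / n) for parallel fraction p and n cores
def speedup(p, cores):
    return 1 / ((1 - p) + p / cores)

for cores in (4, 6, 12):
    print(cores, "cores:", round(speedup(0.6, cores), 2), "x")
# 4 -> 1.82x, 6 -> 2.0x, 12 -> 2.22x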

Probably about 4x as powerful. The 5700 is not a 1080p card

Doesn't change the fact that Dunia still gets used. That's like saying, wow, PUBG runs like ass, better not test it! Plenty of people are still playing PUBG, and a whole bunch of people are going to play future FC games. Clearly Dunia should not be used as a forecaster for future expected performance, but that's a bit up in the air anyway, considering no one knows just how Zen consoles are going to affect PC requirements / achievable performance (aka devs targeting 30fps with a 3700X or optimizing extremely poorly)

just look for benchmarks of the 2600x vs 2700x, unless the game uses the extra cores it wouldn't be much of a difference

Why is 9900k there twice, and in one case has 196fps minimums and another case has 220fps minimums?

But yeah it was obvious from the start that people will still be able to find games to cherrypick for Intel.

Yeah, I don't know why KCD isn't a common test when it is *the* most demanding game.

Around 3x the performance.

Only if the programs you use will actually scale up 2x with it. DaVinci Resolve does not.
Or do you mean for gaming? No, that's retarded.

N/A.

I also want to run background programs as well. That is why I am planning to get the 12-core 3900X, but it is likely that I will change my mind after the 3rd party benchmarks are out. A 300 euro difference between the 3600 and the 3900X is a big deal to me.

thank you so much
I'm planning on a 3600 so the gaming plus seems like it should be fine
I'm not ordering until october so the others may be in stock by then for me to review my options anyways

Want to buy an RX 580; the Nitro costs just 10 euros more than the Pulse. Should I go in or buy 3 packs of Carte d'Or ice cream instead?

>Why is 9900k there twice, and in one case has 196fps minimums and another case has 220fps minimums?
I'm guessing a 95W limit vs no power limit, based on the description
The minimums could be a statistical anomaly that should have been thrown out

Connected my new Vega 64 to the mobo and it looks like it doesn't recognize it. The power supply is a Tacens Mars Gaming 800W 80 Plus, so I don't think that's the problem. Any suggestions /b/ros?

>shills Intel CPUs, saying RAM speed doesn't matter and 2666 is all you need
>says it matters when benchmarkers don't run OC RAM speeds

Attached: (you)2.webm (853x480, 1.43M)

You're about to be the next guy to fry his motherboard. Return the V64 and get a decent card instead.

Update your bios

>I also want to run background programs
Like, rendering a video while gaming? That's about the only use for 12 cores. I'm very firm about 16 threads being the absolute most you should buy for games these days. Anything more is an absolute waste of money and would overclock substantially worse for the same cooling power.

95W limit and unlocked. Most boards have it boost to ~5GHz but then it sucks down like 200 W under load

>but 6/12 to 12/24 doesn't mean as much, since no game can utilize all of that
Kingdom Come: Deliverance and Star Citizen can.
But yes, 6c/12t is more than enough for 99%+ of games.

Did you plug the power connectors in to the graphics card?

>(you)2.webm
I guess there is a (you)1, pls post

if you are not willing to share then I would get the Nitro

Not once have I ever shilled for any company nor have I said 2666 is all you need because 3200 is the king frequency
I'm just curious as to why the 9900K was on 2666
benchmarks should always have identical hardware

Why is there no 3600/x ?

Attached: 1538595730147.jpg (750x747, 235K)

Which is why the 720p test occurred in the first place, to account for the shitty engine optimization when you take off the GPU crutches.
I think we both agree that it has to happen in a comprehensive benchmark, as unrealistic as it is.