/pcbg/ - PC Building General

>Assemble a part list
pcpartpicker.com/
>Example gaming builds and monitor suggestions; click on blue titles to see notes
pcpartpicker.com/user/pcbg/saved/
>How to assemble a PC
youtube.com/watch?v=69WFt6_dF8g

Want help?
>State the budget & CURRENCY
>Post at least some attempt at a parts list
>List your uses, e.g. Gaming, Video Editing, VM Work
>For monitors, include purpose (e.g., photoediting, gaming) and graphics card pairing (if applicable)

CPUs based on current pricing:
>Athlon 200GE - HTPC, web browsing, bare minimum gaming (can be OC'd on most mobos with the right BIOS)
>R3 2200G - Recommended minimum gaming
>R5 2600/X - Great gaming or multithreaded use CPUs
>i7 8700/K or i7 9700K - Extreme setup for absolute max FPS
>R7 2700/X - VM Work / Streaming / Video editing

RAM:
>Always choose at least a two stick kit; 2x 8GB is recommended
>CPUs benefit from high speed RAM; 3000CL15 or 3200CL16 is ideal
>AMD B and X chipsets and Intel Z chipsets support XMP

Graphics cards based on current pricing:
>Used cards can be had for a steal; inquire about warranty
1080p
>RX 570 8GB - good performance with great value
>GTX 1660 - standard
>RTX 2060 - very high framerates (requires complementary CPU and monitor)
1440p
>RTX 2060 - standard
>RTX 2080 - very high framerates (requires complementary CPU and monitor)
2160p (4K)
>RTX 2080 - standard
>RTX 2080Ti - better fit for 4K but expensive

General:
>PLAN YOUR BUILD AROUND YOUR MONITOR IF GAMING
>A 256GB or larger SSD is almost mandatory; consider m.2 form factor
>Bottleneck checkers are worthless

Attached: 1550146796076.png (1522x869, 813K)

Previous:

R8 and h8.

pcpartpicker.com/list/Mbf8V6
(all prices are in € for a local shop)

>RTX 2060 for 1440p
>Not RTX 2070
When will the 6GB meme end

>no 9400f
yikers

>tfw u wanna upgrade from haswell
>Tfw that means dropping about 500-600
>Tfw ddr5 soon(tm)
>Tfw am4 is nearing its end

Swap 9400F for Ryzen 2600 or 2600X. Try to save up a tad more and go for the 1660Ti.

I'm in the same boat, coming from a 4690k

I'm waiting for Zen 2 6c/12t and 8c/16t benchmarks. If they still can't make up for their lack of IPC, the plan is to grab a used 8700K, call it a day, and wait another 5 years before considering another upgrade.

Could easily get $250 for my current board, cpu, and ram.

It's fine, although I would pair an R5 2600X with a 1660 or go i7 8700/K if I were making an Intel build

There's nothing wrong with 6GB of VRAM for 1440p, but the 6GB meme will end when AMD releases Navi. Pretty much undoubtedly Navi will be the baseline recommendation for 1440p.

Nothing really wrong with the 9400F, it's just for Intel fanboys

Just upgrade to a Haswell i7 if you don't already have one, and maybe drop in some more RAM. The i7 4790K is as good as any recent Ryzen in a lot of titles

Same cpu as me lol.
It's starting to show its age in games, especially since I swapped to a 1070.

>tfw 4570
I want moar coars
Or another PC entirely

It's good, but I'd get a budget Z370/Z390 motherboard for 10 bucks more, so you have an upgrade path

what is YOUR excuse to not have RGB in your PC?

>more airflow on silent fans because they have shit CFM
>see in the dark properties to show you can't do cable management to save your life
>some come with a remote controller for maximum spaghetti spillage when someone visits your man cave

>poozen
LMAO

I switched the 100+ FPS builds in the OP pcpartpicker back to i7 8700s. In most games, an R5 2600X can max out a 2060 at 1080p or a 2080 at 1440p, but there are a very few instances where the R5 would struggle: for example, Hitman 2.

I'm not really sure that I should recommend an 'insurance' CPU (the i7) that's only necessary in a few situations. Please comment on this change and also critique the builds in general

Attached: hitman 2 cpu vs gpu comparison.png (1611x927, 152K)

FYI the charts on that website are fake.
They test 2-3 setups then "guess" the rest based on those results.

I have been wondering about that, but I think they test all the graphics cards with an i9 9900K and all the CPUs with a 2080Ti, and only guess when they provide the option to select another CPU. They most definitely test more than 2-3 configurations
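If that's what they do, the "guess" part is probably just taking the lower of the two ceilings for any untested combo; something like this sketch (the numbers and the method are purely my guess, nothing the site documents):

```python
# Hypothetical model of how a benchmark site could estimate FPS for untested
# CPU+GPU combos from two measured sets: every GPU tested on the fastest CPU,
# and every CPU tested on the fastest GPU. None of these numbers are real.
gpu_ceiling = {"RTX 2060": 110.0, "RTX 2080": 160.0}   # FPS measured with a 9900K
cpu_ceiling = {"R5 2600X": 125.0, "i7 8700K": 150.0}   # FPS measured with a 2080 Ti

def estimate_fps(cpu: str, gpu: str) -> float:
    # Whichever component runs out of headroom first sets the frame rate.
    return min(cpu_ceiling[cpu], gpu_ceiling[gpu])

print(estimate_fps("R5 2600X", "RTX 2060"))   # 110.0 -> GPU limited
print(estimate_fps("R5 2600X", "RTX 2080"))   # 125.0 -> CPU limited
```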

no need to upgrade from a haswell i7 assuming that it's overclocked.

Worth upgrading an i5 2500k to an i7 2600k in an older build for $70?

Yeah

Attached: average 2600K.png (1920x1080, 1.05M)

yes, more cores is better, you will feel it in most games

You expect the guy who makes the shit "pcbg" pcpartpicker builds to know that lol
Probably the same retard who posts these unpatched benchmarks which don't show just how trash pre-Skylake is with the spectre and meltdown updates despite being called out on it repeatedly.

No. Do a real upgrade.

Rate my build

Attached: 1st build.png (1224x1280, 2.85M)

yikes/10

Oh look, the resident AMDummy is back.

Nice case mod
Get an SSD when you can and you're good to go

>unpatched benchmarks
Absolute fucking meme, shithead. The patches have no effect on gaming

>Wasn't satisfied with three different RTX 2060 card models
>coil whine, high temps, retarded fan curves
>Fuck it
>Go to get a 2080
>Deciding between the EVGA XC Ultra and MSI Trio
>Still sour about that whole EVGA VRM debacle, so go with MSI
>Thing is fucking massive
>Not only is it 3 fans long, but it's also three slots thick (I know you guys warned me)
>Will probably never fit into anything but a full-size ATX case
>The rest of the system now feels inadequate (2600)

Mistakes were made.
At least it makes me look forward to my next upgrade.

Attached: 1276715378464.jpg (312x266, 7K)

Are the Costa Rica-made i7s better quality, or the Malaysia-made ones?

I have those patches disabled on my 2500k because the system is old and I don't use the web browser on it much. It's mostly a secondary gaymin and htpc build.

You overpaid by like 100 euro, or 80 euro if it also came with a monitor
>no ssd
>10 year old parts except for the 3 year old GPU.
>1333 RAM

I'm amazed by you retards who think you're getting a "deal" by saving just a few dollars on 10 year old ancient parts, slow RAM, and no SSD, when for the same price you could have made a NEW Athlon build with an actual upgrade path, sniped a good GPU and CPU sale in the future, or gotten newer used parts.

Jow Forums, would it be worth getting Corsair Vengeance LPX 32GB (2x16GB) for 150, or should I buy something better?

These ghetto-rigs always get me
Reminds me of my young and dumb(er) days when I would pull the same shit
>Using duct tape to mount hard drives
>Accidentally frying your motherboard trying to fish a stray screw out
>Buying a BFG psu from Circuit City
>Back when Antec was still relevant

Ah, the retarded memories

Attached: 1337681604650.jpg (1024x683, 338K)

>The patches have no effect on gaming
You know you are blatantly lying yet you do it anyway because you get off on tricking people into buying trash.

>Don't use browser much
Unless you literally uninstall browsers and block port 80, you're fucked.
Some game's launcher is going to open an embedded web browser for its patcher or news or some shit that someone has compromised, whether hacked or by a rogue employee, and it's going to own your PC and you'll have no idea, given how impossible it is to detect.
That or you patch it and enjoy the 15%+ performance loss in GAMES, unlike the lying shill who won't acknowledge the huge performance regression in patched pre-Skylake.

>worth it
Yeah dude, you can flip those for $300 to zoomers who are sticking them in their urethras nowadays but who are blocked from buying them. Is that what you meant by "worth it"?

Overclock CPU and get a 1440p monitor.

>You overpayed by like 100 euro
>leaving 160 euro
Not him, but the 1060 is worth 80 euros at the very least. The i7 2600 is WAY, WAY better than an Athlon. The DDR4 alone would be half of the 260 euro budget

You're just a shithead who shills AM4 at all costs

Already using a 1440p monitor, so that's half the reason I got it
Was thinking the same thing about overclocking, though I've never been a fan since I like my shit cool and quiet

The 2600 does not have more cores than the 2500; it just has HT, which is pretty worthless. The ""major"" actual advantage is higher clock speed and more cache.
Use the correct terminology.
I would not spend $70 on that upgrade, it isn't worth it. $40 at most.

good thing I mostly play private servers of old MMOs without patchers, single player offline games, and emulators.

Are old Optiplex cases compatible with standard hardware or did they do some weird pseudo-proprietary bullshit?
I can get the case for free from my job and that would save me like $75 on a case that I could then put into other parts of my build.
I'm looking at mATX and ITX mobos, fwiw

i just noticed that even with such old stuff he should be able to do 1080p75
you need a new CPU fan asap and I'd add an exhaust fan or you will cook your mobo and GPU eventually

is that a sound card or an ethernet card? sure hope you asked for something like 15 off because of the fucking case, rip

>patches affect gaming
They don't, shithead
techspot.com/news/77026-windows-update-reduce-spectre-meltdown-patch-performance-hit.html
>1-2% after Windows update

Your posts are so easily disproved, you must get off on being shit on over, and over, and over again

>Not him, but the 1060 is worth 80 euros at the very least.
It's obviously a used GPU. Those are like 100 euro.
160 euro for the rest is severely overpaying. The rest of those components are like 50 euro max.

Just because people overpay for obsolete 2600ks (and that's a non-k) doesn't make it worth it compared to alternatives you can spend the money on.

If it's not a used GPU, that's even more stupid.

It'll probably still do 1080p@75 in most games, but that doesn't make it a good build for the money.

2600 overclocked with a 2080 at 1440p is going to be fine.
>like it cool and quiet
Well then don't OC. Just limit your FPS to 100 or 120 or whatever you're comfortable with.

>on a pre-Skylake CPU
Oh wait no, it doesn't say that anywhere, lying shill retard.

when will navi drop

The 2500/2600 processors are already feeling their age now
No reason to cling to them beyond sentimental value when Intel didn't go complete jew on their ecosystem.
>Overpriced processors
>Overpriced boards
>Paying more to overclock
>Paying more for hyperthreading
>Doesn't have a plethora of exploits
>heatspreaders were actually soldered
>And didn't run hotter than the sun

Attached: 1274145744292.jpg (481x319, 27K)

>The 2500/2600 processors are already feeling their age now
for above 75hz and workstation uses yes, but for 60hz gaming no.

>it just has HT which is pretty worthless
False, you can see that in a CPU limited scenario like here: HT adds ~30% going from the i5 7600K to the i7 7700K after accounting for the clock speed increase.
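For anyone wondering what "after accounting for the clock speed increase" means, the back-of-the-envelope looks like this (the FPS numbers are placeholders, not the figures from the chart, and the clocks are only approximate):

```python
# Normalize out the clock speed difference to isolate the HT (+cache) gain.
# FPS numbers below are hypothetical; only the method matters.
i5_clock, i7_clock = 4.2, 4.5        # GHz, approximate turbo clocks
i5_fps, i7_fps = 70.0, 98.0          # hypothetical CPU-limited averages

raw_gain = i7_fps / i5_fps           # total i7 advantage (1.40x here)
clock_gain = i7_clock / i5_clock     # part explained by clocks alone (~1.07x)
ht_gain = raw_gain / clock_gain - 1  # leftover ~= HT + extra cache

print(f"raw +{(raw_gain - 1) * 100:.0f}%, clock +{(clock_gain - 1) * 100:.0f}%, "
      f"HT/cache +{ht_gain * 100:.0f}%")   # roughly +31% with these numbers
```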

I'm not opposed to overclocking, it's just been years since I've done any real research

Mind giving me the rundown?

>The rest of those components are like 50 euro max.
Absolutely false. You have no concept of value

>pre-Skylake CPU
The link says all Intel CPUs, which includes pre-Skylake CPUs, retard. Keep getting shit on

after computex AMD will tell you how many months you will need to wait to pay a premium for their new dual GPUs

can you make an under-300 build for 1080p60?
im almost certain they go for 400 even when used now

Worst poster ITT

is threadripper a meme?

Any recommendations for CPU coolers? My CPU itself is really good but it tends to overheat a lot

NH-D15S

or just the D15

>The 2500/2600 processors are already feeling their age now
Relative, depends on your uses. But yes for "muh gayming 1440p" "144fps" or other CPU intensive tasks, sure. Even for 1080p/60 they handle themselves fine in most scenarios, barely starting to show their age.
They are generally overpriced. I got my current 3470 for 80 bucks on ebay back in late 2016; I already had the rest of the computer and upgraded from a G550. And even though I considered buying a 2600/3770 for +80 bucks, in the end I realized it wasn't worth it for the marginal difference; I would have for 40 at most, perhaps.
I would never recommend anyone buy used 1155 today, that would be really stupid. I've had an H61 since 2013 and it works fine, so that's different. If I needed HT the cheaper solution would be to get the used i7; it just happens I don't, because for videogames it is pointless and I don't virtualize much outside of work.
>HT adds ~30% going from the i5 7600K to the i7 7700K after accounting for clockspeed increase
There's nothing in the image to draw that conclusion from. The advantage of the i7 is mostly the clock speed and cache, for sure.

i hear the NZXT kraken is good
if you don't mind the RGB shit

Tech press are lying.
They have agreed with Intel not to publish tests of the patches' effects on performance.

reddit.com/r/Vive/comments/9gr3xs/wtf_i_disabled_spectre_meltdown_patch_and_gained/
reddit.com/r/emulation/comments/acgj15/is_there_any_performance_loss_due_to/

People were seeing a 3-10% loss in gaming post-skylake and 12-20% pre-skylake through dozens of independent tests. Even still as of 3 months ago, after the optimization update from Windows.
People were losing 10% in "le best emulator CPUs" in multithreaded emulation like PCSX2.

Some people did tests in early 2018 which they claimed were pre-patch and post-patch and showed no regression in games on older hardware, but they colossally fucked up by not realizing those microcode updates didn't work on pre-Skylake at the time. Tests using the newer fixes from Windows, which actually work and report coverage with the vulnerability checker, show the performance regressions.
Fuck you are so retarded.

A used Haswell i7 for $150-$200 with a used 570/580/1060 6GB GPU will do it, yeah.
Though I'd generally recommend spending $300 on something new, and then saving up and watching sales to upgrade it into something legitimately good over the next year or two, instead of being stuck with something you will have to completely replace rather than upgrade.

These, or just about any other oversized heatsink cooler
Just avoid the closed-loop coolers like the plague
At least with Noctua, they'll send you free adapters for newer mounts

rate my budget windows 7 build.

Attached: Screenshot_20190407-120206.png (720x1280, 124K)

the mental gymnastics done by amd fanboys is astonishing

>intel
>windows 7
>Seagate
>phoneposting

>People were seeing a 3-10% loss in gaming post-skylake and 12-20% pre-skylake through dozens of independent tests. Even still as of 3 months ago, after the optimization update from Windows.
3 months..? What are you talking about? The Retpoline update was only released a month ago. It fixes AMD and pre-Skylake performance (which was affected by the mitigations too, it seems). Skylake's performance hit wasn't that big from the start, if I remember correctly.

the mental gymnastics done by intel fanboys is astonishing

what can i say, i love my windows 7. i wanted to go ryzen but my windows 7 wouldn't work. i love amd, just not as much as my windows 7.

Attached: 1552939036401.jpg (540x632, 47K)

this right here. I'd go with the gigabyte z390 UD

Get a 2600, cheaper board, and an SSD.
And use Windows 10 or Linux. Windows 7 will no longer be supported in another 8 months.

Why aren't you acknowledging the dozens of independent sources and the tests they've done?
Oh right, because you're a manipulative, lying shill.

Run the tests yourself on a live stream with a Sandy, Ivy, or Haswell CPU, with Spectre and Meltdown verified as mitigated.
That's too much effort compared to basing your bullshit off of completely unverified and unsourced "tech" articles, huh?

>The advantage of the i7 is for the clock speed and cache mostly for sure.
Cache has an advantage, but I already accounted for clock speed increase. Don't forget that the i5 2500 to i7 2600 also has an increase in cache size.

>muh f4ke n3ws if it doesn't agree with me
The first reddit post is literally retardation, and the second doesn't have anything but speculation.

Get fucking wrecked. Here's another link that says exactly the same thing

>This first round of Meltdown and Spectre patch testing proved fairly uneventful. There really wasn't much to report across our suite of game benchmarks.
tomshardware.com/reviews/gaming-performance-meltdown-spectre-intel-amd,5457.html

Attached: spectre meltdown gaming performance.png (1119x931, 940K)

That's an interesting graph
Does it mean at some point the price per transistor will cease to decrease?

Keep flailing around, shithead. You're wrong over and over and over and over again

>Run the tests on a live stream with a Sandy, Ivy, or Haswell CPU, verified that spectre and meltdown is mitigated, yourself.
Which benchmark? I'm kind of bored and might do it just for fun. Got only Valley installed. Will run it for now.

Win7 isn't officially supported on 8000 and 9000 series Intel CPUs
You can run it on Ryzen, but you have to manually download and install patches.

>Blanket baseless claim
>no tests
>nothing about pre-Skylake CPUs
You genuinely think what you posted discounts what I posted?
You are RETARDED if that's what you think.

But I think you know you can't discount it and are just attempting to distract others from how unbelievably and irredeemably wrong you are now.

At some point. It has roughly stagnated.
What you have to take into account, which that graph doesn't represent, is that die sizes are smaller, so the cost PER CHIP still hasn't increased yet.
And even if it does, the actual wafer cost per chip is something like a few dollars.
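Back-of-the-envelope version of that, with completely made-up numbers, just to show why a shrink can keep cost per chip flat even when the wafer gets pricier:

```python
import math

# All figures hypothetical; only the shape of the argument matters.
wafer_cost = {"old node": 3000.0, "new node": 5000.0}  # $ per 300 mm wafer (made up)
die_area   = {"old node": 60.0,   "new node": 35.0}    # mm^2 for the same chip (made up)

wafer_area = math.pi * (300 / 2) ** 2  # mm^2, ignoring edge loss and yield

for node in ("old node", "new node"):
    dies = wafer_area / die_area[node]
    print(f"{node}: ~{dies:.0f} dies/wafer, ~${wafer_cost[node] / dies:.2f} per die")
# The wafer got ~65% pricier, but the die shrink keeps cost per chip flat,
# and in both cases it's only a few dollars of silicon per chip.
```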

I'd spend $35 more on a 212 or be quiet! slim CPU cooler, spend 35 more to get 3000 CL15 RAM, go with the Seagate 2 TB and get an SSD later, buy one of the slightly cheaper Fractal Design Focus cases (black or white should be cheaper if I remember correctly), and check to see if the power supply is at least bronze certified. If it isn't bronze, I'd get a Seasonic 620 bronze

so will my windows 7 work at all? I really don't want 10

Attached: 1508555626477.jpg (550x760, 74K)

Run some DX12 benchmarks and multithreaded emulators. They were generally affected the most.
A newer Total War game, Forza Horizons 7, Ashes of the Singularity...
And Doom but only if you're not over 200fps on the CPU to begin with. You'd probably have to underclock to see the regressions there.
PCSX2. Cemu in multithreaded configuration.

Use github.com/speed47/spectre-meltdown-checker to verify patches are enabled/disabled without cutting the stream.
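If you want to dump the mitigation status between runs without alt-tabbing around, a rough wrapper looks like this (assumes the script keeps the name it has in that repo and that you have root; running it bare with no flags is the safe default):

```python
import subprocess

# Hypothetical wrapper so the mitigation status can be shown on stream between runs.
# The checker itself is the real tool linked above; this wrapper is just a sketch.
subprocess.run(
    ["git", "clone", "https://github.com/speed47/spectre-meltdown-checker"],
    check=False,  # ignore the error if the repo is already cloned
)
result = subprocess.run(
    ["sudo", "sh", "spectre-meltdown-checker/spectre-meltdown-checker.sh"],
    capture_output=True, text=True,
)
print(result.stdout)  # per-CVE verdicts: VULNERABLE vs NOT VULNERABLE
```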

$15-25 cooler is fine for a

It's fun to watch you suffocate and double down on stupid

>tfw jumped from windows xp to windows 10 in 2017
Feels good man.

fair enough. Also I noticed that he states that he can buy a 9400f for cheaper than a 2600. the asrock phantom is a pretty good board based on that overclock infographic that is usually posted. So for 130 I think that his mobo and cpu combo is solid, although he won't get much value from the mobo until he upgrades.

Anyone else having a super hard time deciding on a budget?

Attached: 1486594542834.jpg (261x210, 12K)

>only "citation" is a baseless unverified and uncited claims in an article, which is better than dozens to hundreds of verified and independent claims

9400f is like $5 cheaper but a good board costs more, so really it's more. Intel charges more for their chipsets.
$5 isn't worth sacrificing SMT, either. The multithreaded performance of the 2600 is more than 30% better. Even for "single threaded games", you run more than one application at a time so anyone who buys that and gets tricked into buying a 6c/6t for "$5 cheaper" is extremely gullible.

I put off building a new PC too long and now there are no more solid side panel Meshify Cs in Australia
Fuck the tempered glass jew and fuck me for being complacent

good looking out on the psu. is there something wrong with 3 TB HDDs? I'll only save 10 shekels by going 2.

deciding parts on a budget, or deciding on a budget as in total cost?
it's never wise to pay for diminishing returns, and nvidia especially have gone pretty insane, intel a bit too

>The i7 4790K is as good as any recent Ryzen in a lot of titles
no it is not, it suffers in the 0.1% and 1% low FPS.

4670k with 1080Ti here. Hold me bros.

Attached: 1548359237298.jpg (424x394, 12K)

Got no idea about DX12, but DX11 and anything GPU bound is safe (at least with latest win10).
Those VRMark tests in your link were GPU bound and probably DX11 too.

Any DX12 bench recommendations that aren't a few GBs in size?

Attached: spectre_valley.png (1600x1800, 1.53M)

for a 9400F and a 2600 with their stock coolers, the 2600 overclocked to 3.9 GHz, intel wins on performance
m.youtube.com/watch?v=ng8Wa_jwwx8
calling performance 30% better is a huge lie, especially with the type of ram and cooling he chose

Also a tomahawk, which is probably the only board on par with the phantom below 130, is $15 less. So a $10 difference doesn't make it far more valuable, especially when you consider the 9400F beats the 2600 at 3.9 GHz

ignore the 9400f shill. does anyone know how to filter keywords in 4chanX? it is the same shill selling that shitty CPU over and over in every single /pcbg/.

Attached: 771.jpg (512x378, 20K)

>that min FPS cut almost in half
>900p
But yes, as I said, multithreaded emulators or DX12. You aren't going to see much on an old single threaded GPU benchmark.

I'm curious if the 5775C's EDRAM would help or hurt it or if it has its own special vulnerabilities, too.

>calling performance 30% better is a huge lie
Um what? 972 cb on the 9400F, 1274 on the 2600 STOCK and over 1400 overclocked. That's MORE THAN 30% even stock. Holy fuck you are THIS retarded. You could have avoided posting something so stupid in less than 2 minutes of research.
And MCE enabled on the 9400F vs only 3.9 on the 2600 lol.
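The arithmetic, for anyone who can't be bothered (assuming those are the usual multithreaded cb scores quoted above):

```python
# Cinebench multithreaded scores quoted above (cb points).
i5_9400f      = 972
r5_2600_stock = 1274
r5_2600_oc    = 1400   # "over 1400" overclocked

print(f"stock: +{(r5_2600_stock / i5_9400f - 1) * 100:.0f}%")  # ~31%
print(f"OC:    +{(r5_2600_oc / i5_9400f - 1) * 100:.0f}%")     # ~44%
```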

Sirs, from when the circumstances permitting does the user elect to purchase a Vega 64 or Radeon VII? I am experiencing the struggles of justifications, and I am unable to fulfill the needful to recommend product to the client.

Should I get a 2070 or 2080? I can afford the 2080 but I feel like it's a waste of money since there are no games that need it, and by the time those next gen games release there will be newer lines of GPUs.

>that min FPS cut almost in half
The halved min FPS is with Spectre disabled, but I'm pretty sure the 0.1% lows are the same.
Downloading the 3DMark demo now. But it's kind of huge. Will probably test DX12 some other day.

Cinebench scores are meaningless when they're not actually representative of game performance. Like I said before, the 9400F beats the 2600 in games when they're both at 3.9 GHz. You can't deny that. Stop being so misleading, especially when a user has a uniquely good deal on a 9400F

:(

Attached: 1554666988999.png (1800x850, 216K)

hello Sir, thanks you for contacting Microsoft today. If you have FreeSync (Trademark) monitor that is 1440p then just buy the cheaper of both cards. These were designed for 4k gaming so the lower resolutions they will be the better choice if you factor in the insane prices of Gay-Sync (Trademark) monitors.
Please give one minute of your time after the call ends so that you can give me feedback Sir.

Radeon VII is really good for some productivity work. DaVinci Resolve, Octane, Blender, TensorFlow, etc.
For gaming, it can be hard to justify spending over $300 or $400 on a GPU in general because the performance/$ drops dramatically.
It's like 25-30% better than Vega 64, but costs almost twice as much.
And because it's so loud, really you'd want to replace the cooler which isn't free to do.

Get Vega 56 or a used 1070Ti instead of buying into those snakeoil lies.

>Games represent multithreaded performance
You do this all the time and I can't tell if you are THAT bad at reading things which are clearly written, or if you are intentionally ignoring the point made and facts presented to deflect.

if you're on 1440p 144Hz then get the 2080 otherwise you're burning your money away.

>look at the R5 sales
This proves that the "muh 9900kek is teh fastest in the market, AMD is teh losers" is a fucking meme. The high end market is nothing compared to the mid-range consumer base.

What do you think matters to the user who posted their question? Multithreaded performance or game performance?

MODS do your job this is CLEARLY anti-semitism

Now imagine spending $1500 on a 2080 ti to play children's video games

Attached: 415561.jpg (259x194, 5K)

see >you run more than one application at a time so anyone who buys that and gets tricked into buying a 6c/6t for "$5 cheaper" is extremely gullible.
gullible boi