/pcbg/ - PC Building General

>Assemble a part list
pcpartpicker.com/
>Learn how to build a PC
Search youtube for a build guide for your socket
>How to install Win7 on new CPUs
pastebin.com/TUZvnmy1

Want help?
>State the budget & CURRENCY for your build
>List your uses - e.g. Gaming, Video Editing, VM Work
>For monitors, include purpose (e.g. photo editing, gaming) and graphics card pairing (if applicable)
>Don't use Speccy, you retard. Use HWinfo, SIV, etc.

Overclocking
>Use PBO on Ryzen. Legacy overclocking is defunct on Ryzen 2000-series CPUs. youtube.com/watch?v=FC3fsVk9Sss

CPUs
>R3 2200G - Bare minimum gaming (dGPU optional)
>R5 2400G - Consider IF on sale
>R5 2600/X - Good gaming & multithreaded work use CPUs
>i7-8700K - Best for 1080p gaming, but most expensive when factoring in delid, high-end cooler, etc.
>R7 2700/X - Best high-end gaming/mixed usage on a non-HEDT platform
>Threadripper/Used Xeon - HEDT

Motherboards
>Only Z300 series Intel boards can utilize fast memory

RAM
>8GB - Enough for most gaming use
>16GB - Standard for heavy use
>32GB - If you have to ask, you don't need this much
>Current CPUs benefit from fast RAM; 2933MHz+ is ideal

Graphics cards
1080p
>RX 570/580 w/ Freesync or 1060 6GB are standard 1080p 60fps+ options
>1050Ti or RX560 for lower settings 1080p, or older games
>GTX 1070Ti/Vega 56 if seeking higher fps & you have a CPU + monitor to match
1440p
>Vega 56 w/ Freesync, 1070Ti if you already have Gsync
>GTX 1080Ti if seeking higher fps & you have a CPU + monitor to match
2160p(4K)
>Titan V
OpenCL work
>Vega 64

Storage
>Consider StoreMi
>Consider getting a larger SSD (better GB/$) instead of small SSD & large HDD
>2TB HDDs are barely more $ than 1TB
>M.2 is a form factor, NOT a performance standard

Monitors
>Always consider FreeSync with AMD cards
>Lock to 72fps on 144Hz non-Gsync monitors with Nvidia cards to prevent tearing in more demanding games
>PLAN YOUR BUILD AROUND YOUR MONITOR IF GAMING
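The 72fps advice above can be sanity-checked with quick arithmetic: any whole-number divisor of the refresh rate gives even frame pacing without adaptive sync. A minimal sketch (the divisor list is illustrative, not from the OP):

```python
# Capping fps at an integer divisor of the refresh rate means each
# frame is displayed for a whole number of refresh cycles, so frame
# times stay even without G-Sync/FreeSync.
refresh_hz = 144
even_caps = [refresh_hz // n for n in (1, 2, 3, 4)]
print(even_caps)  # [144, 72, 48, 36]
```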

Previous:

Attached: 3e6f02ced79f9308118c5b73be551ab0.jpg (520x399, 27K)

Other urls found in this thread:

pcpartpicker.com/product/shtWGX/corsair-case-cc9011086ww
pcpartpicker.com/product/Yn7CmG/thermaltake-case-ca1d400s1nn00
newegg.com/Product/Product.aspx?Item=N82E16817438094
youtube.com/watch?v=V48KJEP1-sE
twitter.com/SFWRedditVideos

>Lock to 72fps on 144hz non-Gsync monitors with Nvidia cards to prevent tearing on more demanding games
Why is this still on OP

Because usually when this anon makes the OP, he just copies and pastes.

I haven't been paying attention to edit it. You could have done so as long as you hopefully didn't do something really retarded like recommend the 8100 or 8600k.

>in the mean time than it get a 3Gb
Sorry, not sure what you mean? Is it "go with a 1050Ti" or "the listed 1060 3GB cards aren't that bad"?
I was asking because the price difference between a 1050Ti and a cheaper 1060 3GB isn't that big, so I was thinking about the 1060 since I want something that will last me a few years in 1080p games, or possibly even more if I decide to stop gaming or whatever.
But people seem to leave a lot of negative comments about those "entry" level 1060 3GB cards (Asus Dual, MSI Armor, Gigabyte WF, Palit JetStream) and talk about them like they're all total trash and you have to buy a premium 1060 3GB like the Gaming X etc.
So I thought maybe someone here has one of those lower cards and might share their opinion and debunk those negative comments.

first time building a pc. im pretty overwhelmed desu. going to lurk this thread to be less confused

it's pretty easy don't worry

Let us gather in prayer, /g/


My new rig has been shipped today, tracking says delivered on Friday. Let us pray I get a speedy driver. Hoping for a Wednesday or Thursday delivery.

Attached: 1354496053916.jpg (1024x768, 564K)

It's the easiest thing, anon

>pick motherboard first
>motherboard will say "you need a cpu with THIS socket type"
>buy cpu with that socket type
>motherboard will say "you need ram with THIS pin type"
>buy ram with that pin type
>buy gpu
>buy ssd
>buy disk reader if you need it

Done deal

Will a vega 56 play games on high/ultra settings for another 3 years at 1440p...? If that's my goal am I better off just getting a vega 64?

What rig did you get?

quite obviously yeah
vega 64 isn't much better
you get some more memory overclockability, but the additional compute units do nothing as far as gaming is concerned

Traded my i3 PC for parts and assembled this thing today. Fuck Intel Spy Engine

Attached: amd2.png (1714x960, 2.73M)

Tom is my new best friend.

Vega 56 and Vega 64 are clock-for-clock the same performance. The only difference between them is how they're clocked at stock, and that Vega 64s generally have better-binned memory which may clock higher.
Vega 56 already, stock, has the compute performance of a 1080Ti. Games just don't use that much compute even for the most complex screenspace shaders, generally.

You are best off getting a Vega 56, undervolting and overclocking it. If you can get the core to 1570 (which is a guarantee), and memory to 945MHz (which is almost guaranteed), you have Vega 64/GTX 1080 performance for much cheaper.
There is the possibility that a good Vega 64 model and the silicon lottery can get you more like 1080MHz memory, and a 10% increase in memory clock speed is good for about a 6-7% fps increase, but there's also the possibility that your Vega 64 doesn't have great memory either, so I wouldn't waste the money unless it's literally the same price for a good model.
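To put that memory-scaling claim in concrete terms, here's a rough back-of-envelope estimator. The 0.65 scaling factor is my own interpolation of the "roughly 6-7% fps per 10% memory clock" claim above, not a measured figure:

```python
def estimated_fps(base_fps, base_mem_mhz, new_mem_mhz, scaling=0.65):
    """Estimate fps after a memory overclock, assuming fps scales at
    ~0.65x the relative memory clock increase (assumed, not measured)."""
    mem_gain = new_mem_mhz / base_mem_mhz - 1.0
    return base_fps * (1.0 + mem_gain * scaling)

# e.g. going from 945MHz to 1100MHz HBM2 at a 75fps baseline
print(round(estimated_fps(75, 945, 1100), 1))  # -> 83.0
```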

But yes, to answer your question, in the current most demanding games, Vega 56 undervolted and overclocked tends to get at least 75fps at 1440p on ultra, at least with Gameworks disabled.
And if you can drop down to high settings, that generally gives a massive increase in FPS. You generally don't need so much AA at 1440p either, which also ups framerate.
I play at anywhere from a mix of medium and maxed to max settings on my RX 580 at 2560x1600, and a Vega 56 with optimized clocks and voltage is around 60-75% higher performance for gaming.
Console games have really been pushing 4k optimizations, with those optimizations sometimes available on PC, as well.
You'll be fine for a few years.

>going to lurk this thread to be less confused
That's a good idea. Or check the previous one or two.

You can learn a lot from the answers to others' questions and from the criticism of people's builds.

Nicer case pls

Forgot picture about Tom being my new best friend.

Attached: vivaldi_2018-08-06_16-06-00.png (1084x962, 87K)

>78º

Yikes

Any alternatives to the bequiet sp e11 that are silent, besides the dpp11?
I want something that is rated higher than 40°C but still silent

Oh fuck there's more.
I'm pretty certain that GTS 250 is actually less powerful than that APU. Or at best equal to it, and better in some games and worse in others.
If only your memory wasn't so awfully slow. If you had 1866 dual rank I'm pretty sure it'd consistently outperform the crappy dGPU.

And at 59Hz I'd die.

Attached: a10-5700.png (550x470, 44K)

Emulation performance:
[x] Intel
[_] amd

I can see your thinking on Vega 56 vs 64 and it makes sense. However I own an EVGA GTX 980Ti Classified model, i.e. the overkill, over-the-top model. I was eyeing up an Asus Vega 64 Strix model. I like to keep my cards for a while and I want to really cash in on that AMD FineWine technology.

For the potential OC headroom and/or binning, is the Vega 64 good for 1440p? I only play 60FPS and my monitor is neither freesync nor G-sync.

>People chose shitty 1080p TN panels over a gorgeous picture because the merchant promised them some advantage in first person shooters.

I feel bad about them.

Attached: 2018.08.06-22.51_01.jpg (3840x2160, 2.92M)

Which mATX case do I buy, /g/entlemen?
pcpartpicker.com/product/shtWGX/corsair-case-cc9011086ww
pcpartpicker.com/product/Yn7CmG/thermaltake-case-ca1d400s1nn00
If you want to suggest something else, dimensions need to be less than:
>32 cm wide
>46 cm high
>depth: 49.5 cm - 0.25 * height (my table has slanted legs)
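A quick way to screen candidate cases against those constraints. A minimal sketch; the example dimensions below are hypothetical, not real case models:

```python
def fits_under_table(width_cm, height_cm, depth_cm):
    # Depth limit shrinks as the case gets taller because of the
    # slanted table legs: depth <= 49.5 - 0.25 * height.
    max_depth = 49.5 - 0.25 * height_cm
    return width_cm <= 32 and height_cm <= 46 and depth_cm <= max_depth

print(fits_under_table(21, 40, 39))  # True  (depth limit is 39.5 cm)
print(fits_under_table(23, 46, 40))  # False (depth limit is 38.0 cm)
```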

When it comes to getting video/graphics/gpu on linux is my understanding correct that

Intel
> is total shit, only useful for office applications and surfing, nothing demanding

Nvidia
> usually moderately greater in capability, performance than AMD
> more closed, proprietary due to lack of technical information sharing with outside developers
> open source drivers based on reverse-engineering like Nouveau are garbage
> proprietary drivers are locked up tightly, binary blobs everywhere
> G-Sync is better than FreeSync, but compatible G-Sync monitors are much more expensive
> comparable cards of the same "generation" are more expensive than AMD
> overall better for gaming on windows

AMD
> more open, friendly with outside developers
> open-source drivers are actually pretty good, but not as performant as proprietary Nvidia
> generally a little weaker than Nvidia for gaming
> cheaper than Nvidia
> overall better for OpenCL/GPGPU programming on linux (except CUDA)

I kind of wish I had waited to buy a high end monitor but I bought 1440p when it was still the latest and greatest. Bought a Samsung 32" 2560x1440p IPS panel for $850 back when even garbage tier mediocre 4k 27" monitors were $550+

you're really better off keeping your current gpu for another year
the jump in performance wouldn't be anything significant

Is that Grindthunder? People still play that?

Yeah, that's what it feels like. I have my card undervolted and running at 1520MHz. It gets me 50+ FPS in 90% of titles at 1440p. Would it be better to wait for AMD's next card? There are a couple of titles (which, to be fair, are optimized like trash) that bring my 980Ti to its knees.

>i5 6600k
>1080 gtx
>16gb RAM
>1080p/144hz
>can't hold a stable 60fps and get constant stuttering in most games like Battlefield 1, Call of Duty WW2, Squad and most Total War games

Why

Attached: 1531308560581.jpg (567x545, 84K)

we don't know whether or not 7nm vega will come to the consumer market
additionally, we don't know whether or not navi will compete with anything but the midrange
with that in mind, I'd still recommend to wait and see what happens
maybe consider pushing a strong overclock on your current card or even get an aftermarket solution if noise and/or temps become an issue

What's the point in 4k resolutions when the textures aren't 4k

Your processor. Not memes. A quad core isn't enough anymore. All those games you listed have shown they love more cores and threads. My buddy has an R5 1600 and a GTX 1060 6GB and he gets 120+ fps in BF1 and other such games (granted, with shadows turned down) without stutter. CPU usage shows BF1 usually loads around 65% on 8+ threads of his CPU.

why not a Ryzen 2200G tho? it's an awesome CPU+GPU and kinda cheap

Alright, makes sense. 56 it is.

:(. I really hope AMD takes all this new found money and starts shoveling it into their GPU division. I'd love to have an All red machine again. My old Phenom X4 965 and radeon 5870 2GB build was a beast.

And yea I could always overclock I suppose. Core clock is maxed out as is but I can get an additional +950MHz on the memory if I add +50mv.

Yep. That's people.
And it's not like there are any alternative simulator games.

You know what's worse? MWO - the rancid corpse of a dead game that's been decomposing for years and still sucks money out of a handful of the last mech fans.

>back when even garbage tier mediocre 4k 27" monitors were $550+
Two years ago when I got my 32" 4k AMVA panel, the professional 27" 4k IPS panels were $500.
The ones I was getting all had dead pixels on them and I kept returning them until I got an AOC with a 32" AMVA panel for the same money.
It was beyond any expectations; 92% Adobe RGB is still fucking great compared to those 98% IPS ones, and the 3000:1 contrast with incredible backlight uniformity is fucking great for art, movies and gay ming alike.

4K requirements are also greatly exaggerated.
my Vega 64 chokes on Witcher @ 2160p but runs butter smooth above 60FPS @ 3200x1800, and such small upscaling is nearly invisible.
Most other games don't struggle with 4k at all.

Attached: 20180806032917_1.jpg (3200x1800, 1.04M)
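The pixel arithmetic behind that upscaling point: rendering at 3200x1800 shades only about 69% as many pixels as native 2160p, which is why the performance gap is so large. A quick sketch:

```python
native = 3840 * 2160    # 8,294,400 pixels at native 2160p
upscaled = 3200 * 1800  # 5,760,000 pixels before upscaling
ratio = upscaled / native
print(f"{ratio:.1%} of the native pixel load")
```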

AMD CPUs are fine on emulators, though.

It's only on some more demanding emulators which only support OpenGL that their *GPUs* are an objectively worse choice.

The thing is that the 980Ti had more ROPs than the 980. Comparing the 980Ti and 980 to the Vega 56 and Vega 64 isn't the same thing.

Vega 56 and Vega 64 are IDENTICAL except for the 8 CUs that are fused off on the 56 (with 56 CUs already being more than 99.99% of games can utilize as it is), and the binning of the memory.
They have the same ROPs.
Texture fill rate is a bit lower on the 56 too, sure, but it's already higher than anything utilizes.
They are pretty much enterprise chips. Neither really has enough raster performance for games, and both are bottlenecked by that compared to the rest of their capabilities.

Vega 64 needed 86 ROPs or a higher IPC per ROP. They were betting on optimizing the driver for EVERY game, going in and coding a shader to discard polygons that aren't visible before drawing them, but that didn't happen, so the 64 ROPs are just too few.
So I just don't recommend Vega 64 for gaming, period, unless you want it for emotional reasons and the price is the same as the 56. They are good for work, but that's it.

That seems low based on Overclockersclub and Tom's Hardware benchmarks. What games?

Because you bought a fucking 4-thread CPU and probably have too much running in the background. I ALWAYS warn against this, yet some say I'm full of shit.

My wild guess is that we will get the small Navi next year. Which will be the new Polaris except about as powerful as Vega for the "low end" market.

Later they will roll out the MCM Navi kinda like they launched Ryzen and then threadripper.

what's the best 1080p 144hz monitor i can get for a gtx980?

last time we heard about MCM on GPUs was pretty recent, since Wang was already chief architect, and it seemed this wasn't even being worked on at the time, which it would need to be if anything were to come out two years later
pretty unlikely IMO

What's MWO?
How is WT populated these days?

The textures don't occupy the entire screen space anyway. You look at them from a distance.

Also geometry sharpness and general details.
In Automata you can barely see the details on 2B's leotard at 1080P, 4k reveals them. And many other things, picture in general gets fuckloads more detailed when you make it 4 times larger.

Attached: 20180729175016_1.jpg (3072x1728, 549K)

>Nvidia
> usually moderately greater in capability, performance than AMD
This depends what you're measuring.
AMD, for the cost, vastly outperforms with OpenCL, even if the same application has both CUDA and OpenCL options.
And in games, Nvidia takes a massive hit using 10bit HDR which puts AMD way ahead. AMD arch tend to be more future proof and overbuilt.
> G-Sync is better than FreeSync
Not true.
G-sync has a higher minimum standard, but there are Freesync monitors which meet the Gsync standard (like the Nixeus 27edg). Freesync gives you more choice in the end.
> comparable cards of the same "generation" are more expensive than AMD
True if you add in Gsync cost, but not always true otherwise.

>AMD
> generally a little weaker than Nvidia for gaming
Again, this would also depend on which games.
AMD tends to do better in both DX12 and Vulkan, and Vulkan is actually catching on now with UE4 getting support for it which shows 30% FPS increase for AMD cards (but Nvidia heavily invests in Epic Games and they seem to be holding the update back as they're trying to figure out why Nvidia performance in Vulkan is so bad...)
And they also tend to be better in games which are compute heavy, with complex shaders.
Nvidia tends to have better raster performance, and is better in older games that rely on more simple rendering.

Er, you mean like.. twice as powerful as Vega 20? Obviously not Vega 56.

Navi is expected to be around GTX 1080 performance at 1080p and closer to 1080Ti performance at 4k for around $250-$400.
Probably 36-44 CUs, hopefully more ROPs than CUs.

The Wang confirmed that Navi won't be MCM as far as consumer GPUs. But they are working on MCM for enterprise.
He said for them to bring MCM GPU to consumer, it has to be transparent to the software.
This was easy for CPUs, because the software already sees NUMA and multiple cores as separate.
The thing is, a "core" of a GPU really isn't a core. The whole GPU is more like one big core.

>Seems low, what games?
All the big-name graphics titles. Metro 2033/Last Light Redux, GTA5, etc. But I'm basing this all off of running games at full stupid overkill ultra settings. As soon as I knock the AA down to x2, FPS jumps up.

>last time we heard about MCM on GPUs was pretty recent since wang was already chief architect,

We heard about "Navi - SCALABILITY" on AMD's roadmap when they announced Vega.
And very soon after that, Nvidia published some papers about the possibility of MCM GPUs.

I think AMD has been working on Zenifying their GPUs for some time already and Nvidia is trying to catch up.

>What's MWO?
Mechwarrior Online.
A complete shitfuck of a game populated by a few hundred last sad fans of the series.
>How is WT populated these days?
No idea but searching for the game doesn't take long.

they weren't working on any way to make it invisible to games at the time, which is the point I was defending
>scalability
Could've meant anything, really

>Er, you mean like.. twice as powerful as Vega 20? Obviously not Vega 56.

Polaris was way smaller and cheaper while being about as powerful as a 290X.
I imagine one Navi chip will be about as powerful as Vega 10 (64) and later they will release a 2-chip version that works as one over Infinity Fabric.

Vega 20 looks like a FUCKHUGE 7nm chip, there's no fucking way it's wallet-friendly. Vega 20 will probably be just like Nvidia Volta - not for consumers.

Well shit. I bought the i5 6600k specifically on /pcbg/s recommendation about 2 years ago though.

I'm running a shitty h110m-a mobo so I'd have to upgrade mobo to accommodate a cpu with more than 4 cores. What do you think about an MSI Z370-A? Seems affordable enough

Mostly bought a discrete GPU to not put graphics load on the APU
Ryzen is more expensive. This was a very cheap build - I only had to buy a new PSU and a used 500 GB HDD
Probably an error - AIDA and BIOS show different temps

can i plug in the front panel i/o connectors individually or do i have to use the consolidator chip that comes with the mobo

Attached: 1.jpg (873x460, 127K)

Yes you can.
But the consolidator looks fucking handy.
I think I spent 40 minutes fiddling with those fucking little cables after I had already installed the other components and couldn't fit my hand inside. Fucking awful stuff.

You can plug individually. And if I'm not mistaken, the consolidator thingie is an ASUS meme.

yeah first time i was building those front io pins gave me an anxiety attack, nothing the mobo manual didn't cover

thanks for reaffirming, i'm not sure what the consolidator is for, the individual pins won't be exactly perpendicular either inside the consolidator or directly on the mobo so i don't see a benefit

Bump

Those are 5 year old games.
As much as they're Nvidia favored, and not optimized for resolutions above 1080p, I'd still think you'd get 70+ fps on them at 1440 maxed.

I was backing you up and not contradicting you.
I didn't feel like going into way more detail there.
But basically, with CPU cores, each core has everything it needs to be a fully functional and separate core (with some exceptions with Bulldozer).
A graphics core does not, so graphics programming has never had a way to target multiple proper separate cores the way that CPU software does.

For MCM GPU, they'd have to use an interposer which separates the main parts of the core (scheduler, cache, etc) from the cores. Expensive.

Sorry, when I said Vega 20 I meant the 20CU Vega mobile.

Yeah I have only been around for like a year and a half... I remember the recommendations were pretty shit then.
I think people were just mega cucked into some sort of Stockholm syndrome by Intel and recommending garbage because it was hard to justify $100 more just to have HT enabled even though it really does make things a lot smoother.

In other places I was telling people since 2016 that they shouldn't be buying 4-thread CPUs, that those were really showing how hard they're falling off. I was feeling my 2500K start to suffer and wishing I hadn't listened to the people telling me the 2600K wasn't worth the $100 more. But not here.
>I'm running a shitty h110m-a mobo so I'd have to upgrade mobo to accommodate a cpu with more than 4 cores. What do you think about an MSI Z370-A? Seems affordable enough
I'd recommend you get a 2600X and something like the B450M Mortar instead. A lot cheaper for not much less performance. Though you will spend a bit more on RAM for ideal performance.
A 2600X with fast RAM won't bottleneck a 1080 even at 1080p. Not even close. It'll only sometimes slightly bottleneck a 1080Ti, but that's a card you should be using for 1440p instead anyway.

Yes you can.

But why the fuck do you need it?

Should, yes. Those are just dumb circuits that can be open or closed.
Same way that you can trip the power switch or clear CMOS with a screwdriver or paper clip, you can use any basic switch that simply closes a circuit for either.

I guess why not? I don't use my reset button much.

You can plug them in normally. Again, these are very simple passive circuits.

I have effectively unlimited money for my build (up to $10k is fine, even if I have to replace in 5 years), what cpu will get me the best performance for any amount of money? Most of my applications are multi core compatible, so probably amd, but do I want to wait for threadripper 2 or zen 2? I don't know anything about this shit but how to physically put it together.

Zen 2 if you are mostly gaming and shit. Tripper has fuckloads of cores at rather low frequency and its single-core performance is rather low.

Tripper 2 if you are actually doing some heavy computation like rendering shit, editing video, some massive CAD shit, simulating fluid dynamics etc.

Attached: Lux_Sphere1.png (1920x1080, 1.68M)

What exactly am I supposed to be looking for in a monitor? I have an i5 8400 and a GTX 1060 6gb. All I'm going to be doing is gaming on it.

What thermal paste and aftermarket cooler is the best for Ryzen 2600x?

Ryzen 2400g or Ryzen 1600?

That depends on who you are. I look for color space above 90% Adobe RGB, static contrast, and backlight uniformity in a monitor.

If you play CSGO and other online shit you might be interested in 144FPS, but that only comes on abhorrently bad panels or for more money than your entire build costs.

If I were you I would buy a 27" 1440p monitor at 60Hz with some good colors and contrast.
But if you like playing competitive games on Adderall you could be interested in a shitty TN panel with 144Hz. Be advised they are all FUCKING TERRIBLE, like FUUUUUUCKING terrible.

This is a little fucking autistic, but I mostly do very heavily modded (300+ mods) Minecraft. It maxes out all 8 of my current cores, and I've come into enough money to just throw it at the problem until it's gone.

Thanks. I actually had to use a paper clip after a bad overclock, which my mobo doesn't seem to recover from as well as a bad memory configuration. Currently min maxing everything so not having to open up the case is gonna be useful.

I believe something similar to Dark Rock 4 will suffice, some beefy single tower.

I have a Dark Rock Pro 4 on a 2700X and it keeps it under 70 at stock with quiet fans, and under 80 when overclocked.

But on the other hand, the bigger the cooler, the less fan RPM you will need and the quieter it's gonna work.

I'm going to be playing stuff like Fortnite and Overwatch. So less about color quality and more about performance.

Is the downside to those just bad build quality and colors?

Wow dude, go try Space Engineers.
It's like Minecraft but with functioning vehicles.

I don't actually know about MC performance, except that it's broken shit. One of those games that is just made poorly and runs like shit due to bad design.

>B450M Mortar
It's the same price as the Z370-A though

>power button on case has been broken for over a year now
>just using a steak knife to turn it on since then

its better than spending $50 on a new case

tfw no 1950s toaster computer

That's not a toaster, that's a sewing machine

holy shit its HARDAC!!

Attached: Hardac.jpg (796x597, 28K)

>Not slightly stripping one wire then touching the two together
Not all of us have access to fancy steak knifes.

Are you shilling AMD products so people buy them and increase your share price?

EVGA makes good power supplies, right?
newegg.com/Product/Product.aspx?Item=N82E16817438094

>Is the downside to those just bad build quality and colors?

Also viewing angles.
You will need to keep your head firmly in one place. If you lean to the side or recline, the picture will invert the colors.

ALSO your eyes will view it from slightly different angles, and therefore the picture will be different for each eye, and that induces a horrible headache.

And yes, if we're talking monitors limited to sRGB - imagine that the most saturated shade of red you are going to see is orange. Yeah, the colors are that bad.
By comparison, on 98% Adobe RGB the red looks like pointing a laser pointer at your eye.

Attached: TFT-vs-TFT-IPS.jpg (1877x804, 485K)

I'll check it out. I'm using the stock cooler and have a rear and a top 140mm fan by the CPU, but at idle it stays at 95 degrees, so I'm pretty sure I need a new cooler for the 2600X.

They don't make them themselves, but yes, they are some of the best around (Super Flower).

Yeah, it's broken, but raw power can overcome that with enough raw power

anything 1080p @ 60Hz is the 1060's sweet spot. Maybe 75Hz if you drop one or two game settings from ultra to high for a smoother frame rate.

this makes me more angry than it should

Attached: gtx 1060 amazon page.png (1106x534, 282K)

>Z370-A
The Z370-A has a piss-poor VRM, however. Like the kind of VRM adequate for overclocking a fucking 8350K

Where? The B450M Bazooka is the same board with a different heatsink

>but raw power can overcome that with enough raw power

I remember I once got a 2500K to raw power it, then apparently it bloated out of proportion and now needs a fucking Tripper to run.

If you're so crazy about MC go google the thread usage and how many cores it could possibly use. I just can't tell you if that game can even utilize 16 or 32 cores.

I've already got a generic dell monitor that's over 5 years old that's 1080p and 60hz. So it's probably not even worth it to upgrade it? I'd really only be playing esports titles so all I'd really want is a higher refresh rate

youtube.com/watch?v=V48KJEP1-sE

>even if I have to replace in 5 years
I guess you could get TR2, and then possibly upgrade to 7nm TR. Intel may have a bit better performance, but also more heat and no upgrade path. Also something about PCIe lanes, but I don't think that's all that relevant in this case.

Any headset pro's here?

Currently using 5 y/o turtle beach x12 headset


Would there be much difference upgrading to a top headset?

>Not posting in the /headphone general/ or /audio general/ or whatever the fuck they call themselves now
poor show

/hpg/ is going to tell you to upgrade to proper headphones and strap a mic on it
they will be right

get an actual pair of cans
get an actual microphone
throw your gaming headset in the trash and feel bad for those 5 years

Question lads

Why are Ryzens recommended in the OP? And on Logical Increments as well?

From what I can tell, Intel's processors are better on single-core performance, and thus are better for gaming. The Ryzens are better at multi-core performance, but who the fuck cares about that? That only matters if you're doing several things at once, which is rare.

Look at pic related. These two chips cost basically the same amount. I'm leaning towards the Intel because I think the single-core performance will be better for gaming, and in emulating scenarios (and I plan on running Cemu, the Wii U emulator).

Attached: benchmarks.png (2560x3200, 980K)

Modern games run better on more cores/threads

Because the OP is made by a shill.
The 2600 is better than an 8350K which only has 4 cores, though. Get an 8400 instead.

Should be able to just slap them direct to the mobo, but it looks like those pins are weird heights to force you to use the consolidator. Cables might sit on the pins funny at the different heights but it should work all the same.

Is the 2600 worth it

is it ok to use an R3 2200G or R5 2400G without a dedicated card if i'm not into gaming? i just want a solid PC for programming and sometimes for old games

With B-die, an aftermarket cooler, and an expensive motherboard, yes

Would a scythe katana 3 be good enough for the 2600?

Really? I wasn't aware of that. I thought single-core performance was more important for games. I also read an article on Logical Increments about how single-core performance is more important for emulation, and like I said, I plan to run a Wii U emulator.

The i3 seems to have significantly better single-core performance though, which I thought would help with gaming and particularly emulation.

Attached: benchmarks.png (2560x3200, 426K)

How do I know who to believe here