Is there any point to buying an i3 in 2019? What is the best i3?

Attached: i3 4130.png (1280x720, 979K)

they have 4 coarz nao

Why pay for i3 when it's free? :)
i3wm.org/

2200G

Attached: 1555429440732.jpg (752x548, 282K)

Does Intel have any autistic AMD memes?

If you can find one for under $50 it might be worth it for budget builds

All they had was the moar cores AMD FX meme, but even now moar cores is the future for both AMD and Intel.

No, I don't mean criticism. I mean pathetic displays of insecurity like that picture, posted instead of an actual point by people who like to feel clever but don't want to put in the effort to actually be clever.
Like this post, except from someone who wants to defend Intel like this is /v/ and it's the console wars.

What kinds of applications justify upgrading from say, my i3 6100? I haven't had any problems with internet browsing, running virtual machines, playing modern games at medium settings, image editing, etc.

There's not much to be insecure about; AMD hardware is cheaper. If you're spending $500 on a 9900K and all the other stuff to get the most out of it, you're gonna defend that purchase. But you can't go wrong with a 2700X; yeah, it may be 30% slower in gaymen, but it's half the money.

No, a Pentium Silver is enough, just make sure to pair it with Intel HD graphics if you want to play in HD. Or if you want to be a real pro gamer, then get the Nvidia GTX 1010 Ti OC.

This isn't about AMD or Intel or any particular product.
I'm just autistically angry at the level of debate here.
Y'all motherfuckers are just self-satisfiedly sharing image-based ad hominems with no actual opposition and patting each other on the back for it like this is a Facebook group or something.
And I'm doing it too. Maybe I just hate myself.

I agree, it has gotten pretty bad recently with the Ryzen 3000 hype and Intel's lack of 10nm.

i3 8100 is good if you're a poor-fag making an unambitious build

Can it emulate PS2, Wii, Switch though? I'm mostly hyped for that, and it seems like emulators vastly prefer Intel and Nvidia.

>8100
Why would anyone buy a 65W quad core when AMD has a 65W 6-core with 12 threads?

I guess, but Dolphin ran great for me on the FX-8350 and HD7970.

The best is AMD so you don't get ass raped by IME like a good little bootlicker.

does it tho

Yeah, it did, and the 8350 is what we consider a piece of shit. Even clocked at 4.5GHz you're competing with a 2.66GHz i7 920 (with more coars), and it did alright at emulating GameCube games; I could get 60 fps.

lol

the one in the trash.

i3 is good if you're on a budget and just wanna play some esports gaymes, i5 is for playing AAA gaymes, i7 and i9 are for productivity.

This, honestly. Pretty solid GPU and your MOBO will be ready to accept a powerful 12-16 core chip come July, should you ever want it.

I don't understand why you don't think it would be able to

>i5 is for playing AAA gaymes
My MacBook Air gets hella hot

If your image editing/vm stuff is purely casual, then you're fine. If not, you might get value out of an upgrade.

>Is there any point to buying an i3 in 2019?
no
>What is the best i3?
no

ps2, wii, yes.
switch, dunno.

>ps2
Even a Xeon X5450 will emulate that, given you have a decent GPU (at least HD5750 or newer).

I'm of the opinion that no, there is no point.
Not because of some AMD loyalist stance, but because the price difference of going up a tier in either camp is usually small enough that adjusting your build and perhaps *slightly* stretching your budget allows it.

If you're that tightly bound to the budget, then I haven't got a clue why you'd even consider buying new hardware. And le upgrade path isn't even an argument given how often sockets change.

You get buttfucked by AMD's PSP instead.

No. AMD has completely taken over the i3 segment (and most of the i5 segment too). The only exception is gaymers who want 150fps instead of 120fps and buy i7/i9 for a $200 premium over an R7.

ark.intel.com/content/www/us/en/ark/products/134870/intel-core-i3-9100-processor-6m-cache-up-to-4-20-ghz.html

This is the best i3 atm

If for some weird reason you're still mining cryptocurrency, and you have a rig with 4 GPUs and an Intel board, and somehow mine some obscure coin with a bugged miner that needs one CPU core per GPU in your rig, then a second-hand i3 can be a valid option to get 4 virtual cores for cheap.

that's the only use case I can find for this shit.

The R5 1600 can be had for $80, why even bother considering a quad-core in 2019?

microcenter.com/product/478826/amd-ryzen-5-1600-32ghz-6-core-am4-boxed-processor-with-wraith-spire-cooler

The i3-8350K is a fucking beast in terms of MUH GAYUMZ performance. If only it weren't for this fucking 1151v2...

>$170
Did intel release this as some kind of sick joke?

Attached: Screenshot_20190429-135843(1).jpg (720x1067, 193K)

are you retarded?

The 8350K is a piece of shit, why the FUCK would you pay $5 less and lose 2 entire cores?

The i3 8100 is decent, I've been using it to emulate games on Cemu.

I'd get an i5 8400 or higher though, some stuff is starting to use more than 4 cores.

The i5-8400 is a joke as well; Intel really felt the fire under their asses with Zen+. They should have released the i5-9400Fahrenheit DAY 1.

Attached: Average.png (1302x1183, 79K)

>The i3 8100 is decent
>literally costs more than a 1600
lmao, wtf inhell

Attached: Screenshot_20190429-144151(1).jpg (720x994, 212K)

I'm contemplating the 9400F.

Does RAM MHz really matter that much?!

Not really, true latency is more important. 2666MHz CL14 has about the same performance as 3200MHz CL16. Though in general this has a higher impact on AMD processors than Intel ones.
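Rough math if you want to check it yourself (just a quick sketch, nothing assumed beyond the two kits mentioned above):

# First-word latency in ns: CL cycles at the real clock, which is half the DDR data rate.
def true_latency_ns(cl, data_rate_mts):
    clock_mhz = data_rate_mts / 2      # DDR transfers twice per clock
    return cl / clock_mhz * 1000       # cycles / MHz -> nanoseconds

for cl, rate in [(14, 2666), (16, 3200)]:
    print(f"DDR4-{rate} CL{cl}: {true_latency_ns(cl, rate):.1f} ns")
# DDR4-2666 CL14: 10.5 ns, DDR4-3200 CL16: 10.0 ns -- basically the same, like I said.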

I'd probably advise against Intel right now desu; even B450 motherboards will be able to accept Zen 2 processors when they come out in a few months. You can get a 200GE just to get you by and then make a bigger jump to a $200 8-core with 95% of the performance of a $500 i9-9900K instead.

Attached: 6ZSDax7.png (767x441, 45K)

True latency being equal, it's always preferable to go for the higher clock though, because data throughput will be greater. 1600 CL8 will NOT perform the same as 3200 CL16.
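To put rough numbers on that (a sketch of peak figures only, ignoring channel count and real-world efficiency):

# Peak per-channel bandwidth: data rate (MT/s) x 8 bytes per transfer on a 64-bit bus
def peak_bw_gbs(data_rate_mts):
    return data_rate_mts * 8 / 1000    # MB/s -> GB/s

print(peak_bw_gbs(1600))   # 12.8 GB/s per channel
print(peak_bw_gbs(3200))   # 25.6 GB/s per channel -- double the throughput at the same ~10 ns latency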

Depends how badly optimized the title is; RAM throughput should never be a factor in FPS when VRAM can do 100-200GB/s or more.

Also, just for reference, the i9-9900K at 5GHz is ~15-20% faster than a 2700X at 1080p. Not sure why this isn't true at 1440p though. Pic related was done with a 2080 Ti meant for 4K gaming.

Attached: ACO.png (1327x1446, 78K)

i3-8100 is actually an okay processor, unlike its two-core predecessors.

Too little, too late. The damage has been done, see

>Not sure why this isn't true at 1440p though
Because you run into GPU bottleneck situations much more often at higher resolution.

Depends on the use case. If you mostly care about gaymes, i3 is probably still better, and if you need an office machine, it includes a GPU (although in this case 2400G is better than both)

>Uses Assassins Creed Odyssey as benchmark
Are you retarded?

>cemu

Bullfucking shit, how would an i3 run that?

i3 8100 is pretty much the same as i5 7500, why wouldn't it?

Modern i3 processors are just rebranded i5s, it's not a surprise.

All Intel did was up the core count for each lineup.

i3: 4 cores
i5: 6 cores
i7: 8 cores

It's ironic how the MOAR CORES actually beat Sandy Bridge half a decade later due to vulnerability mitigations.

Not really, both have the same clocks. Zen+ matched Cannon Lake IPC, that's why the 2600X is neck and neck with the i5-9400Fahrenheit, see. This is why, at least for now, the i7-8700K is still the cream of the crop, unless you want to throw in an industrial water chiller alongside an i9-9900Kelvin.

It's meant to show how the average modern AAA title will perform. There's no denying everything fucking sucks and devs can barely efficiently utilize more than 2-4 cores in a CPU these days.

So are you saying a 2080 Ti marketed for 4K gaming is choking on cocks at 1440p? Because if so, someone needs to sue Nvidia desu, that's false advertising.

Why not Battlefield 5? Or any optimised title that doesn't favour either company.

>Not really, both have the same clocks. Zen+ matched Cannon Lake IPC
R5-1600 is not Zen+, and its base clock is lower.

Doesn't represent the average well and gives an unfair advantage to AMD like pic related. Cherry picking is bad on both sides imho.

Attached: OC_SWB.png (1327x1446, 78K)

I think you're confused and need to do more research. How exactly does this work in your mind? If a graphics card can render 60FPS in most games at 4K, then in your mind it is completely and absolutely impossible for that card to reach 100% load at lower resolution?

It does give you console tier FPS at 4K, so that's not false advertising.

Zen+ is just higher clocks for the most part; the IPC jump from Zen 1 to Zen+ was 2-3%. That's why a 1700 was able to curb-stomp an i7-7700K.

Attached: Ryzen-R7-1700-vs-i7-7700k-Average-Gaming-FPS-Benchmark-1 (1).jpg (750x410, 36K)

Am I high or did you attach the wrong pic? That's a comparison between the 9700K/9900K, and it's also Star Wars Battlefront 2.

>That's why a 1700 was able to curb stomp an i7-7700K
7700K performs better in 6/10 games in the image you posted. I wish you brain-dead, sensationalist fanboys would fuck off.

That's not true at all. PCSX2 uses two cores (at least it used that many the last time I emulated stuff) and they need to be clocked pretty high, like 3.7-3.8GHz. Depending on the game you might get slowdowns even at 4GHz, and once it dips below 100% game speed it fucks the sound up as well. It just takes you out of the whole experience. For example, Burnout 3 and SotC ran like shit on my i5 2400, which is better than the Xeon you mentioned. I tried emulating on my friend's i5 2500K clocked around 3.9GHz and it still wasn't flawless.
I'm all for AMD, because I love a good price/performance ratio, but ryzenbois need to get their heads out of their asses and admit Intel is still king in clock speed and single-core performance. It's just how it is, no matter how hard Intel is jewing its customers.
That being said, current Ryzen CPUs shouldn't perform much worse at emulating games than Intel.

The i3 8350K is unironically a decent CPU.

Got one for my brother and, paired with a 1660 Ti, it can handle pretty much any game at 1080p, and many games at 1080p 144Hz.

LOOK AT THE CLOCKS

t. i7-7700K owner

see

I got it for like $135 at microcenter.

There's no point in buying Intel in 2019 unless you're a gaymer.

That's because it's OC'd to housefire levels, dummy.

Had an i5-6500 up until a year or so ago, then I bought a used 6700K for cheap. I think it was a good upgrade for the money. Save some bucks for the real deal further down the line.

I love how we are all pretending that the 7700K is anything but a die-shrunk 2600K with a new memory controller and some extra irrelevant instruction set support.

You forgot the glue smear they did.

Attached: 96vIsCsyN2OasUHT.jpg (1500x844, 382K)

What an utter failure that was. Epyc is eating Intel's lunch at an increasing rate in data centers. Out of the 6 or so systems I own, only one has an AMD CPU; that being said, my newest Intel CPU is an Ivy Bridge. I can't justify Intel's prices these days.

Yeah, I'm only sitting on an i7-4790K because FX was shit. I don't have loyalty to AMD, but I do despise Intel. Assuming Zen 2 performs as expected, it's time to sell those old parts. Four cores just aren't enough anymore for what I do.

Good thing there isn't much software aside from games and some hardcore professional shit that needs more than that.

>yeah it may be 30% slower in gaymen
?

The 2700X is like 10-15% slower, depending on the game, and faster in most non-gaming workloads. As an analogy, the 9900K is like a Ferrari 488 (but less reliable) and the 2700X is a Corvette ZR1. The 488 is multiple times more expensive, it has the marketing of Ferrari, and it's only slightly faster than the ZR1 in specific use cases while being more or less equivalent in most others.

At this point if you buy Intel you're either bad with money (paying more than double for an extra 10% performance in *some* cases) or a fanboy of the worst kind and should be gassed.

Attached: Elon never called out Jews, but they're there to defend themselves.jpg (1242x1690, 336K)

The only place a 488 can touch a C7 ZR1 is in corners. The ZR1 has 100 more horsepower and only weighs 200 lbs more.

Seconding what this user said.

Technically this guy is correct; higher-clocked RAM will have higher throughput, but I'm honestly not aware of any application that's bottlenecked or even slightly impacted by memory throughput. Current-gen processors and graphics cards are simply several orders of magnitude too slow to be impacted by the difference between 370 GB/s and 430 GB/s.

Because of AMD's pathetic results in the graphics department for, what, like 4 years now, GPUs have pretty much stagnated. Proper 4K, like consistent 60 FPS at high settings, is still a bit beyond what we have available.

Once AMD lights a fire under Novidia's ass like they did with Intel, we can expect GPUs to catch up within a year or two.

>OC'd to 5 fucking gigahertz
>just barely beats a processor that launched for 10% lower cost, and very quickly dropped below that
Oof.

Yeah, basically what I was saying. There are certain and very specific cases where the 488 is better in terms of absolute performance, but there are precisely zero cases where it's even close to competitive in terms of price to performance.

What kind of dogshit GPU were you using? My i7 870 and GTX 660 could max out every game I ever threw at them in PCSX2 with the widescreen hacks at 1080p. My i5 2400 destroys the i7 870 in games and emulation.

Oh wow a car analogy.
Now I understand

GTX 750 Ti. I know very well emulation is not GPU-bound, that's my point. It's extremely CPU-limited, especially PCSX2.

>imagine being so butthurt about a faster processor than yours that you resort to posting cringe and autistic memes to justify your bad purchase on an anonymous gorilla shagging board

Attached: maxresdefault.jpg (1280x720, 94K)

oen car go vroom vroom

other car do vroom vroo

other car 1/3 price

vroo

>and it's only slightly faster than the ZR1 in specific use cases
So it's nothing like the 9900K then?
The 9900K is faster than the 2700X in damn near anything, sometimes significantly when AVX is involved.

Cringe and bluepilled
Ryzen 5s and 7s are great for productivity

Even in gayming it's better to buy Ryzen because of SMT.

>The 9900k is faster than the 2700X in damn near anything
Except for everything that's not a few specific games, yeah.

Are you confusing it with the 8700K?
Here's your non-gaming performance
tomshardware.com/reviews/intel-core-i9-9900k-9th-gen-cpu,5847-8.html
anandtech.com/show/13400/intel-9th-gen-core-i9-9900k-i7-9700k-i5-9600k-review/6
techspot.com/review/1744-core-i9-9900k-round-two/

The 9900K is only slower in like 3 things, and that's only in the Techspot one where they cut its TDP to 95 watts.