Mfw it's still not worth upgrading from my 4770K

>mfw it's still not worth upgrading from my 4770K

Attached: 4770k v 8700k.jpg (1280x717, 121K)

same, feels good man. waiting for new 11xx gpus and zen 2 to drop the hammer

I feel you man

>no game list
>1440p ultra high 8x AA bottlenecked by Huang's green log
corelets need not apply

>upgrade from fx 6300 to 8400
Hoping it lasts half as long as the 4770k

Other than power consumption, no, it really isn't.

It's only worth upgrading if you have Bulldozer or a 4c/4t Intel CPU. 4c/8t are still good for everything outside of heavy workstation/prosumer tasks.

I wish I had spent the extra $100 for a 2600k instead of 2500k but everyone talked me out of it. The last 2 years on that CPU before upgrading to Ryzen were fucking rough.

Over 60fps on 1% minimums at any resolution is good. Doesn't matter that 1440p bottlenecks it, it's still good enough. Not a stutterfest like the 4c/4t ones.

Got mine clocked at 4.5GHz. Never could get it stable past that, probably because I still want it cool and quiet with minimal power consumption when it's not under load (not a fixed clock, it just boosts to 4.5GHz).

>betraying Amada

disgust.jpeg

I thought hyperthreading was shit compared to real cores?

>mfw i still dont need to upgrade my [email protected]

Attached: pepe smoking.png (713x611, 26K)

hyperthreading is essentially like having 5 cores
it hasn't been much of a performance improvement outside of new and poorly optimized games, which I can't stand enough to play for more than a few hours anyway.

I felt the same way. I bought a 3570k instead of the 3770k back in 2012 because there wasn't much of a difference in games at the time, but by the time I built a new rig last week the 3570k was showing its age. I went from struggling to maintain 60fps in Battlefield 1 with big dips when big shit blew up, to holding 75fps with absolutely no drops.

Didn't want the same thing to happen by 2022, so I went all in on an 8700k this time.

They betrayed me with the HD 7770, and the FX 6300 betrayed me too. Never again

it's more like 15% of a core when it's not fucking used at all, and up to two cores when it scales perfectly, though the most you really see is around 50% of a core.

as for importance, when it comes to games, anything that can absorb a bit of background cpu needs is a godsend when all 4 cores are used.

you don't see the value much in games that use 1-2 cores, but it's obvious once a game maxes out all 4
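if you want to put rough numbers on that, here's a quick sketch using the percentages above (ballpark assumptions for illustration, not benchmark data):

# Rough sketch of SMT scaling on a 4c/8t part.
# The 15%/50%/100% per-core gains are assumed figures, not measurements.
physical_cores = 4
for label, smt_gain in [("worst case", 0.15), ("typical", 0.50), ("perfect scaling", 1.00)]:
    effective = physical_cores * (1 + smt_gain)
    print(f"{label}: ~{effective:.1f} cores' worth of throughput")

which lines up with the "4c/8t is essentially like having 5-6 cores" claim earlier in the thread.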

G O O D
O
Y

i'd call you the goy for being a brand loyalist if anyone

The problem is more with background bloat in modern games like "anti cheat" applications and other drm malware
If some of them didn't have it the stutter wouldn't even be a problem on 4 cores
Windows is also pretty bad for background processes

>heh you prefer a brand that doesn't use shady competition tactics, cares about its consumers, provides long term support for their products and is linux friendly, instead of another company that sits on a CPU vulnerability for years, releases shitty patches that ruin performance once the news breaks, and right after literally hires another company to find mobo exploits, pass them off as CPU flaws and call them RYZENFALLHOWTHEYCANEVERRECOVER... and you're the goy!!

OKAY GUY.

You're going to great lengths to shill ryzen here

>implying you'll be alive in 2022

nah it just pisses me off to see idiots funding companies that are literally the reason the world is going to shit, thanks to sheep like this the 1% keeps raping us up the ass further and further with each passing day

>[email protected]
I'll be upgrading this around 2030 at best, or when it dies from hardware failure if things keep on progressing like this.

Attached: 1481668607698.jpg (342x298, 40K)

it's less the anti-cheat and more just background processes in general

>be an average person
>Want to play game
>Have a few webpages open
>Chrome wants between 1% and 25% at any given time for these pages due to javascript and other bullshit
>Play game
>It uses 4 cores
>It sees you have 4 cores
>It demands 4 cores
>Chrome dumps a 15% load on one
>You get a stutter

honestly, the best thing you can do is restrict a game to only use 3 cores if you have a 4 core machine, that way background programs should default to the unused core before they pile onto the busy ones. god knows this helped with several games in the past when I wanted to record them while going for high score records, and when a game fully uses 8 cores I will still tell it to fuck off from one of the cores just so I can get a more stable framerate.
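if anyone wants to actually try that, here's a rough sketch using Python and psutil (the process name is just a placeholder, and it assumes a 4-core box; setting affinity from Task Manager on Windows does the same thing):

import psutil  # third-party package: pip install psutil

GAME_EXE = "game.exe"  # placeholder name, swap in your game's actual process

# Pin the game to cores 0-2 so core 3 stays free for Chrome, recording
# software, and other background junk.
for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == GAME_EXE:
        proc.cpu_affinity([0, 1, 2])  # restrict to the first three cores
        print(f"pinned PID {proc.pid} to cores 0-2")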

personally, 7nm would make sense to upgrade to, 5GHz on air as a minimum, that would be an upgrade for almost any system currently out, even if you have an 8400-8700k at 5+, the lower temps and power draw would be worthwhile upgrades.

now, a need to upgrade? probably not.

>get screwed by amd
>buy much better products for my use case from their competitors
>have a much better experience
Call it whatever you want, shill. Fool me once, can't get fooled again

If they actually manage to hit that 5ghz or higher level on air and it's not a nuclear housefire, I'll definitely consider it.
Now of course there's no actual need to upgrade from this, but having a snappier system would make for better workflow in creative programs and that upgrade would probably last for at least a decade.
Next step after that is probably the photonic CPU or something.

all I know is the performance target for their 14nm was 3GHz for optimal clocks given voltage, and that 7nm is targeting 5GHz and is on track/ahead of schedule.

shit is at the very least interesting, and coming from my 1700 it would be a reasonable upgrade, so I can give my current cpu to a family member or keep it as a second box for stuff that takes time rather than cpu horsepower.

>mfw it's still not worth upgrading my 2320

AMD's SMT scales to over +50% in some cases. So having 8 cores can be like having over 12.

>mfw got memed into settling for an i3

Attached: Sigh.png (335x554, 255K)

If I were AMD I would drop a small release of Ryzen+ CPUs. Make them hard to get a hold of. Wait for Intel to drop their next CPU to compete with Ryzen+, then go 'Oh! Sorry, we were only pretending. Here. Have this mass release of Ryzen 2 CPUs instead'.

similar testing shows it's not worth upgrading from a 3570k either
but the fucking thing buckles trying to play a single 4k video (aside from the usual clatter of small-scale background operations)
gaming has barely moved to make use of the plethora of threads modern processors can provide. If that is all you care about, congrats - the devs aren't going to move up in tech for another couple years. That's going to require a total generational shift at AAA companies.

You're a mook. How did going for the lowest of the low budget CPUs ever seem like a good idea or a good investment?

That would be unfair to the first adopters of the ryzen+. AMD ain't about screwing their customers, especially the ones that bought first. But if it were a small enough release they could just mail or give them a "pick up your free ryzen 2 CPU for being such a loyal cuck" coupon kinda thing

I see no point in upgrading my 3570k, managed to OC it to 4.6 but I didn't see much of a diff in performance so I downclocked it back to 4.2

>the devs aren't going to move up in tech for another couple years. That's going to require a total generational shift at AAA companies.
The devs need more powerful GPUs getting into the hands of mainstream consumers. If the majority of the market had something with the power of a 1080ti we'd see a change.
CPUs are still quite a bit ahead of GPUs relatively, but it's GPUs that are holding us back for now. By the time GPUs catch up to what we need, CPUs will be ahead. Also remember the majority of the CPU market hasn't even moved on to the latest gen CPUs, so it'll take a while still. At least 5-10 years before we see big changes in game development

Buy the i7 for your socket then pleb.

why put AMD through the trouble of getting blown the fuck out twice when you can reduce that to 1 time?

OP you are stupid
115 fps @ 1440p is obviously a GPU bottleneck
Those benchmarks are fucking stupid

Attached: 1432890537634.jpg (820x688, 112K)

>implying 4770ks aren't almost 500usd now

>tfw it's still not worth upgrading from my 2600

Attached: 1500437770724.gif (540x311, 1.91M)

>the state of moore's law

Attached: 1499683764186.png (1262x717, 563K)

this

Neither is my i5 2320. I'll probably upgrade my PC when Zen 3 is out

Those were nice mid-tier components (lower mid-tier gpu), i don't know why you say that

Dude, how many days with the same charts has it been?

How much intel does pay you to shill?

More like the state of intel with no real competition from AMD until recently

>4770K
>Not 2700K
pleb

>+1 FPS on average
Overclocking is a meme

Bullshit. ONLY if you go by averages. 4c/4t intel CPUs are all stutterfests on newer games. Your lie has been disproven many times by many reviewers.

>buy the best product for the price always
>bought AMD CPUs when they were good, stopped when they went to shit
>bought AMD video cards when they were better than nvidia while being cheaper, stopped when it went to shit
>don't buy amd right now because it costs about the same while performing worse
>OMG y u supporting DA jews, people like you make the world like it is
No, if everybody was like me, then nobody would have bought nvidia when it was shit, or intel when it was shit. Yet here's the thing, you don't care, you want people to support the underdog that isn't jewing us hard, but what you fail to realize is that companies are not your friends: if they have the power to jew you, they will. The only reason AMD isn't doing it right now is because it's on the back foot, and as an underdog, pissing on their customers is not something they can afford to do, unlike the other two dominant companies.
But keep telling me how Your company is your best friend and they would NEVER screw you over.

On modern hardware that's absolutely true since it's fast enough to not gimp the GPU and comes with high clocks out of the box, so there's not much to gain
2600K is considerably faster after OC.

The amd would run like shit on anything that said nvidia when it booted up. Blame nvidia all you want, all I care about is results. The fx 6300 is well known to be shit, and the a8 apu I had before was complete trash. I learned the hard way not to buy amd and I will not buy their shit again any time soon

>Gets memed into buying a shit CPU
>decides not to buy an AMD processor ever again just when it starts to make sense again, completely disregarding doing any research of his own
people like you just can't win, can they

for games, and amd still sucks at games, which is what it's for

yeah they're old as fuck. I have a 3770k that I could OC to 5.1GHz, and I can't see the point in buying a new mobo for it when they're either used or 500 dollars, especially when I have a 4790k that performs a little better without that insane overclock.

I have an i5 4670k at 4.5GHz. Where does that fall?

stutters too much to test nowadays

>overmeme
Opinion discarded

I don't understand the reference, mine doesn't stutter while playing anything?

in the garbage

>a couple of percent slower while being generally faster in most other tasks
>the difference even smaller in newer games
>lower overall platform cost
>sucks
I swear to the lord, your kind is why there are so many shit builds that are completely useless in a few years' time.

>inb4 amd shill
Last AMD cpu I owned was Athlon 64. Since then I owned a Q6600 and 2500K, both turned out to be legend-tier CPUs after a few years.

Well, have fun with your intel build I guess.

>yeah it's not great now but in years it will be
Only idiots buy something that performs poorly now hoping it will be better years down the road. That's buyer's remorse backward rationalization at its best.

I don't know man, I'm pretty happy how my HD7970 turned out to be in 2018 compared to GTX680.

The hd7970 was top tier when it released though, ryzen isn't.

7970 went further than the 2500k and 2600k did, not to mention the absolute shit stomping of Nvidia GPUs of the era.

My FX6300, which I've been running at 4.5GHz for the 4 years I've had it now, runs like a champ.

I am going to upgrade to Ryzen 2 because it's time, but holy fuck it has been an excellent CPU.

The fx 6300 at 4.5GHz can't hold above 60fps in gtav. I know because I tried

It was not; the GTX680 was released a couple of weeks after and was on average slightly faster and more power efficient.
That's why the 7970GHz was a thing, to take back the crown from nVidia.

Yeah but if you wanted 60FPS in new AAA titles you wouldn't buy a $100 CPU from 2012, that's just fucking retarded

>11 Game Average
You seem to have discarded your intellect as well.

honestly if i were to buy a cpu right now i'd sway towards amd because of intel's jewish practices

Attached: comment_AnJq1Vej314GAM3x8dYSUMSqy91MQNDJ.jpg (680x719, 203K)

If ryzen ages half as well as the 7970 I'd call it good. Either way I didn't buy the i5 to keep for 6 years. I will keep it until we get a big cpu breakthrough, or nothing changes and it stays good and I keep it. I'm not married to it and I wanted high fps now for my 144hz monitor

>4.8 ghz
horseshit thermal paste in the 4770k dries up after 3 years and you'll be lucky to hit your boost clock under water then

Haha if you think the 680 was better than the 7970 even on launch, you're super retarded.
A few gimpworks games propped it up, but it was clear what was going on there to anyone who isn't an idiot.

>the waiting meme is real
A few years down the road I buy a new system that shits completely on your fine wine bullshit.

>AMD supports your products with the same motherboard, so you can buy a cpu yearly
at the same time
>AMD ages like fine whine, so even though it performs worse right now, in the future all those cores will be better, you see it's more future proof
AMDtards don't see a problem with holding both of these shilling views at the same time

I have a [email protected] paired with a GTX 970@1531Mhz.
No reason to upgrade whatsoever since everything but ubishit games runs at 60FPS at 1080p with Ultra settings (Far Cry 5 actually does run at 60FPS though)
I will probably just drop down to High settings since most of the time the image quality difference between Ultra and High is negligible.
I really wanted to fall for the 4K meme but fuck that shit, every fucking monitor is just flawed one way or another.
Maybe one of those Nvidia Gsync HDR monitors, 4K 144Hz HDR 65"

What they don't know is amd is not going to release 7nm and all they did was manage to catch up to intel. At most expect 5% tocks for many years, on a product that just managed to catch up to intel's tock tock tock

why can't amd just make something that works right away?

Why can't my 4790k go higher than 4.2GHz without bursting into flames

There is not a single problem with those arguments you brainless idiot. AMD at least supports their platforms unlike Intel who force you to buy a new mobo every time you want to upgrade.

Ryzen is better than everything Intel is offering right now if you plan to use your computer for more than playing games (and even there, you can play everything at more than 60 fps without problem)

You're probably doing something wrong.
a 4790K can do 4.4Ghz AT WORST.

>tfw benchmarks don't even list 3570k anymore so I can't compare

I'm starting to feel how old my CPU is now, but I don't really want to upgrade because I would have to upgrade basically everything except my gpu and storage, and even my monitors would become a bottleneck

I had this exact experience with bf1 on my 3570k, it's literally unplayable. I thought it was fine during the beta and at launch but it feels like the patches made my PC perform worse. A lot of newer games feel choppy and stuttery now I've noticed, shit sucks.

Nope, the TIM degrades and it can't get above its 4.2 boost, mine used to get to 4.6 before the TIM went bad.

it's because you got the intel stuttering, it's a phenomenon that happens on older cpus that mainly have under 8 threads, like pentiums, i5, and i3 cpus

Mine's still going strong at 4.7GHz since 2014.

Either you delidded or are lying

>0.23 CryptoShekels have been deposited into your account

Attached: 796098790_193432.gif (240x280, 372K)

Most high end games today use >4 cores handily, along with modern OS task dispatching and improved SMT designs (better to call it SMT as HT is an Intel brand term).

If a CPU made in the last four or five years was designed with SMT capabilities and has them enabled, you're much better off using a model that has it than one which does not.
Heck, even the 8.5 year old 32nm 1st gen i7's still perform pretty well when they're clocked in the 4Ghz range.

I think you are the one lying, this is the first time I've heard of this TIM drying shit happening to haswell.
I've met people over on the overclock forum running at 4.8 with fucking insane voltages for even longer than I have, and the only complaint I think I've ever read was someone who had to downclock to 4.7 because 4.8 became unstable after years, presumably from degradation caused by the voltage.
I'm using stock TIM.

Extreme overclock playing far cry 5

Attached: overclockXTREEEEEEEEEM.png (132x50, 1K)

tough luck.

$500 custom water cooling loop, just cleaned and put new paste on it

Attached: intel.png (193x153, 2K)

that sucks but my CPU is 6 years old so I guess I can't really complain. I'm really curious to see how much better AMD's and intel's next CPUs will be, that's part of the reason I haven't upgraded yet.

>trusting windows 10 task manager to correctly report CPU speed
top fucking kek haswell jizz tim is shit but you're a bigger shitter for posting that

like I said, tough luck, I'm using a mere h100i (original version to boot) on mine and I do get high 70°C, close to 80°C sometimes.

I upgraded to a 7700K and gained at least 10 to 15 fps in pretty much every game I played. I think I got even more in GTAV. The increase in minimum frames alone was worth the purchase for me. The 4770K is no slouch, to be sure.

It's not just luck, it's guaranteed after a while

why doesn't amd or intel release a cpu that you don't need to overclock to perform well?