This CPU fucks

This CPU fucks.

techpowerup.com/reviews/AMD/Ryzen_7_2700/

Attached: cpu1.jpg (800x600, 88K)


Fucks what? Don't stop there, man! I'm hypothetically dying to know what comes next.

>fucks
I'll take ten.

Attached: zutto.gif (500x281, 737K)

Tell me something I don't already know.
t. owner of a 2700

>diffused in usa

Attached: 1510247959219.png (10x8, 230)

>please click on my links
Fucking pathetic...

Attached: pajeetpowerup.png (1871x718, 1.14M)

What is this? An image for bacteria?

Huh?!

Fuck who OP? Me ? You? My wallet? My trap collection? What faggot?

Fucks intel in the ass after spectre-ng

guru3d.com/news-story/eight-new-spectre-variant-vulnerabilities-for-intel-discovered-four-of-them-critical.html

It's just losing to my 7700K in all games. I guess being a corelet is fine if you don't stream.

It doesn't fuck, it fists Intel's tiny butthole

Attached: DSC_0024.jpg (3840x2160, 2.89M)

>tfw waiting for zen 2 on my corelet

How well does it fuck?

TOP JEJ

Attached: file-46b2082679966bc703.jpg (600x600, 24K)

>This CPU fucks

Attached: 1519342456831.gif (480x320, 1.96M)

tiny?

I love them, had a 1800x and now a 2700x, but the meme 20C and now 10C offset is kinda gay. My fans never really spool all the way down, and that's on both a Scythe Fuma and an Arctic 240 AIO. Anybody know if there is a way to disable the offset? If not, next time I'll just buy the 3700

(((OP)))

Attached: cheese_the_new_milk.png (125x95, 17K)

((()))

Attached: 1525343863446.png (653x726, 84K)

stay mad, gaymerfag

meme it

Attached: xeon_gold.jpg (448x336, 121K)

and the originaru stamfordcheese.com/wp-content/uploads/2017/03/lyburn-gold-cheese.jpg

based 1700x reporting in.

>but muh 144Hz 1080p gaymin

You know, if you're gonna be a sheep and buy a NoVideo 1080Ti to gayme at 1080p pixelfest on a recycled 3D technology, I guess you get a pass at spending too much money on your CPU too.

2500k here, should I wait for zen2 or get the 2700x ?

what an argument. What are you trying to do, bully people into buying your favorite brand?

yes, just like people who start a thread with "this CPU fucks (sic)"

>my 2 years old i5 corelet is still faster than this piece of junk
lel

I was using a 3570k, which is only a refresh of the 2500. I jumped ship and got a 2600x

don't worry, the security fixes for the new Spectre variants will turn your i5 into a glorified FX

>implying I give a fuck about "muh spectre"
Nobody will ever come to my neetcave and have physical access.

No, I'm just saying they're being jewed to the max. They pay a premium for the CPU, graphics card, Gsync monitor, and watercooling, and void the warranty by delidding.

Here's the thing:
While 144Hz gaming is nice, it was never meant to be.
It's just that manufacturers found themselves with a whole bunch of 3D screens to recycle after 3D went to the trash.
Before that, we were all happy with our 60Hz display panels.
Realistically, we should have seen 75-100Hz panels pop up before 144.

Let's pause for a second, to let it sink in.

Playing at 144Hz is difficult because it's an insanely high framerate vs the 60Hz we were at just a few years ago.
Don't worryes, user, we invents Gsync. Now you cans have lower than 144 and still have butter smooth gameplay. Oh, it's $100-200 more than a normal display, though.

So here's the problem:
The same idiots that buy G-sync monitors are the same ones buying Intel+1080Ti to make sure the framerate doesn't drop below 144fps.
What did they pay extra for a GSync monitor for?


If that's what they want, it's fine.

Nobody gives a fuck about your devices, that's right - what matters are the servers everything runs on.

what about freesync monitors? I dont think I got jewed on mine.

This sounds like a conspiracy theory but it's very plausible and close to reality.
If you play "esports" titles (CSGO/Rocket League/Overwatch/Dota/LoL), you don't need more than a quad core or a mid-range card (RX470/GTX1050ti) to hit 144Hz, but there is so much misinfo that people end up buying Intel flagships with the best Nvidia GPU available, along with Gsync monitors, for some reason.

Yeah, freesync is basically free (no pun intended).
The problem comes from wanting to play at 4K.
Vega isn't enough, even with freesync.
Hell, a 1080Ti isn't enough, and then you still have to maintain a minimum 60fps for vsync.
Believe me, I tried both.
I'm sad, because it would all be fixed if NoVideo would just adopt the Adaptive Sync standard.

144Hz is 6x24Hz.
It exists because fucking 3D needs each eye's picture repeated 3 times to keep the image from being dark as fuck, and to keep your brain from noticing each eye is being shuttered alternately.
It's the only reason those screens exist.
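For what it's worth, the 6x24 arithmetic checks out. A throwaway sanity check (the frame rate and flash counts are from the post above, nothing else assumed):

```python
# Active-shutter 3D: film runs at 24fps, each eye's frame is flashed
# 3 times ("triple flash") to avoid visible flicker and darkness,
# and the two eyes alternate, so the panel has to show every film
# frame 3 * 2 = 6 times.
FILM_FPS = 24
FLASHES_PER_EYE = 3
EYES = 2

refresh_needed = FILM_FPS * FLASHES_PER_EYE * EYES
print(refresh_needed)  # 144
```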

how is it?

Let me answer you in his stead.
Went from 3770k to 1600x to 2700x.

It's the exact same thing, it just handles modern games somewhat better.
For any other normal task you'd be hard pressed to notice the difference.

>While 144Hz gaming is nice, it was never meant to be.
>It's just that manufacturers found themselves with a whole bunch of 3D screens to recycle after 3D went to the trash.
>Before that, we were all happy with our 60Hz display panels.
>Realistically, we should have seen 75-100Hz panels pop up before 144.

Realistically, having 60fps monitors was a mistake; console players played on their TVs with a 30fps cap. Realistically we should have had 45fps monitors!

No but seriously, your logic is terrible. Just because somebody wanted to get rid of the monitors doesn't make 120-144Hz monitors a meme

>Before that, we were all happy with our 60Hz display panels.

Imagine being this underage

May I ask why?
Why are we not seeing a whole family of screens from 60 to 144Hz?
And why did 144Hz 1080p screens just pop out of nowhere when 3D was dead?
I personally am perfectly fine playing on 60Hz panels. I don't play memewatch or meme-battleground though, more single player experiences.
Even if I did, I'd be perfectly fine with 100Hz. And then, I'd have adaptive sync handle the dips below that.

Imagine being 35 years old and playing Quake 3 at 25 fps, back when it came out.

>Why are we not seeing a whole family of screens from 60 to 144Hz.
>And Why did 144Hz 1080p screens just pop out of nowhere when 3D was dead?
I don't caaaaaaaaaaare. Nobody fucking gives a shit.
If the 120/144Hz monitors were a product of jew shit fermented in money, I STILL don't care.
Nobody gives a flying fuck, how can you not get it? It's only good for us that we have better options now; why are you turning this into something it is not?
FUCK, you are the kind of retard that will cry about eating eggs because b-b-b-because technically they're aborted fetuses that come out of the chicken's ass :O who would eat that!?

>I personally am perfectly fine playing on 60Hz panels. I don't play memewatch or or meme-battlegroung though. More single player experiences.
>Even if I did, I'd be perfectly fine with 100Hz. And then, I'd have adaptive sync handle the dips below that.
Goooooooooooood for you. I have heard console peasants tell me that 30 fps is all you need; I LITERALLY remember the time when people were retarded enough to believe 24 fps is all the human eye can see, because they were retards.
Literally the same shit applies to you: there is a substantial upgrade to be made by going from 60Hz to 120/144Hz, and you want to deny reality like it's a meme.
You can easily feel the difference.
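The "you can feel it" claim has simple numbers behind it. A quick frame-time comparison (refresh rates are from the thread; the rounding is mine):

```python
# Frame time in milliseconds for a given refresh rate.
def frame_time_ms(hz: float) -> float:
    return 1000.0 / hz

for hz in (30, 60, 120, 144):
    print(f"{hz:>3} Hz -> {frame_time_ms(hz):.1f} ms per frame")

# 60 Hz -> ~16.7 ms, 144 Hz -> ~6.9 ms: each frame arrives ~9.7 ms
# sooner, which also shrinks the worst-case display-side input lag.
```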

>I don't play memewatch or meme-battleground though. More single player experiences.
There are plenty, PLENTY of fast-paced games that are single player, and plenty of older fast-paced games that don't have as big of a requirement.

OK, so next step, (((they))) unveil a $500 1080p panel @288Hz, and you're just gonna buy it?

I played Quake 3 @ 25fps average on my Voodoo 2.
Best time of my life.

This is neat. Looks like the 12nm dies would actually do well in servers.

Attached: efficiency-multithread.png (500x1130, 55K)

As do all Ryzen cpus

Anyways, 1080p gayming is all that's left to defend buying intel.
You can thank Nvidia for that.

You don't understand.
144Hz panels were a desperate move to amortize their failing 3D technology.
You actually bought into it, yielding more shekels to Intelaviv and NoVideo.
Now we're seeing new 1080p displays being spawned at 144+Hz refresh rates.
Here's my prediction:
You guys will still buy the adequate NoVideo GPU to run your meme game on your 1000Hz display, just because.
Here's a tip, though: just play at 320x240.

Only looks good on TV because of the shading effect.

A Radeon RX 580 can handle 1080p though?

To be honest Zen should replace any Xeon in existence already.

For now, it is, @60fps. But I've seen a lot of retards buying 1080Tis to chase max fps at 1080p.

>fucks
I don't know if that's good or bad

it's good if you enjoy 16 dicks in your anus

Funny that 1080Ti is massively CPU-bound at 1080p and is only marginally faster than 1080/Vega 56/Vega 64 at the same resolution.

Intel

>mfw homeserver with undervolted 2700x after upgrading to Zen2 Threadripper

Attached: jazz music turns into free jazz.png (600x715, 327K)

>This CPU fucks.
This is why the English language is bad.
In Russian we have хyeвo ("fucking bad") and oхyeннo ("fucking great"), so we are not confused when using profanities.

I enjoy 16 dicks in Intel's anus. Is that good enough?

> hypothetically
Don't you mean metaphorically?

Go home, Ivan. Don't you have memes to post on 2ch.ru?

Attached: slavs-5.jpg (2954x2446, 1.66M)

2ch.ru has been dead since 2008, I believe. Okay, fine, I am going home.

Attached: 1524832948481.jpg (384x530, 53K)

he could mean "literally".

Like, he wrote "fuck", which obviously is a literary device.

Or maybe it "actually" fucks.

Hhhhehehehehehe

Attached: clock_analysis_compared.jpg (975x503, 51K)

>XFR2

Attached: Neat - some Yamatogawa whore.jpg (401x477, 99K)

>OK, so next step, (((they))) unveil a 500$ 1080p panel @288Hz, and you're just gonna buy it?
Likely not, because I got my 144Hz monitor for pretty much the same price as any other 1080p monitor. The other factor is that 288fps is something only a very few games can pull off consistently, so maybe if you were an esports player playing Counter-Strike at 300-500 fps.
But again, right now we can just about manage 120-144fps minimums; we can't manage 288fps, except in a few select esports games and much older games.

Hey I remember playing quake when my video card couldnt render special effects properly, like invisbility.

>Anyways, 1080p gayming is all that's left to defend buying intel.
Correct, but not completely accurate. High refresh 120-144Hz gaming is what Intel is good at; 1080p is just the resolution where high end GPUs like the 1080 and 1080 Ti deliver enough performance that the game gets bottlenecked by the CPU before the GPU.

>144Hz panels were a desperate move to amortize their failing 3D technology.
>You actually bought into it, yielding more shekels to Intelaviv and NoVideo.
God, here we go again.
Listen, I repeat: I don't care if they are reselling their failed attempt. All that matters is:
>1) Does it offer a tangible benefit that I can feel/perceive and achieve with current hardware?
>2) Is the price fair?
To both of those critical points the answer is yes. The 144Hz monitor I got was around the same price as most 1080p monitors and no more expensive than a 120Hz monitor, so I got that.
So you can cry about the origins all you want, but all that matters is the end deal for the consumer, and it was good.

>You guys will still buy the adequate NoVideo GPU to run your meme game on your 1000Hz display, just because.
>Here's a tip, though, just play at 320x240.
>meme XD
>exaggerate as much as possible
>listen up kid :) gonna give you a tip
God you are cancer

lol wait, even my 4790k does 8 threads at 4.4GHz.

Though on the other hand, it flat out cannot do more than 8 threads.

>This CPU fucks.
*intel in ass, desu

It gets beaten by an 8700k in gaming, virtualization, and machine learning alike, though. Can you guys read the actual results of what you post and quote?

>2700 OC 4Ghz 5.4kJ
>2700 3.2/4.1Ghz 4.1kJ

wat.

Has anyone tried popping that into the laptop with the R7-1700?

Overclocking is less power efficient. Who knew?
Still the most power efficient high performance processor by far, though.
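Using the kJ figures quoted a few posts up (5.4 kJ for the fixed 4 GHz OC run vs 4.1 kJ at stock 3.2/4.1 GHz), the efficiency hit is easy to put a number on; a quick back-of-the-envelope, nothing more:

```python
# Energy per benchmark run, from the figures quoted in the thread (kJ).
stock_kj = 4.1   # 2700 at stock 3.2/4.1 GHz boost
oc_kj = 5.4      # 2700 locked at a 4 GHz all-core overclock

extra = (oc_kj - stock_kj) / stock_kj
print(f"OC run burns {extra:.0%} more energy for the same workload")  # ~32%
```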

If you pay for power efficiency instead of performance, you can get a MacBook or one of those laptops with Intel U CPUs. If you spend the same money to get less performance and still shill it over the muh delid and muh high temps memes, you are not very smart.

I guess 9% more performance just isn't worth 50w more power and $30 more for me.

>$30 more
more like $130 for a tower cooler good enough to let it turbo for more than 0.05 seconds

>THIS CPU IS PROGRAMMED TO FUCK YOUR MOTHER

Attached: 1522003198928.jpg (640x632, 59K)

Where is the 2800x?

Attached: __padparadscha_phosphophyllite_and_yellow_diamond_houseki_no_kuni_drawn_by_morino_itsuki__870c5fd6bc (1050x1382, 1.13M)

>This CPU fucks.
What did she mean by this

They're waiting for the 9700k.

Basically if you weren't playing doom on a fucking strobe light you don't deserve to have a nice monitor

Go into your UEFI and set your fan curve manually with the 10C offset in mind, faggot.

Can't wait

>9700k
might be called 8800k on their NEW Z390 Chipset
>a socket each year keep the goyim in fear

it's been only half a year

Attached: flat1143.jpg (250x248, 14K)

I'm gay

Attached: 1899.jpg (1600x900, 108K)

Use precision boost for dem 4.35GHz boosts though

I'm sick of Intel

What's the best AMD cpu for mid gayming?

Intel 8400K

>Intel 8400K
There's no K version, insufferable kike.

>Listen, I repeat I don't care if they are reselling their fail attempt
Then waste money on refurbished failed tech while they laugh all the way to the bank, dumbass. Retards like you are ruining the DIY PC market.

I think the R5 2600 is the best bang for the buck, but it depends on the use you're going to give it. If it's just and exclusively gaming, Intel might still have an edge.
That said, AMD has two things going for it:
>easy and cheap upgradability with AM4, while Intel is about to launch a new socket
>big gains in the near future with a 7nm process that still looks promising, while Intel's 10nm got delayed again and, as admitted by their own engineers, won't bring much of a performance bump
Those two things alone might offset the slight advantage Intel has in gayms

2600X, it's worth it over the 2600 for the higher boost clocks. See:
The 2600 clocks even lower than the 2700 IIRC, but look at how badly the frequency scales past 4 cores. I can't imagine it being different for the 2600.

Not him, but I turned it off because the motherboard likes to shoot the processor with a lot of voltage at random times, so I just manually overclocked all cores to 4.2

According to an AMD engineer on leddit, it's actually better to have low core voltage during idle and brief spikes to 1.4-1.5v than to have your CPU running at, say, 1.33v at all times, and the voltage isn't necessarily representative of how the power/load is being distributed. That said, right now mine is turboing to 4.1 on all cores, so for some things that OC is definitely still a bit faster than mine.

I have different voltage steppings depending on the clock speed.
You can manually add it all in the BIOS; sure, it takes a lot longer than just setting the multiplier and voltage, but I think it's more stable.