Zen2 ES: 3.7 base/4.5 turbo

OH NONONONONO

Attached: install zentoo.png (947x112, 37K)

Other urls found in this thread:

ark.intel.com/products/186605/Intel-Core-i9-9900K-Processor-16M-Cache-up-to-5-00-GHz-
twitter.com/SFWRedditVideos

Engineering samples usually clock a lot lower than final silicon.

fuck, subject should've been QS instead

QS actually means Qualification Sample, meaning it is the final test chip before the production ones. So basically, if it's 4.5 GHz we're fucked.

A QS (Qualification Sample) is identical to any OEM or retail CPU out there: it's the commercial version of the chip, usually sent out to OEMs to let them validate their hardware, BIOS, ACPI etc etc against the final silicon.

Ah yes, I'm sure some random on faceberg has all the inside info

>Rumor
Why did you make this thread?

> we're fucked
Wat? Are you referring to the fact that 9900K is 5GHz in turbo?

>as credible as adoredtv's
probably just made up crap then?

>all-core boost is 4.1-4.3
INCELFAGS ON SUICIDE WATCH

Attached: whatevs.png (852x160, 38K)

>4.5 GHz when Intel's hitting 5.0 stock

We already knew this wouldn't clock for shit when their best case scenario demo has them only matching Intel while trying to distract us with muh power consumption

the 9900K in the demo was running at 4.7 turbo

Please, no more. It hurts.

4.1, ouch. The 9900K is 4.7 all-core I believe.

That's good, but ark shows 5GHz. ark.intel.com/products/186605/Intel-Core-i9-9900K-Processor-16M-Cache-up-to-5-00-GHz-

that makes things even worse for Intel then

5 GHz is only on 2 cores max. Cinebench uses everything.

No bros ..... I thought adoredtv confirmed 5.1 GHz ...... this can't be happening ....

Attached: 1479574775361.png (653x726, 42K)

m-maybe it will be a good overclocker

Nooooooooooo not again ...... not another Bulldozer ........ I can't take this shit no more

Attached: ayy.png (649x255, 120K)

So it's actually 4.6 max boost (assuming XFR2).

That's not awful, and it more or less fits with what ARM saw from moving to TSMC 7nm, a ~5% boost in max clocks. It's a little disappointing. At a 4.2 GHz all-core boost we can probably assume about a 7% IPC increase, modest but good for an already high-end design. It's possible there's still a 95W design waiting somewhere out there, probably later down the line when they get a better handle on the 7nm design and have a new stepping. I wouldn't rule out 4.8 GHz, since AMD is actually using a less dense library IIRC, so there should be a little more room for performance. I would also expect 12-16 cores later too, maybe as a half-year update, or maybe even a refresh in 2020 instead of Zen 3, since we know EUV will only ramp up in late 2019 or early 2020 at best, so I doubt we'll see Zen 3 and the so-called "7nm+" on AMD's roadmaps any time soon.
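To put a rough number on that IPC guess, here's a minimal back-of-the-envelope sketch (Python) of turning Cinebench MT scores and all-core clocks into an implied per-clock gain. The 1800 cb at ~4.0 GHz figure for the 2700X is a claim made later in this thread; the 2000 cb score and 4.2 GHz clock for the ES part are placeholder assumptions standing in for "roughly ties a 9900K", not real measurements.

# Crude per-clock (IPC-ish) comparison from Cinebench MT scores and all-core clocks.
# Every number below is either a claim from this thread or a labeled placeholder.

def per_clock_score(score_cb: float, all_core_ghz: float) -> float:
    # Cinebench MT points per GHz of all-core clock; ignores SMT and memory effects.
    return score_cb / all_core_ghz

zen_plus = per_clock_score(1800.0, 4.0)   # 2700X figure claimed later in the thread
zen2_es  = per_clock_score(2000.0, 4.2)   # hypothetical ES score if it roughly ties a 9900K

gain = zen2_es / zen_plus - 1.0
print(f"Implied per-clock MT gain over Zen+: {gain:.1%}")   # ~5.8% with these placeholders

Swap in whatever scores you actually believe and the estimate moves accordingly; the only point is that "matches a 9900K at lower clocks" implies a single-digit per-clock gain, not a miracle.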

>can't clock high
>65W
You know there will be a 95W "X" version, right?

why is that supposed to be bad? a little OC and it should run at 4.4/4.9 no problem

Of course, but here it's 4.5 max for AMD.
get down, wojak forces have arrived.

>So it's actually 4.6 max boost
The 2000 generation (XFR2) doesn't work like that anymore, and 3000 will probably be the same.

XFR1 was mostly a hack replaced by Precision Boost 2.0/XFR2.

fuck that scottish cocksucker for making me believe ....

Attached: 1544314957980.jpg (568x612, 66K)

>as credible as adoredtv's

Just a reminder he said 4.8

Attached: AdoredTV.png (1306x839, 800K)

THIS CANT BE HAPPENING BROS INTEL CANT KEEP WINNING

Attached: 1536678083431.png (1300x599, 312K)

What? The image literally says 4.6 GHz peak

But the image you posted says the 65 watt part was only supposed to hit 4.4?

I ... will buy it anyway .... fuck intel ...

Attached: 1526565781243.png (1228x1502, 1.07M)

>his sources are as credible as Adoredtv's

Attached: malos think 2.png (170x296, 107K)

Still only hit 4.1

Attached: poozen.png (854x640, 645K)

assuming he's not full of shit, this is the 65W part, so 4.4GHz. not final clocks, as Lisa said

That 16c/32t 5.1 GHz part looks really ridiculous now.

Wot? I posted no image; I'm not the OP or anybody else. I just came in, saw you being wrong about the 2000 and later Ryzens, and corrected you. XFR2 doesn't give an extra 100 MHz anymore; it's included in the advertised max clock.

>Not posting the new leaked spreadsheet
cmon

Attached: new list.png (1483x839, 847K)

Narrator: It was final clocks.

he is. You don't have to be in this industry for long to know Jim just made it all up, or fell for somebody's lies.

>4.5ghz?
>what do you need 4.4ghz for?
>help! amd is running at 4.1ghz!

Are you retarded? I didn't say anything about Ryzen 2000. The image I'm replying to is the "leaker" saying the peak boost is actually 4.6 GHz. Are you mentally ill?

Ok, guess we were both talking about something else.

So a Ryzen 5 3600 (non-X) can't clock past 4.5 GHz but matches the i9-9900K in multi-core Cinebench at those shitty clocks, and that's bad?

Noooooo we got too dang cocky again

Don't you know the more cores you have the higher you're able to clock. For some reason.

And at 65 watts too?

BTW, the last somewhat credible stuff I saw recently suggested 4.0 or 3.7 max boost clocks on samples.
The other stuff is likely all fake.

MT performance and IPC will likely improve a lot though, if 4.0 GHz is what the sample ran at in the CES Demo.

>low speed
it's gonna be shit in real-life applications ... again ....

yes goyim you need a 5GHz housefire

>less clock
>somehow better
Incredible what amdrones are doing.

The 2700X with a 4 GHz all-core boost could already hit 1800 in Cinebench MT. AMD's SMT is better than Intel's HT by some amount, like 10-15% I think.

But how? If it's matching the 9900K in multithreaded applications, it can probably boost higher for single threads. It'll be fine.

Maybe software will finally use more cores soon.

>cinebench score matters
you understand that an OC'd 2700X can hit around that score and still get trashed by a 7700K, right?

>does more work with lower clocks
>bad

> past 4 GHz
> it's gonna be shit
Dude. Seriously now.
Every time I see somebody fapping over single-thread clocks, I wonder: what do they need those clocks for? In 2018, when even browsers run tons of processes instead of one big process? Granted, some ffmpeg transcodes take up exactly one thread, but that's when they're experimental and not meant for everyday use. Tons of people here use chinkpads with 2.2-2.7 GHz clocks and they aren't complaining either.

you need a fucking miracle for a 2700X to get 2000+ cb

It's fucking over bros. We've been scammed by amd yet again

What the FUCK happened bros? I thought 7nm was going to bring us over 5 GHz .... Intel just keeps beating us even with their 14nm++++++ ......

>MUH NIGGAHURTS
Go back to 2002 you stupid fuck.

I'm done with this shit company. First a 700 fucking dollar card and now this shitty CPU that can't even hit 5 GHz

> 15 posters in thread
> counter hasn't changed
You aren't fooling anybody, you shitposting twat. No (You) for you.

It can hit that at 4.4 GHz IIRC (which does need exotic cooling, but the point is it's not that far above its ceiling).

>3.7GHz base
That's embarrassing.

Attached: 1526061229760.png (587x544, 239K)

this is actually the dumbest comment in the entire thread

>low clocks again
wew, enjoy getting 30 fps less than Intel

>4.1 GHz for AMD
>4.7 GHz for Intel
>They score the same
OOOOOOOOOOOOOOOOOOOOOOOF

So wait, they have a new arch and a node shrink and they still can't match Intel's single-thread performance? What the hell is AMD doing?

>7nm
>no 5 Gigglehertz

Someone explain.

AMD is fucking finished (And that's a good thing)

AS CREDIBLES AS ADORED

FUKCIN KEK

Also the Intel is a housefire and your money goes to israel

AMD said they're keeping the same TDPs ... that makes those 125/135W chips fake ... correct?

Attached: Thinking Ren.png (290x290, 173K)

they said that about Vega 2, are you sure they said the same thing about Ryzen 3000?

Attached: mmmh.jpg (1280x720, 169K)

Attached: 1_v3vvVO3DuvEB-osQDcIqlw.jpg (600x630, 30K)

>obvious IPC advantage gained
>HURRRR DURRR It's Slow
how stupid can these incells get?

This is a fake image btw.

>R5 literally trading blows with an i9
>somehow bad for AMD
how the fuck are intel shills this retarded?

Embarrassing, just like your sex life.

Ryzen was purpose-designed for cost-efficient datacenter use and the design was extended to HEDT. Intel tries to be as general as possible, except cost-efficient. AMD looks like they might win datacenters, but Intel's still a little ahead in HEDT despite the "housefires", and mobile is Intel's win with no contest, especially since Intel co-authors a lot of the stuff that goes into laptops, like NVMe Optane, 1W displays, and laptop form-factor references (ultrabooks and flip convertibles).

>spamming MHz myth in 2019

t. Netburstarded

this is all they have, in 2019.

the i9 really has become Prescott 2.0

Are you stupid? A lower-midrange 65 W ES part just beat the 9900K. If the chip's IPC is that much better than Intel's, it's they who are fucked.

Attached: 54456.png (261x77, 5K)

>what is speed binning

Each day the scottish cocksucker's fake list keeps getting smaller and smaller

Attached: based.jpg (400x400, 44K)

Hey Tim, when are you going to reply to this?

Attached: 324.png (573x164, 18K)

Stop pointing this out!!! IPC doesn't matter!!!®

Attached: 1515589604418.png (552x661, 288K)

What CES leaks? None of them came true

You pajeets enjoy getting scammed it seems

Attached: 22.png (812x794, 71K)

Maybe now that the Core/P3 arch is garbage full of security holes and shortcuts for the sake of benchmarketing, Intel can revive P4 NetBurst for more MHz? More MHz means best, right intards?

>it beats the 9900k
barely, and in a benchmark that heavily favors AMD.
let's face it, people only care about gayman performance and it's pretty obvious that the new Ryzen is still behind

>heavily favors amd.
since fucking when?

>amd used a 7700k to test games on their vega II
yea im thinking ryzen 3k is DOA

Attached: lele.png (1037x28, 5K)

Good thing we have people in this thread who know how to make one CPU thread predict the results of another thread's work.

it's an R5 catching up to an i9. If that still doesn't trigger any realization in you, then sorry, you're truly retarded.

or it still has bugs to iron out.
or they've been using the same test bed since forever.
weird that they're not using something like an 8700K or a 9900K, since both of those are vastly higher performance than a shitty 7700K, right?
Or are both of those chips DOA now too?