AMD Ryzen 7 2700X 10% Faster Clock by Clock Bench

wccftech.com/amd-ryzen-7-2700x-gaming-benchmarks-vs-1700-at-4ghz-10-faster-on-average/

Attached: Ryzen-7-2700X-Gaming-benchmarks.png (1920x1080, 496K)

Other urls found in this thread:

veg.by/en/projects/nfs3/
youtu.be/BjlN4C658es
twitter.com/NSFWRedditGif

woah fashy stats bro

Attached: DSz08-yXUAEGlnr.jpg (901x1200, 121K)

>10% Faster Clock by Clock

Attached: 1519342456831.gif (480x320, 1.96M)

So still about 15% slower than an 8400, let alone an 8700K?

Intel's IPC is inferior now.

it's not exactly apples to apples here since the 2700X has more aggressive dynamic clock boosting.
still good to see overall though.

Still worse than Intel in gaming

>this is what AMDrones actually believe
Can't wait for the real world benchmarks to show it lagging behind even a quad core Skylake again.
>B-B-But we won the Cinebench!!!

Attached: sweetie.jpg (1528x1530, 188K)

They're both at 4GHz dumbfuck.

>Still worse in old games
ftfy

>All those old single thread heavy games!
>Muh 5.3Ghz platinum samples!

> what is XFR(2)

Attached: 1522362379295.png (1296x511, 409K)

Attached: 1522362284756.png (1378x564, 566K)

>wccftech
can the mods ban this plagiarized pajeet poo rag already?

I'm not even going to post Hardware Unboxed's bullshit.

I would rather they ban you.

>Any benchmark where Poozen loses doesn't count!!!

Attached: NOOOOOO.jpg (552x661, 71K)

>AOTS CPU bench gives a 2% improvement
So either the benchmark is shit or that's the actual improvement

>i'm not gay, but 10% is 10%
ok, but i would rather see a power consumption graph

They do count as long as you have a fair balance of games and take outliers into account. When you put a game that is 40+% faster on Intel into the total average, it will skew the result.
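
The point about outliers skewing an average can be sketched in a few lines: with four near-ties and one 40% outlier, the arithmetic mean overstates the typical gap, which is why reviewers often report a geometric mean of per-game ratios instead. The numbers here are made up for illustration.

```python
# Illustration (made-up numbers): how one outlier skews the arithmetic
# mean of per-game relative speedups, and why the geometric mean is the
# more robust way to average ratios.
from statistics import mean, geometric_mean

# Four near-ties plus one game that is 40% faster on one side.
speedups = [1.02, 0.99, 1.01, 1.00, 1.40]

arith = mean(speedups)           # pulled up hard by the single outlier
geo = geometric_mean(speedups)   # less sensitive to one extreme ratio

print(f"arithmetic mean: {arith:.3f}")  # ~1.084, i.e. "8.4% faster"
print(f"geometric mean:  {geo:.3f}")    # ~1.074, closer to the typical game
```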

5.3?
5.4/5.0 w/ 4000c12

Attached: 13247509457687.png (1437x874, 369K)

Wojak posters must be banned desu, enough of this childish brand wars bullshit

Fake and gay repost

That's about 2x more than Broadwell to Coffee Lake, so around 4 years?

Disabled when overclocking or undervolting, like any other power management feature of Zen

>gaming
What about applications for grown-ups?

Why do you keep reposting this? Someone won the silicon lottery, why should we care?

10% or so improvement over Zen for everything else.

Yeah, 10% faster with a 1080fucking Ti on 1080p, lmao.

Put a regular 1080 there and that'll drop to below 5%

>Magical nerf ((((updates)))) don't count

Attached: 1515145992968.jpg (691x771, 64K)

>Cherrypicking this hard.

>Poozen magically ahead in CPU-Z benchmark
>Poozen way behind in all real world applications
>Th-There was nothing wrong with the CPU-Z benchmark! Intel paid them to change it!!!

Attached: tfw shitting in the street.jpg (568x612, 66K)

So what are the kikes gonna cry about now?

Probably that AVX512

Attached: 1517674363587.png (868x756, 265K)

Did you forget when 1800x was beating 6950x in productivity despite being 8 vs 10 cores, shill?

Attached: 1492138786706.jpg (1845x1923, 1.25M)

Haha wow, Intlel damage control is out in full force. AMD themselves said only a ~3% IPC gain between Zen and Zen+, which is exactly what we see in this guy's benches. HOWEVER, we are also seeing performance gains of up to 14% in some applications due to improved memory latency. Intkek shills are just upset because they haven't seen an improvement number between generations bigger than 4% in a long time, ON THE SAME MOTHERBOARD :^). Did I mention the stock cooler that comes with the CPU will likely be good enough for a 4.2 all-core OC? Sure, Zen+ won't be as good in gaymen, but I feel good when the company that made the CPU is committed to more than 5% real world gains between generations, dropping prices, giving value (stock cooler), and not fucking me in the ass every year with a new motherboard.

>look mom! I posted an image from HWBot again!

Attached: image_id_1798834.jpg (1680x1050, 531K)

>zen2
>new uarch
>5GHz capable lithography
>going against Skylake core +2% in 2019

It's gonna be another Auschwitz

>4%
1%*

Attached: 1523829833210.jpg (267x297, 16K)

a real one this time? hell yeah

>tfw I want to buy into AMD's moar coars but they really don't matter because I don't sit at my computer all day running benchmarks

It's a regular 1080, scores should be even better with a 1080ti

No, because that never happened. And of course if you overclock the 6950X to the same clockspeed, it runs away even further over the horizon. Not to mention that it can clock higher than Poozen, making the gap bigger still.

Attached: amd_ryzen_7_1800x_review_-_benchmark.png (960x540, 242K)

Intlel shills still on Ryzen Derangement Syndrome kek.

Attached: 1523844466180.png (1070x601, 495K)

It's fine to pay a premium for the better product.

Where is my 12 nm threadripper, AMD?

Kek kill yourself clinically retarded intel shill, 8 Zen cores > 10 intlel cores

Attached: AMD-Ryzen-1800X-Intel-6950X.png (617x837, 28K)

so with that and the clock improvements we are looking at a 17-18% increase?

Attached: Timestream-Navigator-Rivals-of-Ixalan-MtG-Art.jpg (1000x735, 145K)
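
A rough answer to the question above, as a sketch: performance scales with IPC × clock, so independent gains multiply rather than add. The 10% figure is from the OP's clock-for-clock benches; the ~7% clock bump is an assumed example (4.0 GHz → ~4.3 GHz), not a measured number.

```python
# Sketch: combining an IPC gain and a clock gain multiplicatively.
def combined_gain(ipc_gain: float, clock_gain: float) -> float:
    """Overall speedup from independent IPC and clock improvements."""
    return (1 + ipc_gain) * (1 + clock_gain) - 1

# ~10% clock-for-clock plus an assumed ~7% higher clock lands right in
# the 17-18% range asked about above.
print(f"{combined_gain(0.10, 0.07):.1%}")  # → 17.7%
```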

ryzen 7 2700x can go to 4.7 ghz with the included wraith prism

you heard it here first folks, from a procrastinating amd engineer

you're welcome

Lol good b8post lad

>ryzen 7 2700x can go to 4.7 ghz with the included wraith prism
>you heard it here first folks, from a procrastinating amd engineer
>you're welcome
proof? everything I've heard is that the 2000 series got +300ish MHz

Attached: 8EfBsJC.png.jpg (2000x2000, 435K)

Uh sorry to rain on your parade sweetie but you can't improve IPC AND increase clockspeeds in a single generation. Not even blessed Intel® Corporation is capable of that.

Attached: 1499957247669.png (653x726, 84K)

I don't know if people are modifying HWBot results, but if you're running 4000 MHz CL12 memory there's no way you're on air or water. That is impressive even for LN2. It would also explain why he's using the Z170M OC Formula, which was record-breaking for RAM overclocking. The limited power delivery of a micro-ATX Z170 board and the prioritizing of the RAM OC would explain why it's a relatively low CPU OC for LN2.

Attached: 1505913878055.jpg (960x686, 329K)

AMD boys are still mad. Like. HOW? Hahhahahahah

Attached: 1523843036087.png (900x844, 274K)

Nice, mods are finally doing something about this rampant shitposting.

Wow, there's literally a mod deleting any posts in here that say anything bad about Poozen. Go fuck yourself you street shitting Indian.

Attached: 155.131.1776272476.jpg (320x240, 16K)

He deleted AMDfags shitposting too, retarded drone

Sieg Heil brother!

Attached: ANP.png (726x720, 230K)

>average intel cpu review sample

Post your ID card or I'm not believing you

I bought an Intel because I play single threaded old games, for every other activity my old cpu was just fine.

>old

People say that but never quantify exactly how old. I play a lot of NFSIISE for example, and CPU performance is irrelevant because any CPU made in the last decade is overkill for it. Hell, when the game was made, dual cores basically didn't exist in the consumer space.

Attached: No invisible barrier to save me this time.webm (720x405, 2.87M)

>1070 vs 2700
>cherry p...

looks fun

Nfs3 is better

It is, but i'm in a NFSII mood.

Attached: 1998 never looked so good.webm (720x405, 2.86M)

oh shit I remember that track from my childhood... there was a shortcut through some cave with an altar or some shit
I think I still have the disk somewhere

>AMD sends CPUs that can hit 4GHz to reviewers
>Most people can't hit 3.9GHz on early bins
>AdorkTV conveniently ignores that.
AMDrones everyone.

Technically you're thinking of the expert version of the track, Lost Canyons. That webm is of the easier version, Red Rock Ridge (the time of day gives it away).

Also: if you do find your disc, go nab the modern patch.

veg.by/en/projects/nfs3/

Attached: Aquatica tunnel v2.webm (720x405, 2.02M)

>±100MHz difference
>"golden sample"
Also, most reviewers could only manage these clocks with insane voltages (like Tom's 1.5V). Stop spreading lies, Intelcuck.

thanks user

I don't think anyone doubts that AMD sends golden samples to reviewers.
Nobody doubts that Intel does the same, either.

problem is that there was no proof. Now there is evidence and you faggots freak the fuck out, why?

The Intelfanboys do. Not every 8700K does 5 Ghz at a moderate voltage, let alone more.

I wait for real benchmarks in 2-3 days

Not hitting 4GHz on a 1700 is a motherboard issue, not a CPU issue. Virtually every 1700 hits 4GHz at 1.4v fine on a good board. AMD just sent reviewers good boards and it's not their fault that people buy shitty ones.

How are these not real benchmarks? He directly compares a 1700 and a 2700X locked at the same clockspeed, on the same motherboard, same RAM, same everything, over a wide variety of benches. Here is the review where wccftech got their shit from.

youtu.be/BjlN4C658es

>10% IPC on a measly refresh

>what is cache
>what is memory latency

>on an old chipset
>with slower RAM

this, intel really should learn from amd on this one

Tell that to my 1700 on a Crosshair VI Hero. It will boot at 4GHz, but the second you put it under load it crashes the system. I've managed to get it stable at 3.8GHz.

What do you think goys? Replace my 1700 with a 2700X? I wouldn't need a new cooler as the one that comes with the 2700X is definitely good enough, and I have a Taichi so the power delivery is great already, but the problem is I have some shit 2933 2x8GB Corsair RAM. I could probably sell the 1700 with the cooler for $250~ CAD and the RAM for $140~. The whole upgrade of new CPU + RAM might run me $300~ net for what would be a 30%+ performance gain. Then I would just have to wait for the used GPU market to crash and pick up a decent card at a deal.
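
The upgrade math in the post above can be sanity-checked in a few lines: the resale figures ($250 for CPU + cooler, $140 for RAM) are the post's own estimates, and a ~$300 net cost implies roughly $690 CAD of new parts.

```python
# Sanity check of the upgrade math (all figures CAD, from the post above;
# the implied new-parts budget is derived, not a quoted price).
sale_cpu_cooler = 250   # estimated resale of 1700 + cooler
sale_ram = 140          # estimated resale of the old RAM
target_net = 300        # the post's estimated out-of-pocket cost

proceeds = sale_cpu_cooler + sale_ram
new_parts_budget = target_net + proceeds
print(new_parts_budget)  # → 690
```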

no, wait for the 3000 series for a serious performance jump, and buy everything new by then

up to you.
I'm happy with my 1600X. While it's nice to see such a performance increase from just a "refresh", I feel no need to upgrade.
I'll probably wait until DDR5 to upgrade unless the 3000 series is super attractive.

There are actually rumors that you'll be able to choose between HBM3 (or something else) and DDR5 for a future Ryzen arch, instead of it being DDR5 only.

>massive fast L4 cache vs DDR4 with double the frequency and double the latency
why is this even a choice?

Once you move HBM off the package it loses the low-latency advantage it has over GDDR. Ergo, if you move it into RAM slots it'll be as bad as regular RAM, probably worse due to higher command delay.

Locality is important, and it's why the Radeon Pro SSG (Vega) was so special even though it's just accessing memory and storage using DMA, which has been a common feature in GPUs since like 2010

a higher improvement than the combined improvement of 3 generations of intel cpus?

b-but muh 5GHz AMD still B-BTFO
[helpless sobbing]

If AMD can pull off a 15% IPC gain with an actual uarch change in Zen 2, then they don't even need 5GHz. Chances are they'll pull off a 10%+ IPC gain with Zen 2 and deliver 5GHz, and 12 cores to boot.

Intel is really fucked in DT

I think it'd be better to work on GHz and overall IPC instead of cramming more cores at lower speeds for now

There's gonna be more cores and how many cores it has has no bearing on the clockspeed since Turbo is a thing.

but is it faster than my 4690k at 4.3ghz for gayming though? as far as i remember the first gen ryzen had comparable performance to sandy bridge in games

Still, why not the 1700X if you're comparing to the 2700X?
Or why not the non-X 2700 against the 1700?
I'm confused, I'm probably mistaking something, can someone clarify?

I hope it becomes true; Intel's eight-core mainstream chip will beat the 2700X. But it's still a long time until its release, and Zen 2 is scheduled for 2019. When does Intel bring 10nm? 2020?

>When does Intel bring 10nm? 2020?
Most likely never.

DELID

Attached: 1495812218308.png (1228x1502, 944K)

All Ryzen 7 1XXX chips are exactly the same; the X only marks higher-quality silicon, which only means a potentially higher OC

So a higher-quality 2700X vs a typical 1700... Doesn't seem like a fair comparison. But that's why the clocks are locked to the same GHz, right?