AMD IS NEW GAMING KING

RYZEN 7 3800X BEATS 9900K AND HAS HIGHER SINGLE CORE THIS IS NOT A DRILL, INTELFAGS ON IMMEDIATE SUICIDE WATCH

Attached: chrome_cTatb22ma5.jpg (2322x885, 203K)

I guess it's true that with Jews you Lose.

>BUT ITS NOT 5GHZ

Attached: 1506507630702.png (992x1043, 614K)

>beats intel on their own turf without even having to resort to housefire clocks
Are 9900K owners the most cucked people on the planet right now?

lel just wait for 10900K

My guess is 7700K owners. The new Zen 2 only matches the 9900K, which launched 6 months ago.

So emulation with AMD won't be a meme anymore?

1090watts

So is the 3900x just going to be really tough to cool or what? I was surprised it had the same TDP.

Put me in the screencap, you rascals!

Attached: 69F1F27C-B1F2-4380-91EA-D0B925C15A91.jpg (620x859, 129K)

Matches it at a lower price point, TDP and frequency. If these OC even remotely well they'll be the new de facto gaming CPUs.

How is it there, in 2017?

If you own a 9900k, cost was never the issue, nor was being able to afford a good cooler.
However, the new Zen 2 is technically superior because of PCIe gen 4.

>cost was never the issue
Cost is ALWAYS an issue. It might matter slightly less the higher up the product stack you go, but it never stops mattering completely. These aren't 20 grand server parts we're talking about here.

>intel apologists trying to justify the 9900k's existence
fucking pathetic

Attached: 1527629778452.jpg (679x758, 54K)

2 chips means twice the surface area for cooling
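Taking that at face value, the arithmetic is just heat flux = power / area; the die area below is made up for illustration, not a real spec:

```python
# Same power spread over twice the area halves the heat flux per mm^2.
# Taking the post's premise at face value; the area figure is hypothetical.
tdp_w = 105.0            # advertised TDP in watts
die_area_mm2 = 74.0      # assumed area of one chip, illustrative only

one_chip_density = tdp_w / die_area_mm2
two_chip_density = tdp_w / (2 * die_area_mm2)

print(f"one chip:  {one_chip_density:.2f} W/mm^2")
print(f"two chips: {two_chip_density:.2f} W/mm^2")
```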

>lower FPS
>beats 9900k
Wut? Not to say it isn't doing great in that bench, but I don't think PUBG was the worst case to begin with. The worst case was single-threaded, cache-thrashing games like FO4.

You're talking the difference of $300 on a complete system - $2000 vs $2300
That's nothing especially considering that you'd have this level of performance much earlier.

>lower fps
144 > 147
are you retarded or something

Attached: 1547419788322.jpg (588x823, 117K)

he buys intel, of course he is.

It'll be a hair faster than the 3900X but it will debut with a $600 price tag that will immediately balloon to $1000 because of stock issues

Attached: INTURDSLOW.png (859x556, 3.65M)

How many CCXs per chiplet? Are CCXs still a thing? What's the latency penalty of the I/O die?

yikes AMD and its incel fans are finished

INTEL SWEEP INCOMING 100% MARKET SHARE

was the 9900k running at actual stock, or enhanced all-core turbo fuckery "stock"? there was nothing happening in that benchmark. need moar benchmarks.

>That's nothing especially considering that you'd have this level of performance much earlier.
I don't know, 9 months isn't a great shelf life even for something that moves as fast as CPUs. Might be nothing to you but I wouldn't project that mindset onto the entire consumer base.

>1600x
not good

more curious about the navi stuff. it's finally an arch split off from compute.
1.5x perf per watt compared to vega, I take it
can anyone give an estimate? them comparing it to the 2070 isn't promising
hope a 5900 exists.
I'm probably getting a 3800x to replace my 1600x.

you mean in emulation? because AAA gaming is pretty great on it.

fucking nice i will buy ryzen when my build becomes outdated, i've waited long enough for amd to fuck over intel single core pref

What if I told you I'm happy with my computer and I don't need to chase the latest technologies?

What if I told you you could too?

You're cheerleading for a corporation that barely cares about you. Do you do it for free?

if it's reasonably priced, 10% faster than a 2070 is actually really good. if it's 500 bucks it's pretty meh though

what if i told you that my ivy bridge can't even run 2 hentai videos after the meltdown security patches
what if i told you i can upgrade to something 230% faster for 50% of the price intel offers

how is that bad thing

>them comparing it to 2070 isn't promising
Depends entirely on how many CUs and how much TDP it uses to go against the 2070.

you are a stupid nigger ive been waiting for a good reason to upgrade since 2013

I'd tell you to post about it in one of the other 100 threads about this bullshit if you're so excited about it.

This is literally what he said. You'd know if you'd actually bothered to read his post instead of eating Cheerios with one hand and masturbating to AMD's CEO with the other.

Yeah, it's great when you turn up the settings so the CPU is out of the equation. Otherwise it's basically the modern Bulldozer.

>I'm poor and I haven't been able to scrape enough money together to buy a CPU in 6 years.

Your 4-core CPU has been outpaced by both AMD and Intel for years in every single application written since.

This right here is a major case of cognitive dissonance.

It won't be faster overall; Strange Brigade runs better on AMD GPUs. It'll probably be neck and neck until nvidia launches their RTX refresh.

so, always? no drops, no stutter, 1440p
perfect cpu.

>Basically modern bulldozer

Ryzen is nothing like bulldozer, it shares none of the same design.

It's not about not having money. It's about when and how you spend it. Some people didn't need new hardware because they stuck with 1080 for the most part. I want to build another system that will stand up for 8 years, like mine and many friends' systems still are. Not everyone thinks upgrading every few years is smart or necessary.

It did NOT stand up for 8 years, unless you live in endless-compromise world - and you do.

Don't worry Facebook and YouTube will still run the same for you. As will everything else you use.

Attached: 1515197493906.jpg (690x388, 45K)

*flosses on intelboomers*

Attached: 1539298837768.gif (284x264, 69K)

someone competent, do the math on this thing compared to vega
is it GCN with the compute cut down? will they finally fix the front end? it was mentioned

Attached: COMPUTEX_KEYNOTE_DRAFT_FOR_PREBRIEF.26.05.19-page-011_575px.jpg (678x181, 18K)

>intel: female character
>amd: male character
interesting

How much better is this than my 4790k?

The design is different, sure. But the performance is still complete dogshit compared to its competitors.

about twice

>ITS NOT 5GHZ

Attached: INTURDBTFODED.png (391x117, 360K)

We don't know and they are intentionally being vague as fuck about it. We'll know more at E3 when they have to spill the beans in about 2 weeks' time. But from what Anandtech has summarized at anandtech.com/show/14412/amd-teases-first-navi-gpu-products-rx-5700-series-in-july-25-improved-perf, there is still enough to make a guess.

My take is that it's a bit underwhelming to get only a 1.25x per-clock increase from an architecture change, when even Nvidia advertised 50% (1.5x) going from Pascal to Turing. That said, Lisa compared what they did with Navi to what they did with Zen. I'm not sure I want to take RDNA as an entirely new architecture yet; it could just be marketing fluff for the fact that they revamped GCN to fix its most pressing issues, and a way to downplay the negativity GCN gets for gaming. They did show the unreleased 3DMark PCIe test, where the bandwidth beats the 2080 Ti, but bandwidth isn't everything. So really, gotta wait for more information to get a rough idea, and even then, third-party testing will be the decider on whether it's a flop or not.
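A rough way to compare those advertised uplifts, since performance scales roughly as per-clock throughput times clock speed (the clock figures below are placeholders, not announced numbers):

```python
# perf is roughly proportional to per-clock uplift x clock speed.
# All inputs here are illustrative, not official specs.
def rel_perf(ipc_mult, new_clock_ghz, old_clock_ghz):
    """Relative performance of the new part vs the old one."""
    return ipc_mult * (new_clock_ghz / old_clock_ghz)

# AMD's advertised Navi uplift: 1.25x per clock over GCN.
navi = rel_perf(1.25, 1.8, 1.6)   # hypothetical clocks
print(f"Navi vs Vega at those clocks: {navi:.2f}x")
```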

>when even Nvidia advertised 50% or 1.5x with the change from Pascal to Turing
we know they lied though

fuck gaymen

I hope intel slashes i9 & i7 prices by 75%

So is my i5 8400 shit now? I just wanna play my casual video games on my sff pc.

They didn't. Navi will be slower than Turing-based RTX 2070.

they did. clock for clock, turing is nowhere near 1.5x pascal

intel has never cut prices and they never will.

How far behind Pascal is Radeon on IPC? I know GCN has a retarded bottleneck, so maybe the difference isn't that high once that bottleneck is gone.

it's actually a little ahead in compute. GCN has a bottleneck that doesn't let it draw more polygons.
IPC isn't the problem for AMD, the front end is. AMD computes faster than it can output.

>9900k owners
Didn't you mean 9700k owners?

think of 7700K owners.

Attached: happydoctor.jpg (200x200, 12K)

Le disable hyperthreading in my gaming PC x

>7700K owners
Didn't you mean 7600K owners?

The i5-8400 was always shit. You either go full retard and buy a 9900k for MAX FPS, or buy Ryzen for the best perf/dollar and still good enough FPS.

They didn't really lie, but it was a limited best-case scenario as usual, which is what I think this is too. I finally found the power numbers: 1.5x the performance and 1.4x the power efficiency from Pascal to Turing. If AMD can't advertise numbers close to that even in best-case scenarios, I find it hard to believe Navi will compete well, especially once Nvidia decides to make their 7nm GPUs.
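For what it's worth, those two multipliers pin down the implied power draw, since perf/watt = perf divided by power:

```python
# If the new part has 1.5x the performance at 1.4x the perf/watt,
# its power relative to the old part is perf_ratio / efficiency_ratio.
perf_ratio = 1.5
efficiency_ratio = 1.4
power_ratio = perf_ratio / efficiency_ratio
print(f"implied power draw: {power_ratio:.2f}x the old part")
```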

wasn't it called best gaming CPU just a little while back?

Forgot link, from Nvidia's press room for 1660Ti for the numbers I pulled.

nvidianews.nvidia.com/news/nvidia-supercharges-record-80-gaming-laptop-models-with-turing-powered-gtx-16-series-gpus

btarunr knows what's up

Attached: .png (913x348, 120K)

>1.5x the performance
they didn't say per clock. AMD clearly says per clock. also nvidia disguised the RTX meme in that metric. did you miss the whole reveal? it was such a clusterfuck of numbers and deceptions.
turing is overclocked 12nm pascal without the RTX bullshit.

The main problem is that they say per clock but then didn't advertise any clock improvements, which I think is weird. Not that this is any more valid than other performance estimates for Navi, but with the information we have, I think it's not too bad a comparison to make.

K stands for Kelvin.

> 9900KillSelf

>but then didn't advertise any clock improvements which I think is weird.
Lisa actually said the clock is improved. Rewatch that section.
I agree it's too early, but I'm also a little more optimistic after they finally admitted the front-end problems directly. It's the first time they've split compute and gaming.

The AMD droning is beyond sad. Intel still leads in IPC, all while being on an older node. This one benchmark from a single game proves nothing. The absolute state of nu/g/ is baffling. 5 years ago you would have been laughed at for this blatant stupidity.

>Intel still leads in IPC
I highly doubt that.

Attached: 1558929062636.jpg (2053x1025, 137K)

Most important question: can it overclock 50% on a cheap cooler, like Sandy Bridge could?

Weird game to choose to show off performance; it's dependent on internet speed too, so they could have a super fibre connection and get a frame boost.

they're both on the same screen retard

And Kilowatts

Nodes don't do anything for IPC, and Ryzen has been winning on IPC since the first iteration. Imagine having a clock speed that's 30% lower and single-core performance that's only 15% lower.
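IPC is just performance per clock, so those two deficits pin down the IPC ratio (the 30%/15% figures are the ones claimed here, not measurements):

```python
# Back out relative IPC from relative single-core perf and clock speed.
# The input ratios are the post's hypothetical, not benchmark data.
clock_ratio = 0.70   # clock speed 30% lower
perf_ratio = 0.85    # single-core performance 15% lower
ipc_ratio = perf_ratio / clock_ratio
print(f"relative IPC: {ipc_ratio:.2f}x")   # IPC lead despite the lower clock
```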

fug :DDD

Attached: 1494660694470.png (801x1500, 1.05M)

what do you need an i3 for?
youll be hearing from my lawyer about that pentium you owe me

Hahahahahahahahah. Intel has a great pr/marketing/bribing department, i'll give them that

They are literally shills. It has infected the board. Do you think anyone with no financial interest in either company would give enough of a shit to spam these threads every day?

Someone posted a screenshot of the official Intel shill web page earlier. You can sign up, get marketing training, and go be a shill on social media for rewards and kickbacks. It's sad.

>The company didnt reveal TDP, pricing or availability
Typical merchants

>People enthusiastic about technological improvement don't exist
I for one love what AMD is doing and write positively about them. You are siding with a guy who claims Intel still has the IPC lead, despite the fact that Ryzen 3000 has a better single-thread score in Cinebench at lower clocks.

Do you not know how any of this works? Don't force cognitive dissonance, it leads to schizophrenia

JEW STATUS: GASSED
ISRAEL STATUS: DESTROYED

Attached: file.png (1920x1080, 2.53M)

hi

Without independent testing you are just parroting marketing promises.

Also the price/performance comparison doesn't look as good when you factor in the extremely expensive 3200MHz memory, which you don't need on Intel.

Progress is great and competition is necessary, but independent testing is crucial and we haven't had any of that yet. Plus announced prices won't match retail prices. Hype-driven markups and fleecing early adopters are normal.
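Folding the memory premium into the price/performance comparison is simple division; every number below is a placeholder, not a real price or benchmark score:

```python
# Points-per-dollar with a platform memory premium folded in.
# All prices and scores are hypothetical, for illustration only.
def perf_per_dollar(score, cpu_price, mem_premium=0.0):
    return score / (cpu_price + mem_premium)

# Assumed figures: one platform needs a pricier 3200MHz kit, the other doesn't.
amd = perf_per_dollar(score=500, cpu_price=329, mem_premium=40)
intel = perf_per_dollar(score=510, cpu_price=488)
print(f"AMD: {amd:.2f} pts/$, Intel: {intel:.2f} pts/$")
```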

noooooooooo this can't be happening

Attached: 419wgi3boo031.png (2812x758, 1.6M)

guru3d.com/news-story/amd-releases-ryzen-5-3600-and-3600x-as-well.html

oh fuck you AMD for not mentioning this in the stream, and here I was thinking I'd need to spend a damn $400
the real gaming king is still the 3600X, 3.8 base 4.4 boost is perfect.

By whom? Purch media?

The Counter-Strike boost alone, if true, really makes it seem like they've somewhat solved their gaming problems. Should be a very compelling chip all around.

what the fuck does that picture even show?
if their increased performance comes from increased clocks alone its kind of garbage

i don't even understand why companies use CS:GO, an old game running on the ancient Source engine, to show off performance boosts.

clocks increased by less than 10% compared to zen+, though