LAWL DEAD FOR GAMING

DEAD DEAD DEAD DEAD

only 100-115fps at 1080p in ForzaH4


LAWL DEAD FOR GAMING

Attached: dead.png (157x154, 44K)


I only play Cinebench so this CPU is perfect for me.

>he isn't a content creator

>8 cores
pfffffffffff
hhahahahahahahahhaha

>r5 3600 destroyed i9 9900k

The second chip is a Chinese backdoor

"ALTHOUGH WE HAVE SOME CHALLENGES, I SEE THEM AS OPPORTUNITIES"

AM_scuffeD

I will get a Radeon VII though, going to OC that shit to Titan speeds.

Sandbagged R5, almost low-end, bottom-of-the-barrel 8c/16t. Can't you fucking see the 2nd chiplet position right below the top one? Seriously.

>only 100-115fps at 1080p in ForzaH4
I don't think you realize how shit console games run.

AMD cucked us all. They didn't even tell us which chip they were using to make that Cinebench score.

Intel FUD thread
SELL SELL SELL INTEL STOCK SELL NOW!

ryzen 5 confirmed then

It's the 8-core/16-thread Ryzen at non-final frequency vs a stock-frequency Intel 9900K.

The 8 core 16 thread is probably the 3600X.

Imo Ryzen 5

>ES sample
>show off a package where you can CLEARLY spot an area for the 2nd chiplet

>8c/16T chiplet
>room for 2 chiplets
Likely a 3600x, or maybe 3700x, but btfoing a 9900k with room for one more chiplet is breddy gud

>AMD shills still in denial that Ryzen 3000 is 8 cores

KEK

I'm more surprised they didn't talk about any of it, they just said it's happening.

Shame, I think that means the leaks we got were fucked then.

They were probably direct-die cooling that CPU and OC'ing it to the max in the demo.

Ryzen is dead. You can tell by how they showed it and how her voice wavered while talking about it.

Sorta feels like Ryzen/Navi might be in the PS5/Xbox now. I'm surprised, I thought they would go with cheaper shit like some scuffed ARM Amazon CPU in consoles. Putting a Ryzen in consoles will actually help gaming, so I'm not sad Ryzen is going to be shit tier for PC.

they didn't reveal what chip was competing with the 9900K.

They purposely did this to make intel sweat more.

The designs are the same as what Adored speculated, so there's a possibility that the leaks are still in play.

They are not releasing a 16-core consumer CPU. It would cost $1500.

suure.

Design-wise we now know it is possible. So keep losing sleep at night.

>J...JUST WAIT!!!!!!!!!!

It's not about design, it's about ruining their enterprise market.

Also thought it was cool they released a competitor to the RTX 2080. I think Nvidia finally has to compete again, so they won't release dogshit products.

However, I didn't know the RTX 2080 was only 700 bucks on Newegg too, so what's the reason to grab this over the 2080 there? I guarantee I can find a used 2080 for much cheaper too.

we were waiting for your 10nm, before we could btfo you harder.

You would've said the same thing in 2017 @ 8C/16T processors, I'm sure.

Faggot.

Oh the enterprise market that's getting the brand new Epyc parts with even more cores and better binning?

Retard.

this, and the 2080 will come with both ray tracing and DLSS

Well, to be fair, the Vega 64 was just as close to the 1080 as the R7 is to the 2080.

And the Vega 64 could OC way more, to almost 1080 Ti speeds, while the R7 is probably already at its limit by the looks of its cooler.

Not trying to hate though. The R7 is probably going to be a sick card, I want one, but they haven't really caught up if the R7 is already running at its wall, which it probably is. Though stick a Delta fan and a CPU cooler on the R7, grab some earplugs, and it's probably going to hit 2080 Ti/Titan speeds of awesome without having to solder shit onto it like you have to on Nvidia, because AMD is open and not encrypted Nvidia shit.

Cores increased by 25% in 2017, not a 100% core increase.

The R7 doesn't require you to solder shit onto the card to increase the power limit for extreme overclocking, because Nvidia encrypts its firmware while AMD lets you flash whatever you want.

If you just want a silent computer get Nvidia, but if you want to push shit get the R7.

R7 + 9th-gen Intel OC'd to the max will be the optimum setup for a few years now. I think Ice Lake will fuck something up or encrypt the BIOS to stop extreme shit.

I'm not an Nvidia shill; I just want a more competitive market for computer parts so we don't pay out the ass for shit we don't have to.

But I can grab a 2080 cheaper than this card right now if I look hard enough. Don't know how useful an AMD card will be in this price bracket; glad they caught up though.

You guys realize this wasn't even the proper Ryzen 3000 pre-release show? Just some info about Zen 2.

>9900k at stock frequency
>btfo by an unnamed chip at an unknown frequency

Please. I have a Ryzen 7 chip myself, but this test proved nothing. For all we know it was AMD's best-binned Ryzen 7 3700X at stock frequency, and everyone is all excited that it delivered about the same performance as the 9900K with a lower TDP.

I like AMD but this was an underwhelming event, especially considering the GPU price of $700

$699 MSRP, god fucking dammit AMD, you morons. It's going to be the same fucking price as the 2080, on a superior node, with the performance of a fucking 3-year-old 1080 Ti. This is a flop. Why not buy an RTX 2080 or a used 1080 Ti?

>he saw the fps when nobody else did

>and everyone is all excited that it did about the same performance as the 9900k with less tdp
That's a big fucking deal. The 9900K is expensive and even an NH-D15 isn't capable of cooling it.

Didn't they say that this GPU was more for content creators, with its 16 GB of VRAM?

>The 8 core 16 thread is probably the 3600X.
Based on the wattage they showed, it was the 3600 (65 W).

What the fuck are you on about? AMD was a disappointment but somehow still triggered some Intel shills

>preview
>ES CHIP
still beats Intel at what are probably low clocks

yeah

So AMD didn't release the gaming version or whatever?

Are they stupid?

WHERE IS THE VALUE YOU FAGGOTS KEPT PROMISING

AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAHHHHHHHHHHHHHHHHHHHH
>AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAHHHHHHHHHHHHHHHHHHHH
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAHHHHHHHHHHHHHHHHHHHH
>AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAHHHHHHHHHHHHHHHHHHHH
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAHHHHHHHHHHHHHHHHHHHH

Attached: 1543310418844.jpg (233x249, 8K)

>announce 64-core Epyc server
>announce Radeon to compete with 2080
>suddenly announce a """midrange""" product

AMDrones are delusional

>7700k to 1800X was 25% increase
lol..?
Or do you mean the 8350? That was just a 4 core CPU in disguise.

>clearly showing an interposer with wiring and a missing chiplet on the AM4 socket
>thinks that amd wont fill it up

"Hey, this empty space kind of looks weird. Do we still have those "silicon spacers" that we used for Threadripper left, just to fill the space? I'm showing this off to the world at CES, you know."

"Sorry Lisa, we don't have enough defective chips to use as spacers, our process is too good."

"My disappointment is immeasurable and my entire CES appearance is ruined."

Holy shit, Nvidia is like light years ahead. Nvidia, still on 12nm, BTFOs everything AMD has, and WHAT DOES AMD DO?? COMES OUT WITH A 700 DOLLAR GPU!! You can't be more retarded than this. When Nvidia switches to 7nm, Radeon will be sold to Intel.

>still better than 9900K
coping this hard

Attached: 5435534543225343543.jpg (852x480, 106K)

>Mfw I've wanted a PC for a long time now and could get a 9900K for $450
>But also don't want to miss Zen 2
Ahh fuck, what now?

Attached: 1546871731120.jpg (480x480, 25K)

You don't even know its performance, idiot.

AMD DEAD
INTEL DEAD
NVIDIA DEAD

Yeah, ARM is going to kill them all.

I'm in the same boat. I really don't want a high end Intel chip because of their shitty TDP and thermals though. I'm gonna JUST WAIT^TM until we hear more about Zen 2. That 16c/32t version would be so fucking sweet if real

Buy an AM4 board with a 2600 for like $200 total and then drop in a 3rd gen when they release.

They just showed it's close to a 2080, you dumbass.

If you looked closer, you could see that in the upper right GPU usage was hovering between 95-99%, which means the Ryzen was bottlenecked by the Radeon VII. If it had been paired with a 2080 Ti, the FPS could be even higher.

Forza is just well optimized; it shows 100% GPU use on a Titan V and RTX cards too.

DELID DIS

Attached: 1527629778452.jpg (679x758, 54K)

Going to be so funny in a few months when we see the Ryzen 3900 series CPUs with 16 cores/32 threads @ 5 GHz. All the Intel trolls are going to suicide.

Attached: Trump_smug.jpg (600x610, 67K)

the value is in sticking it to the nvidia jews :)

IBM future?

>a few months
based non keynote watcher

>youtube.com/watch?v=CDoepsPwFlA
>6700K + RTX 2080 runs almost identical FPS to the AMD demo
Help me understand Jow Forums

>max OC chip
>33% less power used than 9900k
ok lol
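For scale, here's a napkin-math sketch of what a 33% power gap means for efficiency at roughly equal performance. All the numbers below are illustrative assumptions, not measurements from the keynote:

```python
# Hypothetical perf-per-watt comparison; score and wattage are placeholders.
def perf_per_watt(cb15_score, package_watts):
    """Cinebench R15 points per watt of package power."""
    return cb15_score / package_watts

intel_eff = perf_per_watt(2040, 180)         # assumed 9900K score and power
amd_eff   = perf_per_watt(2040, 180 * 0.67)  # same score at 33% less power

# Equal performance at 33% less power works out to ~1.5x the efficiency.
print(round(amd_eff / intel_eff, 2))  # → 1.49
```

The point of the sketch: even if the absolute scores tie, cutting power by a third is a ~50% perf-per-watt lead.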

youtube.com/watch?v=vubnKsKIvsc

Obliterated by an overclocked 8700K that sometimes hits 150+ FPS, plus the AMD demo had no other cars in the race.

Explain this amjeets

I don't understand. Was this on the iGPU or something? Why was the framerate so low?

Game runs awesome on my Intel/Nvidia laptop. 140 fps+

Cores don't matter for games, just single-threaded performance, which AMD sucks at.

>Lisa: "we like this game because it really stresses the computing power"
So you're saying they're retarded?

It's completely GPU bound even at 1080p. And they showed nothing to compare it to. It was literally just them playing Horizon for a minute. I guess it showed the CPU actually works but that's about it.
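The bottleneck argument being made here (95-99% GPU usage means the CPU wasn't the limit) can be sketched as a trivial heuristic. The 95% threshold is a common rule of thumb, not anything AMD stated:

```python
# Rule-of-thumb bottleneck classifier from monitoring-overlay readouts.
# Thresholds are heuristics, not hard rules.
def likely_bottleneck(gpu_util_pct, cpu_util_pct):
    """Guess the limiting component from average utilization percentages."""
    if gpu_util_pct >= 95:
        return "gpu"   # GPU pegged: a faster CPU won't raise FPS much
    if cpu_util_pct >= 95:
        return "cpu"   # CPU pegged: a faster GPU won't raise FPS much
    return "other"     # frame cap, vsync, engine limit, I/O, etc.

print(likely_bottleneck(97, 60))  # → gpu (matches the demo's 95-99% GPU usage)
```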

Can AMD get their shit together already? I'm tired of endlessly waiting for a Zen 2 16-core chip. Threadripper seems promising, but then they pulled this...

on specific games

That's literally nothing.

Why didn't they show the entire Zen 2 lineup?
Do they want to keep selling Zen+ CPUs until Zen 2 launches?

Attached: 1528909028680.jpg (451x432, 24K)

No one will buy an 8 core if they know 12 core is around the corner.

the fuck do you think

Their CPU division definitely can. But their GPU division won't unless they prove Navi is worth getting over Nvidia's 7nm GPUs which they will surely announce next year.

WTF is the recommended way to apply thermal grease? I hope the lids have outlines of where the dies are.

>at 1080p
Why the fuck do you keep parroting this retarded bullshit when it was clearly stated, BOTH times DMC 5 and Project Cars 2 were running, that they were running at 4K maxed? Yes, that 100~115 FPS was AT 4K ON MAXIMUM GRAPHICAL SETTINGS.

literally doesn't matter. That's what IHS is for.

>BOTH times the DMC 5 and Project Cars 2
Except Forza, because they wanted to show off the CPU and not the GPU. Watch the damned keynote.

...they never showed Forza. Unless you're talking about Xbox? I never listened to that part, I was in the toilet. Who cares? It's fucking consoleshit. I'm talking about PC tests here.

youtu.be/bibZyMjY2K4?t=5182
Take a good listen you deaf braindead faggot

It's to milk the whales and put out a stopgap product until Navi drops. The Radeon VII is basically a gimped version of the enterprise 7nm Vega II GPU. She kept stressing content creators/gamers. It's not really meant for ultra-high-end gaming like the RTX 2080 Ti is. It's basically an HEDT card, where the 16GB of HBM2 benefits.

In gaming, 8GB of HBM2 clocked higher will still be more than adequate today. The pricing sucks, but it's not a gamer-focused GPU, as much as you CAN game on it if you wish. The R7 has all the AI intrinsics and other GPGPU features enabled. It's an enterprise accelerator for cheap.

But for investors, they can say "we put out a 2080-competing product and it's doing great" to maintain confidence, so the market doesn't keep shorting their stock, and they can use the money from those sales, collect interest, and put it towards Navi chiplets.

It is CLEARLY a mounting spot for RGBs.

Who cares, gaming is for retards (like you).
Gotta say I'm glad the monolithic 12C version was proven wrong in the end.
This will be worse for games, but a 16C will really show Intel its place in actual adult usage, haha.

Attached: 1479029273667.jpg (500x550, 59K)

>muh FPS

see

Attached: 1546883587304.jpg (369x496, 27K)

Ok, Jow Forums
what clock do you think that CPU was running at to beat a 9900K at 4.8-5 GHz?

Attached: gendo.jpg (800x570, 77K)

I'd say Intel was at 4.6 (all-core load, while power consumption was "just" 180W at the wall) and AMD at 4.25 GHz.

Less than 4.4; a 2700X running at 4.4 will score around 2050, and it makes no sense for Zen 2 to perform worse than Zen+.
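That estimate can be sanity-checked with napkin math, using the ~2050 points at 4.4 GHz Zen+ figure above as a baseline. The ~10% IPC uplift and the demo score below are guesses, not disclosed numbers:

```python
# Implied all-core clock of the Zen 2 demo chip from a CB15 multi-core score,
# assuming linear frequency scaling. Baseline: 2700X ~2050 pts at 4.4 GHz
# (from the post above); the IPC uplift and demo score are assumptions.
ZEN_PLUS_SCORE = 2050
ZEN_PLUS_CLOCK_GHZ = 4.4
points_per_ghz = ZEN_PLUS_SCORE / ZEN_PLUS_CLOCK_GHZ  # ~466 pts/GHz

def implied_clock_ghz(demo_score, ipc_uplift=1.10):
    """Clock needed to hit demo_score, given Zen+ per-GHz scoring plus an IPC uplift."""
    return demo_score / (points_per_ghz * ipc_uplift)

print(round(implied_clock_ghz(2057), 2))  # → 4.01, i.e. ~4.0 GHz all-core
```

On those assumptions the demo chip lands around 4.0 GHz all-core, comfortably below the 2700X's 4.4, which is the argument being made.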

That much? Then Zen2 could run 4.0-4.1 GHz all core in that comparison.

>That much?
hwbot.org/benchmark/cinebench_-_r15/rankings?hardwareTypeId=processor_5695&cores=8#start=0#interval=50
Seems so.

That could be specially tuned for CB performance, while that stage test was not.

>That could be specially tuned for CB performance
Going by my own personal experience with OC'd benchmarking, CB15 has no real shortcuts to getting better scores outside of fudging the workload, which you can verify, and redoing the bench over and over, which at best yields a couple dozen points.
I mean, it's not fucking Ice Storm, where you can multiply your score by running a desktop resolution of 320x240.

I get 120 fps max on my 1080 Ti at 1080p with a stock 8700K, tested today. But at 5 GHz it's 150.

Yeah, it's a nice pure-CPU bench.
But wasn't there a Cinebench "bias" option in some BIOSes?

>But wasn't there a cinebench "bias" in some bioses?
The only two sources I can find with any numbers or info are a reddit post which claims 30 points more on his 1700X and this set of tests on a 1500X.
So yes, it's possible, but the impact is still fairly limited (5% at the very most) and there's no way to know if it was activated in any of those benches.

Attached: 34240210283_e45ac614d3_o.png (1309x267, 40K)

My point was mainly that hwbot has people trying to break records, so extra tuning is likely.
I know an AMD enthusiast who passes time like this, and IIRC there are some things you can tune.
Matisse is going to be great anyway.