INTEL ON SUICIDE WATCH

WTF IS THAT

Attached: 1546422963983.png (894x250, 52K)

Other urls found in this thread:

asrock.com/MB/AMD/B450 Pro4/index.asp#Specification
twitter.com/AnonBabble

fake news

>low end part is already 6c 12t
true big if

>cores and clock speed equal performance!

is that why Coffee Lake was trashing Ryzen 1 in every real world benchmark?

pay debt

Attached: 1520971059899.jpg (775x759, 150K)

Fake and gay.
>and no Mommy Su closeup to make it all better

Attached: rage pepe.jpg (789x800, 123K)

I'm excited for CES desu
Either AMD delivers and BTFOs Intel, or AMD doesn't deliver and we'll have a very entertaining shitstorm

How long until new series?

Attached: 1534893703765.png (400x400, 91K)

The next holocaust

>geekbench
>real world benchmark

that and every game released in the past decade. I hope you have a funny meme picture to post in response, that'll show me.

Time to prepare "RIP intel" pics or "Wait for AMD/Navi" pics.

Or both.

here it is

Attached: goybench.png (1183x754, 172K)

Are the 12c and up all multiple dies? What will most likely be the largest core-count single die?

6 CORE 12 THREAD FOR THE ENTRY LEVEL HOLY SHIT ADOREDTV WAS RIGHT

Attached: 1545672612521.jpg (400x600, 188K)

So basically there's zero reason to buy Intel now unless you have a cuckolding fetish

Now let's see those game benchmarks. heheh

only the node size and the amount of cores matter!
Real world performance doesn't!

Attached: 1501441065814 (1).jpg (552x661, 71K)

now i need to see the R3 3300X BTFO Intel's 8700K on its stock cooler. just try to imagine the shit flinging after that

Attached: 1529567124804.jpg (757x627, 87K)

Attached: InKEKNPC.gif (750x1250, 96K)

Yeah that will happen.

I HATE AMD AND I HATE ANTI-SEMITIC GOYIM

16 cores is too many cores.

reach for the stars my dude. can't be a pessimistic asshole your whole life long

Attached: lisa_su_pepe.png (1000x1000, 386K)

these rumors are by russian trolls.

report and move on, nothing to see.

>3600X
>8c/16t
>4.8GHz and above
IT'S OVER INTLEL IS FINISHED

Attached: 1536694146733.png (1200x800, 164K)

TICK

STOP MAKING THESE AMD POSTS!! DELETE THIS NOW CUNT!

BUY INTEL BUY INTEL BUY INTEL BUY INTEL BUY INTEL BUY INTEL BUY INTEL BUY INTEL BUY INTEL BUY INTEL BUY INTEL BUY INTEL

DELID THIS!

wtf would I need 12 cores for?
What a huge letdown, that's completely useless

OYYYYYY VEEEEEEEEEEEEEYYYYYYYYYYYYYYYYYYYY

Well memed.

>not buying the 3850X when it drops for best dies on desktop

Attached: 1541246183696.jpg (480x247, 12K)

>8c/16t
>65w TDP
my body is ready

How will Intel use their (((jew))) powers to make this fail????

More gay comedy actors on planes.

"brand loyalty" goyim

Mine too, user, mine too. 2019 is going to be a great year (if it's not bullshit, that is).

AMD isn't fucking around, they really want to bury Intel alive. Interesting to see what Jim Keller does to make Intel relevant again.

Attached: 1501764121976.jpg (640x640, 29K)

>tfw cpu industry is now just jim keller trying to one up himself

AMD is the new Cyrix

It's over guys, Intel is finished.

As much as I want to see AMD catch up or even overtake Intel this gen, some things still don't add up. Can AM4 support 16 cores with only two memory channels? Are mainstream desktop users going to buy a 12-core CPU, let alone 16? Is everyone suddenly going to render 3D scenes, encode video on the CPU and set up VMs?

No, it's not about EVERYONE. It's simply making high core count chips the standard. Intel has kept people at 4 cores or less for years, so that's what devs designed for. HEDT systems had the "high core count" setups (6c/12t) but even then, programs didn't take great advantage of everything.

But now imagine 6c/12t being the bare-minimum standard. Multi-threaded programming would start to become the norm as well; games already did it (DOOM, GTA5, COD). Notice how quickly Intel released their 6-core i5s and 12-thread i7s in the wake of Ryzen? Even Intel sees the end of the road for their "just keep bumping the core clock" method. While not everyone will make the absolute most of the higher core counts, it's only a benefit to the rest of us.
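To make the multi-threading point concrete, a minimal sketch (Python, toy workload standing in for real game/render work; `busy_work` and `run_parallel` are made-up names, and a process pool is used because CPython's GIL keeps plain threads from running CPU-bound code in parallel):

```python
from concurrent.futures import ProcessPoolExecutor

def busy_work(n):
    """Toy CPU-bound task standing in for e.g. a physics tick or encode job."""
    return sum(i * i for i in range(n))

def run_parallel(chunks, workers=6):
    # A 6c/12t chip gives the scheduler 12 hardware threads to place these on;
    # the worker count here is illustrative, not a recommendation.
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(busy_work, chunks))

if __name__ == "__main__":
    print(run_parallel([100_000] * 12))
```

Nothing fancy, but it's exactly the kind of embarrassingly-parallel structure that scales with more cores and that single-threaded code leaves on the table.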

It will definitely be bottlenecked by 2 channels, but not as much as the 32c Threadripper on the TR platform. It could be mitigated greatly if 4GHz RAM were supported without OC'ing on B450's successor.
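Quick back-of-envelope on the channel meme (Python sketch; assumes DDR4's 64-bit channels and theoretical peak rates, and real sustained bandwidth is lower):

```python
def peak_bandwidth_gbs(channels, mt_per_s):
    """Theoretical peak: channels x mega-transfers/s x 8 bytes per 64-bit channel."""
    return channels * mt_per_s * 8 / 1000  # MT/s -> GB/s

dual_3200 = peak_bandwidth_gbs(2, 3200)   # dual-channel DDR4-3200: 51.2 GB/s
quad_3200 = peak_bandwidth_gbs(4, 3200)   # quad-channel (TR platform): 102.4 GB/s

# GB/s per core: rumored 16c on AM4 vs the 32c 2990WX
print(dual_3200 / 16, quad_3200 / 32)  # 3.2 vs 3.2
```

Funny enough the per-core peak comes out the same, which suggests the 2990WX's real pain is its two dies with no local memory access, not raw bandwidth.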

Given that AM4 can use ECC RAM, most of the people getting the 12-16c will probably be people interested in a cheaper HEDT platform, which will strike another blow to Intel's i9 meme platform.

>AM4 can use ECC RAM
Have a source on that? Been wanting to build an AMD system for my server but ECC is something I want. I have a Haswell Pentium, Supermicro board, and ECC ram as is.

ECC ram booting up isn't the same as actually having full ECC support.

Can you guys stop this memory-channels-on-16-cores meme?

It all depends on how the CPU is made. If it's a 2990WX with NUMA, then no.

But it isn't. It's a 16-core with a shared memory controller, meaning it doesn't fucking matter.

depends on whether the mobo supports it, but Ryzen itself supports ECC

It's mainly motherboard dependent, but yeah, even on Zen+ mainstream B450s ECC RAM support exists.

asrock.com/MB/AMD/B450 Pro4/index.asp#Specification

>105W TDP
HOUSEFIRE!

At least AMD sticks to their TDPs. Intel clearly doesn't know the meaning of a 95W TDP on their 9900K.

Attached: aHR0cDovL21lZGlhLmJlc3RvZm1pY3JvLmNvbS9DLzkvODA1MjU3L29yaWdpbmFsL2ltYWdlMDA3LnBuZw==(1).jpg (755x561, 109K)

Look at it this way - AMD needs high core count chips to complete against Intel in the lucrative server market. So if they are producing tons of those little 8-core chiplets anyways they have no reason to not to offer high core count chips to consumers; many of them will be made from dies that didn't make the server cut.

Attached: more is better.jpg (755x561, 58K)

>heh, not bad, Jim, you made me use 14+++++% of my power

>Watts (more is better)
Is it?

Yes, of course.

Attached: 1542220191236.png (631x310, 14K)

obviously you don't know what TDP means. TDP is the amount of heat that needs to be dissipated when the base clock is applied. The idea is that you need to be able to cool 95W in order to get the guaranteed all-core frequency of 3.6GHz, so OEMs know what cooler to use to guarantee the base clock.

The power consumption when applying turbo clocks has nothing to do with the TDP.
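For the anons confused by the 9900K numbers, a rough sketch of why turbo power blows past TDP: dynamic CMOS power scales roughly with frequency times voltage squared. The clocks and voltages below are illustrative guesses, not measured values:

```python
def scaled_power(p_base, f_base, f_turbo, v_base, v_turbo):
    """Rule-of-thumb dynamic power scaling: P ~ f * V^2."""
    return p_base * (f_turbo / f_base) * (v_turbo / v_base) ** 2

# Illustrative numbers only: 95W at a 3.6GHz/1.00V base,
# all-core turbo at 4.7GHz needing ~1.25V.
print(scaled_power(95, 3.6, 4.7, 1.00, 1.25))  # ~194W, way past the 95W TDP
```

The cooler only has to be rated for the 95W base-clock case; sustaining turbo is the motherboard and cooler going beyond what the TDP promises.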

>TDP doesn't matter

please tell me there is gonna be a 3600G

now watch intel still beat them at single core with 14nm++++++

Probably end of the year, APUs always come last.

single core is getting less and less relevant by the day, even for games. the only games that run better on Intel are old single-threaded simulators

yeah, that's what he said. do some research before posting again, retard.

t. coping Incel customer

I believe they have basically the same IPC. The only difference was in frequency.

>higher is better

>Water Cooled Witch Industrial Chiller
N-nani?!

NOOOOOOOOOOOOOO

The R7 1700 already had a 65W TDP, this one's gonna be even better.

Then why does AMD consume less than its TDP at base frequencies? The 2700X will only consume about 65W at 3.7GHz.

The STATE of cpu engineers

Attached: 1465101708732.jpg (800x800, 319K)

>TDP is 95W
>CPU uses 200W at base clocks at high load
You don't think this is a little off spec?

THE MORE WATTS THE BETTER, GOYIM
BUY BUY BUY

AMD also calculates their TDP differently, supposedly.
Whose is more accurate I'm not sure.

TDP doesn't matter

SOMEONE POST IT

AMD has 3 years to do the greatest possible damage to Intel before Intel rolls out Keller's new arch.

According to Tech Jesus it's actually the mobo manufacturers doing whatever the fuck they want with turbo boost and base clock frequency.

Almost every Intel motherboard ignores Intel's specs and does its own thing to turbo more, turbo longer, run its base clock faster than it's supposed to, and so on.

kek'd heartily

Because AMD isn't at the limit of their architecture and manufacturing process.
Personally I assume that they decided to go with the 95W TDP to further their claim that the Ryzen 7 is up to par with Core 7 (since both have the same TDP). A stupid person might assume that 65W TDP might mean a slower product. Maybe they went with it to not get any negative backlash like intel is getting now, since people don't understand that TDP isn't maximum powerdraw in any situation.
Using Prime you can get a Ryzen 7 to draw 150W as well. Is the TDP wrong because of that? Personally I wouldn't say so.

it isn't base clocks, it's the turbo clock; essentially the CPU overclocks itself. The TDP only covers the 3.6GHz base clock, and that's achievable.

Attached: 1535623854013.jpg (480x289, 16K)

>Because AMD isn't at the limit of their architecture and manufacturing process.

This is all I needed to read right here, thanks.

Attached: 1535623727413.jpg (640x360, 37K)

The Ryzen 5 3600X, with 10-15% more IPC than the 2000 series and a boost clock up to 4.8GHz, is all but tied with the i7-9900K.

The Ryzen 3300X with the same IPC gains, a 4.3GHz boost and 6 cores/12 threads will curbstomp every i5 and run close to a stock i7.

The 12-16 core chips are the real prize though, because they offer HEDT power in a non-HEDT segment. This is a big deal for content creators and streamers. You need those extra 6-8 cores for rendering and for running background programs, including a video stream, while you are gaming. Most streamers actually run two separate computers, with one doing video capture for the stream. AMD is letting you do both on the same platform.
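The "all but tied" claim is just perf ~ IPC x clock napkin math. A sketch using the rumored numbers (12.5% picked from the 10-15% IPC range, Zen+ assumed at IPC parity with Coffee Lake as another anon claims, 5.0GHz for the 9900K's single-core boost; none of this is benchmarked):

```python
def relative_perf(ipc_gain, clock_ghz, rival_clock_ghz):
    """Single-thread estimate: perf ~ IPC x clock. Rumored numbers, not benchmarks."""
    return (1 + ipc_gain) * clock_ghz / rival_clock_ghz

# Rumored 3600X boost vs 9900K single-core boost
print(relative_perf(0.125, 4.8, 5.0))  # ~1.08, i.e. roughly tied
```

Obviously real performance also depends on memory latency, caches and boost behavior, so treat the ratio as a sanity check, not a prediction.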

delusion

autistic screeching is not an argument

>Multi-core doesn't matter!
>Productivity doesn't matter!
>Price/performance doesn't matter!
>Performance per watt doesn't matter!
>Power usage doesn't matter!
>Temperatures don't matter!
>Soldered dies don't matter!
>Stutters don't matter!
>Streaming doesn't matter!
>Data centers don't matter!
>Locked CPUs don't matter!
>OEMs don't matter!
>Hyperscalers don't matter!
>Upgradeability doesn't matter!
>Anti-competitive business practices don't matter!
>Locked platform features don't matter!
>Synthetic loads don't matter!
>PCI-e lanes don't matter!
>Burnt pins don't matter!
>Heat doesn't matter!
>1771w cooler doesn't matter!
>Server space doesn't matter!
>ECC support doesn't matter!
>Free RAID doesn't matter!
>NVMe RAID doesn't matter!
>StoreMI doesn't matter!
>IPC doesn't matter!
>7nm doesn't matter!
>HEDT doesn't matter!
>Stock coolers don't matter!
>Security doesn't matter!
>Games don't always matter!
>Enterprise doesn't matter!
>Hyperthreading doesn't matter!
>VMware doesn't matter!
>MySQL doesn't matter!
>Unix doesn't matter!
>Linux doesn't matter!
>Wafer yields don't matter!
>Benchmarks after full patches don't matter!
>Asian markets don't matter!
>Own fabs don't matter!
>Chipset lithography doesn't matter!
>Cray doesn't matter!
>Cisco doesn't matter!
>HPE doesn't matter!
>AZURE doesn't matter!
>NEW 5nm doesn't matter!
>NEW TDP doesn't matter!
>NEW 10nm doesn't always matter!
>NEW Cache doesn't matter!
>NEW Integrated graphics doesn't matter!
>NEW PCI-Express 4.0 doesn't matter!

Attached: 1545697157024.png (807x745, 205K)

based

How does AMD get twice as many threads as cores?

>pwned

SMT
Simultaneous multi track drifting
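SMT means each physical core presents two hardware threads to the OS, so an 8-core chip shows up as 16 logical CPUs. A quick sketch; `physical_cores` is a hypothetical helper that assumes 2-way SMT is enabled (chips with SMT off would need `threads_per_core=1`):

```python
import os

def physical_cores(logical_cpus, threads_per_core=2):
    """Estimate physical cores from logical CPU count, assuming 2-way SMT."""
    return logical_cpus // threads_per_core

# os.cpu_count() reports logical threads the scheduler sees,
# e.g. 16 on an 8c/16t part ("or 2" guards against a None return).
logical = os.cpu_count() or 2
print(logical, physical_cores(logical))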

Didn't World of Warcraft just get an update to better utilize multicore CPUs? I think Intel is fucked, most games seem to be favoring more than one core now.

fuck you

Attached: ayy.jpg (999x432, 99K)

Reported for antisemitic child porn.

Jesus Christ, how does one giant company go from 800lb gorilla to shit flinging spider monkey in a matter of two years?

Oh yeah, jews.

The real kicker:
You can get a 1950X for the price of a 9900K.
Nevermind the coars, just look at all those sexy PCI-E lanes.

They whole jew world order is crashing and burning, and Intel looks like it will go down faster than nvidia.

8