That 12 core looks amazing, but where the fuck are the 16 cores?

Intel get your shit together, AMD are already sandbagging aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa

Attached: azLb1a4IrPQuyyNHaPm9RihDS5roZXfMIOTl8e5nKkA.jpg (1148x1147, 224K)

The idea that the 16C is being held back to sandbag is just AMD cope.

If you want 16 cores, TR is super cheap right now

They showed the actual 12 core package, user. It's no cope, it does exist.
I could go ahead and buy a 1950X given how cheap they are, but I already have an AM4 motherboard with good power delivery. I'd rather spend more on the CPU than buy a new motherboard + quad-channel RAM + the 16 core chip. We don't even know if the new Zen 2 TRs will fit on current motherboards; I find that slightly risky.

Attached: 1558965505098.jpg (1000x563, 106K)

Shit, is that an all-core boost of 4.25GHz? If that's the single-core boost it's a little disappointing, but still, holy shit, that score is the second holocaust

Attached: Screenshot from 2019-05-28 20-24-06.png (615x322, 45K)

>Another skub thread

Attached: Anime Run.webm (480x360, 895K)

I'm hyped over the 12 and 16 core parts, but it's looking like the 3700x is going to be the 2600k of this generation.
>$330
>matches the 9900k at $170 less
>65W TDP
>only one cute chiplet

I feel the 3700X and 3800X are just differently binned chips, which is why the TDP is so different. The 3700X prioritized the 65W package above all, whereas the higher 105W TDP of the 3800X hopefully implies overclocking headroom.

Currently waiting to buy that R9-3900X. 12c24t of 4.6GHz fury. Gonna see how hard I can push it on a 250w air cooler.

i guess the difference is just a higher all-core boost, that's why the TDP differs

3800x probably just has much higher all core clocks to leverage the higher TDP
are the 3700x and 3800x really just one chip and not 2x4c?

3700x is almost certainly the lower binned model, although it's curious that they're giving the -x sku to a 65w part. it might get a boost from xfr2 (or whatever equivalent they have for this gen) but likely the rated clockspeeds are all you'll get within the 65w package without OC. still beating the 9900k though? it's a steal
the extra 40W on the 3800x for only 100mhz seems a bit curious, but it probably just has a ton of thermal headroom for xfr2 and such, and will almost certainly OC better. it'll likely see speeds well above 4.5 without OC, maybe not with the prism cooler, but with an nh-d15 or clc it'll probably hit 4.6-4.7 on its own

I'm thinking the 3800X is marketed towards gamers/enthusiasts who will be looking to OC. The 3700X however is for content creators, i.e. someone who just wants a chip for photo/video editing or rendering. The X sku implies the chip is capable of XFR/PBO. The 3700X with its 65W TDP feels like the absolute best bang for buck in the entire lineup thus far. Good stock clocks, the X sku will let XFR do its job, but the 65W TDP means the system won't require exotic cooling and probably won't boost beyond what the stock cooler can handle.
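The 65W vs 105W split makes more sense once you remember dynamic power scales roughly with V²f, and voltage has to climb with clocks. A toy sketch; every number here is an illustrative guess, not a measured Zen 2 figure:

```python
# Rough dynamic-power model: P ~ k * V^2 * f, with switched capacitance
# folded into the constant k. All numbers below are illustrative guesses,
# not measured Zen 2 values.

def dynamic_power(freq_ghz, volts, k=8.0):
    """Relative dynamic power at a given clock and core voltage."""
    return k * volts ** 2 * freq_ghz

base = dynamic_power(3.6, 1.00)  # hypothetical 65W-class all-core point
fast = dynamic_power(4.0, 1.25)  # hypothetical 105W-class all-core point

print(f"{fast / base:.2f}x the power for {4.0 / 3.6:.2f}x the clock")
```

Because the last few hundred MHz need a big voltage bump, an ~11% clock difference can plausibly cost ~70% more power, which is roughly the 65W-to-105W gap.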


I thought Intel had the 10 core part ready, but they only revealed the 9900Kys. I don't like it; we might be looking at the end of the year or perhaps even early next year for the 16 core parts.

So far we have only seen the 8 cores shown as single die chips, and the only time we've seen a dual die package, it was the 3900X. Given that there are no quad cores this generation other than the APUs, odds are AMD is either throwing away the dies with only 4 working cores, or saving them for in-between Epyc SKUs.


7nm yields are supposedly at 75% for all cores being good, so odds are they don't even have that many dies with half the cores bad.

>repeating total asspulled nonsense from AdoredTV who demonstrably doesn't know or understand a single thing about defect density and binning
LMAO
AMD is not getting 75% known good candidates per wafer.

I thought 3700x is two 4 core chiplets, while 3800x is a single better binned 8 core

I'd think it would make sense for them to use two 4 core chiplets, but they know everything about how the silicon is coming out and how it performs. The way Su did the presentation left it a little ambiguous since, to my knowledge, we only saw the package with two chiplets. I think they're still two chiplets, just with some cache disabled as well.

LOL

Redpill me on XFR and PBO. I thought that as long as there's still thermal or power budget, an X chip will keep boosting higher. So even if the 3700X is rated 65W, enabling PBO should let it boost as high as a 3800X with the same cooling? Maybe I'm missing something, like AMD actually coding a 65W limit into the 3700X?
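That's roughly the idea, but there is a coded-in limit: the TDP class maps to socket power caps (PPT, plus current limits), and PBO is what raises them. A toy sketch of the concept only; this is not AMD's actual algorithm, and every limit, step size, and model curve below is made up:

```python
# Toy sketch of opportunistic boost: keep raising clocks until a power
# or thermal limit trips. Not AMD's real algorithm; numbers are made up.

def boost(power_limit_w, temp_limit_c, read_power, read_temp,
          f_base=3.6, f_max=4.4, step=0.025):
    """Return the highest clock (GHz) that stays inside both limits."""
    f = f_base
    while f + step <= f_max:
        if read_power(f + step) > power_limit_w:
            break  # power cap: where a 65W-class part stops early
        if read_temp(f + step) > temp_limit_c:
            break  # thermal cap: better cooling raises this ceiling
        f += step
    return f

# Crude models: power rises steeply with clock (toy f^4 curve from a
# V-proportional-to-f assumption), and temperature follows power.
power_model = lambda f: 20 * (f / 3.6) ** 3 * f
temp_model = lambda f: 30 + 0.6 * power_model(f)

low = boost(88, 95, power_model, temp_model)    # ~65W TDP -> 88W PPT
high = boost(142, 95, power_model, temp_model)  # ~105W TDP -> 142W PPT
print(f"88W cap:  {low:.3f} GHz")
print(f"142W cap: {high:.3f} GHz")
```

The 88W and 142W PPT figures are AMD's usual AM4 mappings for 65W and 105W TDP parts. With PBO you lift the power cap and the chip becomes cooling-limited instead; even then the 3700X may not match a 3800X, since it's presumably the worse bin and needs more voltage per clock.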

It is single chiplet only, because the cache configuration is 36MB total: 32MB of L3 and 4MB of L2.

According to this dude youtube.com/watch?v=MkO4R10WNUM&feature=youtu.be&t=242 that is a pretty hefty all-core OC. You would normally expect around a 4.1GHz OC on all cores, which would be fantastic for multithreaded shit. I'd just leave it alone and let XFR2+PBO do its thing desu. Way higher single-threaded boosts while keeping decent multi-core boosts; having ranges (by number of active cores) instead of just two states (single vs. all cores) is pretty cash.

> makes a comparison with a xeon from 2010
> and not even the best xeon from that year
AHAAHAHHA. was this software written by fucking retards to make pathetic AMDrones happy that they bought themselves cancer? amazing.
>b..but AMD use it!
COLOUR ME SHOCKED

user, you OK there buddy?

The 3700x looks really good coming from my 2700x, especially that TDP. My guess is that it's PS5 leftovers, hence the small TDP. Fine by me, it's looking to be quite good.

Nowhere did the 3700x beat the 9900k.

It might. With a little manual OC. Until we see how XFR performs, nobody but AMD knows how much headroom there is.

Attached: 1528964800915.jpg (681x522, 83K)

AMD is still releasing a new threadripper lineup. We'll see 16c on that (probably up to 24 or 32 cores).

They're definitely getting high yields. That's the biggest advantage of AMD's design - use smaller dies with high yields to crank out high core count chips.
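This is the part the yield math actually rewards. A sketch of why small dies win per wafer, using the same naive Poisson model; the D0 of 0.4/cm² is assumed, ~74mm² is the reported chiplet size, the monolithic 64-core die is hypothetical, and edge losses are ignored:

```python
import math

WAFER_AREA_MM2 = math.pi * 150 ** 2  # 300 mm wafer, edge loss ignored

def good_dies(area_mm2, d0_per_cm2=0.4):
    """Gross dies per wafer times Poisson yield (crude approximation)."""
    gross = WAFER_AREA_MM2 // area_mm2
    return int(gross * math.exp(-(area_mm2 / 100.0) * d0_per_cm2))

chiplets = good_dies(74)        # ~74 mm^2 chiplet (reported size)
monoliths = good_dies(74 * 8)   # hypothetical 64-core monolithic die

print(f"64-core packages via chiplets: {chiplets // 8}")
print(f"64-core packages monolithic:   {monoliths}")
```

Under these made-up numbers it's roughly 88 packages vs 11 per wafer, and that's before counting the partially defective chiplets that get harvested as lower-core SKUs instead of thrown away.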

Yeah, but on TR4 or another socket altogether? I wouldn't feel comfortable holding onto a TR system unless I did HEDT builds from scratch every year; I feel those are at risk of a socket change soon. Don't get me wrong, they're definitely the best non-server chips to get, like the god tier 2950x, but AMD has only promised longevity for AM4, not TR4. I'd rather keep AM4 with the x370, upgrade to a 16 core, and then splurge all in on Zen 3/4 TR once it has more threads, possibly 3D-stacked DRAM, DDR5, and PCIe 5.0.

Attached: merchant.jpg (935x853, 193K)

that's an OC'd 7940X

good luck coping

Attached: Untitled.jpg (376x759, 80K)

Mark my words: Zen 2 TR will go up to 64 cores, just like Epyc, while ceding 12 and 16 cores to regular AM4. Announced this September, with availability in October. Motherboard manufacturers have been busy getting the new X570 ironed out for the mainstream launch and only recently started on the new TR4 boards, which is why it's coming later; only like 3 to 4% of AMD's sales go to TR anyway.

Don't worry about new sockets from AMD until DDR5 is widely available. So after Zen 3. Remember, DDR5 is only going to be used on expensive laptops at first, hence why Intel is more invested in it. Meanwhile, AMD is going to milk DDR4 right up to 5GHz on the desktop.

guess those 7nm yields are shit, even on chiplets.

truly retards are to intel what flies are to shit

>The idea that the 16C is being held back to sandbag is just AMD cope.
>If you want 16 cores, TR is super cheap right now
If this benchmark is real, you would be crazy to buy older TR models. I recall they massively increased the FP hardware on Zen 2; assuming the benchmark is real, IPC looks like +30% to +50% for 3D rendering, and the 32 core Threadripper is only 20% faster. Most of the tasks people want 16+ cores for are generally FP-heavy as well. It's annoying in general that both Intel and AMD insist on releasing HEDT products later.
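Back-of-envelope on that claim, with assumed inputs: the 1.2x score ratio is the claim above, the 4.2/3.4 GHz all-core clocks are guesses, and this pretends perfect core scaling with no memory bottleneck:

```python
# If a 32-core Threadripper only scores 1.2x a 12-core Zen 2 part,
# how much faster per core is Zen 2? All inputs are assumptions.

cores_old, cores_new = 32, 12
score_ratio = 1.2            # old part "only 20% faster", as claimed

per_core_gain = (cores_old / cores_new) / score_ratio
clock_gain = 4.2 / 3.4       # assumed all-core clocks, new vs old
ipc_gain = per_core_gain / clock_gain

print(f"per-core throughput: {per_core_gain:.2f}x")
print(f"implied IPC gain:    {ipc_gain:.2f}x")
```

Taken literally this implies ~2.2x per-core throughput and ~1.8x IPC, which is well beyond +50%, so either the benchmark, the perfect-scaling assumption, or the guessed clocks are off somewhere.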