Zen2

youtube.com/watch?v=PCdsTBsH-rI
It's fucking happening.

Attached: 1466920344847.jpg (1024x1297, 151K)

Fuck me

Attached: Screenshot from 2018-12-04 17-12-35.png (1920x1080, 1.14M)

why does he speak so funny? is he retarded?

he is a scottish pajeet or something. it used to bother me immensely but his content is too good to ignore.

FERRERO ROCHER GANG REPORT IN

WHAT WILL YOU SNACK AND DRINK ON WHILE WATCHING ADORED'S VIDEO???

Attached: 1529311835737.jpg (757x627, 87K)

Say it with me:

INTEL IS FINISHED AND

Is it just me, or do the GPU leaks seem way more ridiculous? How am I supposed to believe these prices? Or am I just supposed to assume they are not going to remain true even for a second?

cyanide

I believe them. AMD can't take the risk of going full-Xbox huge dies anymore; Nvidia just gobbles up everything as if everyone were an Appledrone. It's also 7nm; Polaris was 14nm, so why is it so ridiculous?

>it takes 16 cores to be a chiplet now
>tfw only 4 cores
This image needs to be updated

Attached: 1543773173651.png (915x678, 353K)

Salt and Vinegar chips with 700mL water bottle.

Attached: bloody nose thumbs up Luigi close to death.jpg (341x313, 20K)

Holy fucking shit that 3850x looks absolutely insane. Feels bad that a x370 board won't be able to handle it, but I'm definitely upgrading to that delicious 3700x.

Attached: smile of approval - friend A.png (656x448, 505K)

>REMENBER

Attached: CANT NO COMPETE.png (714x748, 313K)

kek this. he gets away with it 100%. great content

>ryzen 9 3850x 16c/32t 4.3GHz/5.1GHz
holy fucking shit, do want

Attached: 1531756257155.gif (498x482, 3M)

>increasing cores AND clocks at the same time
HOW ARE THEY DOING THIS

kek, this is the best.
The fuck is Intel going to do with fucked fabs against infinite yields?

Fake news. Don't fall for it

Attached: proofs.jpg (576x463, 36K)

3600X for me it is.

Even here he was labeled an AMDrone; good thing people are accepting him more. Best """tech""" content by an immensely large margin.

I mean, sure, he praised Nvidia a lot over the new technology, but if the new AMD GPUs are going to be that good for that money, how much longer will people be able to meme about "muh Nvidia mindshare"? So basically he might as well have been praising Nvidia for cool-looking but currently useless technology, while AMD is going to be offering crazy good deals on both the CPU and GPU side.

They better release them fast then. If these leaks are true, they might be coming out in October depending on how they prioritize OEMs, laptops, etc. I'm itching to upgrade my 480, and that last Navi GPU looks mighty fine.

>mfw needing a new motherboard for that
Might as well grab Threadripper at that point.

Attached: Sad Holo.jpg (1920x1080, 87K)

Node shrink, uArch improvements and switching arch to chiplet design with amazing yields.
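The "amazing yields" half of that can be sketched with a standard Poisson die-yield model; the defect density and die areas below are illustrative assumptions, not AMD's real numbers:

```python
import math

def die_yield(area_mm2, defects_per_mm2):
    # Poisson yield model: fraction of dies on the wafer with zero defects
    return math.exp(-area_mm2 * defects_per_mm2)

D0 = 0.002  # assumed defect density per mm^2 (made up for illustration)
mono = die_yield(700, D0)     # one big monolithic die, ~700 mm^2
chiplet = die_yield(75, D0)   # one small ~75 mm^2 chiplet

print(f"monolithic 700 mm^2 yield: {mono:.1%}")   # ~25%
print(f"chiplet     75 mm^2 yield: {chiplet:.1%}") # ~86%
# Small dies come off the wafer defect-free far more often, which is why
# several chiplets plus an IO die beats one huge die on cost.
```

Under these assumed numbers the small chiplet yields well over three times as often as the big die, before even counting the option of binning partially defective chiplets into lower-core SKUs.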

He seems to go too far with some of his speculation, and it's not grounded in reality sometimes. Like chopping up the Epyc IO die for desktop: you can't just take a saw to a die and expect it to work.
This video also seems to miss out any 65W parts on the 3700, for example, and a 40 CU Navi would be too big to go on AM4. The 'G' CPUs are likely to be a 20 CU chiplet or a new APU die with 8 cores + 20 CUs plus laptop amounts of IO. Either one needs the new design, meaning they launch late 2019. I think an APU SoC die is more likely, as it's a direct replacement for RR, there have been no leaks for a 20 CU Navi yet, and it simplifies the supply chain with FP4 BGA infrastructure for laptops.

Flagged for antisemitism.

6700k owner here and I'm salivating at that 3700x. Might even try to save up for the 3850x.
The Navi parts look good, but I already have a Vega 64 so I'll keep waiting on that front.

I'd get the 3850x and hold onto it for the next 5 years given that you need a new motherboard.
I have a first gen AM4 motherboard, so 3700x for me.
>tfw cores are not a power of 2

Reminder.

Attached: 1539898335826.jpg (700x5000, 1.83M)

Attached: 1540557837755.png (960x720, 972K)

AMD is increasing core count so fast it's cutting into GPU compute.
At 1024-core CPUs, GPUs lose all advantages except perhaps energy efficiency. Hardware architecture could come full circle, back to the time before 3D accelerators, with everything done on the CPU, just like in Wolfenstein 3D times.

GPUs in the end are just several cores (Titan RTX has 72) with very wide SIMD instructions (64-wide in Nvidia's case), with some floating-point ops having yet another layer of SIMD (mostly 4, so 72x64x4 in total).
amd64 SIMD is roughly like that second layer of SIMD on GPUs, but each core is ~5x faster (clock+cache) even for embarrassingly parallel compute. For anything that requires diverging code, GPUs become useless very fast. Developing for CPU is also much easier.
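The lane arithmetic in that post works out like this, taking its numbers at face value (the 16-core/AVX2 CPU and the ~5x per-core factor are assumptions from the post, not measured figures):

```python
# GPU: 72 "cores" x 64-wide SIMD x 4-wide inner op (the post's numbers)
gpu_lanes = 72 * 64 * 4
print(gpu_lanes)  # 18432 lanes total

# CPU: e.g. 16 cores x 8-wide AVX2 (256-bit registers / 32-bit floats)
cpu_lanes = 16 * 8
print(cpu_lanes)  # 128 lanes

# Raw lane ratio, then discounted by the ~5x per-core speed claimed above
print(gpu_lanes / cpu_lanes)        # 144.0
print(gpu_lanes / (cpu_lanes * 5))  # 28.8 "effective" ratio, per the post
```

So even granting the post's own estimates, the GPU keeps a sizable throughput lead on non-divergent work; its point is that the gap shrinks fast once code diverges or core counts climb.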

Doesn't sound like a bad idea. I can't see 16c/32t with a potential 5ghz all core OC being beat out on the desktop too soon.

Is it okay if that 5+ Ghz boost clock gives me a boner?
PS: The 3-series is the last one for AM4, right?

InGaAs when?
Graphmeme when?
Optical when?
Vertical stacking when?
Active interposers when?

holy fuck why would you not buy a 3850x AAAAAAHHHH HOLY SHIT

Attached: 1539745731447.png (500x543, 202K)

Do I understand that right that these are just guaranteed boosts and CFR can boost them even further?

*XFR

I hate how they removed the jokes from this image

when the diminishing returns for "conventional" sand and lighting kick in and there's no more trickery or clever hacks to keep it on life support for another decade

lightning*

Attached: 1534737506611.png (915x678, 188K)

Wasn't Rome meant to compete with Intel's 10nm? lmao intel wtf are you doing

Huh, an octa-core APU.

amd 4850/4870
amd 5000 line
amd 6000 price cut line
amd 7970 being better than a titan
the 200 series up till the 900 came out

the entire time AMD was better than Nvidia, and besides the 200 series, also cheaper; with the 200 series they were $50 more.

and keep in mind, this was FLAT OUT BETTER, not better in x price range.

On paper. The drivers were trash at the time

Think of it: how much does an 1800X cost right now?

drivers for the 4000 were great, had one and never saw an issue
drivers for the 5000 were also great, had one and never saw an issue
6000 was an extension of the 5000 but brought cost down
the 7000 was great, and only got significantly better
the 200 were also great, and only got better as time went on. having a 280x myself, and my brother with a 290x and a 290

"King of corelets" should be at 4c/4t. Also add something like this at 8c: "Current cutoff for corelet status. Even consoles have this many cores. Unless you have over 16 GB of RAM You're Fucked"

I should mention the driver meme dates to GPUs before the 4000 era, and while the drivers were slow to come out, they were never needed; meanwhile Nvidia put out 3 or 4 GPU-destroying drivers, so I'd rather have no driver than a broken card.

fuck, i might get the 12 core for that splurge factor on a cpu that can raep anything

Careful though. Remember the threadripper issue with game mode?

no, I read there was more support after that

true. i remember but i didnt read about it extensively bc i only have the 6 core. let me see...

Almond polvorones and soft turrón.

If this is true Intel is fucking DONE.

There's no way they could compete with this on current arch.

yea ill stick with 3600x after reviewing or maybe even 3600

>bought a 2600 on black friday
>still isn't here yet despite all of my other components arriving last week
>see this thread

Attached: 1358399438831.png (90x50, 6K)

He didn't
>Just Wait

I feel your pain, even though I have an R7 1700. But it's really encouraging to see those boost speeds. Really bringing down Intel's supremacy in core GHz.

Zen 1 will have been out for two and a half years by the time we get these chips. What exactly is there to feel pain about?

The pain of not having enough incentive to switch from 1700 because it's awesome enough. But knowing there's something even better I could get for reasonable money (in the near future).

Note that I did switch from a 4690K in November 2016. If I had waited 1-2 years, the upgrade would have been so much better.

An upgrade from a 1700 to a 3700x would be very noticeable.
I'm running the 1700 and love it, but I'm not going to lie and say it's optimal for emulating. It's great for streaming and encoding, but I'd love to not only have more cores, but more than a GHz uplift.

Well, you will eventually get Zen 3 or 4 or whatever. At least until then the competitive prices should most likely keep going, so AMD can actually eat into the market. Also, depending on what you are using the CPU for, an upgrade can be worth it.

>r5 3600X
>8C/16T 4.0/4.8 GHz
>$229
Looks like we have a new 2500k.

KEKED

*fucks you*

>3700X
MUH DICK

I hope my X470 can survive a 3700X, or I'll wait until Zen 3.

say it with me for the ...th time
>intel is finished

Attached: 1523961147184.jpg (510x546, 193K)

The best part is that Ice Lake has been ready for quite a while now; they just can't fab anything bigger than 2c/2t with a dead iGPU.

i wanna fug kokoro

Jesus Christ, Amada. I have a 1600 I don't want to upgrade already.

Attached: 1482677344019.jpg (426x341, 99K)

We told you to wait

Attached: 11387.png (1200x1200, 670K)

I want Madoka to put that finger in my ass while I fuck Homura's AI YO

3850x here I come since I finally have a decent job now.
It would be such a fucking upgrade from an i5 4650

The prices aren't going to be that good; the GPU miners are going to inflate them. Normal users will be stuck on 14nm for a while.

them 3600s and 3700s sound doable for my update

isn't gpu mining ded tho?

Margins are extremely thin or negative in the current crypto bear market, yes. 7nm cuts power usage per compute in half. You do the math.
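Doing the math that post asks for, with made-up but plausible numbers (the daily revenue, power draws, and electricity price below are illustrative assumptions, not real mining figures):

```python
# Illustrative mining economics: same hash rate, half the power on 7nm
revenue_per_day = 0.60          # $ earned per card per day (assumed)
kwh_price = 0.12                # $ per kWh (assumed)
power_14nm_w = 220              # card draw on a 14nm-class node (assumed)
power_7nm_w = power_14nm_w / 2  # "cuts power per compute in half"

def daily_profit(watts):
    # revenue minus 24 hours of electricity at the assumed rate
    cost = watts / 1000 * 24 * kwh_price
    return revenue_per_day - cost

print(f"14nm: {daily_profit(power_14nm_w):+.2f} $/day")  # slightly negative
print(f" 7nm: {daily_profit(power_7nm_w):+.2f} $/day")   # back to positive
# Halving power can flip a thin or negative margin back into the black,
# which is why 7nm cards could revive mining demand in a bear market.
```

The exact dollar figures don't matter; the point is that when margins hover near zero, a 2x power-efficiency jump is enough to flip the sign.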

>the gpu miners are going to inflate the prices
Nah, plenty of the big guys are shutting down their farms. Just hope it crashes some more after cashing out at 10k$+

I don't care if Nvidia or AMD makes the better cards, but I really, really, would like g-sync to die.

Read 7nm changes cost-benefit drastically.

Just buy a 3600 when it comes out, my man. You will have spent $300 for two processors, the first one comparable with Intel's offering, the second one better, and you will have an extra leftover to sell on ebay/give to a friend/keep as a comfy backup. All without any time lost to just wait(tm)ing and without any shekels given to the bad guys.

>6c/12t 4ghz for $100

Sweet tap dancing Christ.

Nvidia are way too stubborn for that. It took them this long to open source PhysX, just one of their gimping feces abortions.

Will intel drop apus once they go with chiplets and glued dies?

i cant fucking understand this nigger

>tfw ESL and can understand him perfectly
He's not even that bad, his earlier videos were even worse!

THREADRIPPER 3.

3950X 32C/64T
3990WX 64C/128T

fuck dude

>nigger
He's a 100% Scotsman, you dumb fuck. A literal fucking Macleod.

Food: red plums (NOT umeboshi), oranges, bananas, Kitkat Senses (European, Caramel Cappuccino).
Drinks: Chinese "Feng Huang Dan Cong" tea boiled correctly with clean mineral (non-carbonated) water. *sips*

Attached: AdoredTV.jpg (500x500, 57K)

NEXT HOLOCAUST

3700X and 3800X actually seem like much better binning due to higher BASED clocks, without the "MUH MAX TURBO" memery. Basically, these guarantee at least 4.2~4.4GHz stable across all 16 cores & threads at all times 24/7/365 while being very efficient and priced more reasonably than the 3850X or THREADRIPPER 2. It's "1700/X VS 1800X" all over again, essentially. The 3850X is a meme offering; 3700/X and 3800/X is where the ACTUAL, REAL value is at. And also that 3600G/GX APU... holy fucking shit. Just HOLY FUCK... imagine an ultrabook-sized, super slim, super light notebook with that monstrosity inside... the literal GENOCIDE of the low-tier GPU segment.

>bought an x470 board
>didn't pull the trigger on a 2700
dodged a bullet there

>Asuka is RED
>Asuka is #BETTERED
>Asuka was Ruby ALL this time

>why would you not buy a 3850x
Worse binning than the 3700/X and 3800/X, going by base clocks. Higher, unjustifiable price. Less efficient in W per dollar. Basically a literal 1800X 2.0 all over again. Get the 3800X or 3700/X; don't be a meme victim.

...i-is that who I think it is?

>we have a new 2500k
It's already here, it's called 2600X.

There is absolutely nothing wrong with the 2xxx series.