THE MADMAN INTELAVIV DESTROYER DID IT AGAIN!

youtube.com/watch?v=qgvVXGWJSiE

IT'S OUT, BOIZ!
GET THE FUCK IN HERE!

Attached: 1543532926643.jpg (184x184, 34K)

Other urls found in this thread:

youtu.be/DJBu4_LZUpk?t=273
anandtech.com/show/11717/the-amd-radeon-rx-vega-64-and-56-review/8
twitter.com/AnonBabble

AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA

Attached: 1543594619197.png (989x667, 472K)

That picture reminds me of a meme. Please help.

SAVAGE

AAAAAAAAAAAAAAAAAAHHHHHHHHHHHHHHHHHHHHHHHHHHHHH

Attached: chrome_6TePU7cYn0.png (102x68, 8K)

>DIELID 'DIS
Kek

alright guys, how's it goin'

The speculation here is so detailed that it's pretty much guaranteed not to be exactly what AMD is doing, but it does sound like a solid potential plan. I'm starting to get pretty excited for future laptops and the PS5 desu

kys jim

>14nm
>die
is that a jab at how Intel's current process is kill?

>it's pretty much guaranteed not to be exactly what AMD is doing
Adored was right on a shitload of things in the past. Especially EPYC 2 using a 14nm I/O die with 8 chiplets. He speculated on that several months before Next Horizon was even announced.
He was also right on the RX 590 and the 1060 Ti (GDDR5X cut-down 1080).

He was right on Vega being shit almost a year before its release, too. He pissed off a good amount of fanboys lmao.

Brainlet here, does this mean GPU's are inching towards the wayside?

>Vega being shit almost a year before its release, too
He was wrong, though:

Attached: GIMPWORSE VS FINEWINE.png (736x736, 1.55M)

>lmao
filtered
no

>filtered
Seething Vega buyer.

Attached: stocking-smug.jpg (500x500, 36K)

>implying

Stop responding to me.

>no
So it's all about AMD's new cash cow that will allow them to afford increased CPU production and finally squash Intel? The video is too long and I'm on break at work.

Attached: 46876585768.png (262x315, 92K)

No, the I/O die is 14nm because the process is extremely mature, cheap as hell, and I/O components in general don't need to be high performance. They aren't clock dependent, so they can go sit on an older, larger process.

In your dreams, sweetie.

Vega is GCN, user. Vega is basically an upgraded Polaris. It's a 4-wide, 64 CU part. All the driver optimizations gained on Polaris carry over to Vega, and Vega was shit because it was a half-implemented part with extremely unfinished drivers.

Vega over time became a very competitive part, and that's what the graph in question notes. It was really bad out of the box on launch day.

>post yfw new AdoredTV video

/FERRERO ROCHER GANG/ report iN! What are you drinking with your chocolate candies? Pepsi max here

Attached: 1541010030890.gif (718x404, 66K)

That's, that's not how it works.

>Vega is GCN
And? There's nothing wrong with GCN, it's solid.
It has some restrictions, yes, but it's light years better than the fucking CUDAtrash.

>Nothing wrong with GCN

It can't scale beyond a 4-wide front-end. It's maxed out at 4096 shaders forever.

Attached: 1528278865072.png (501x462, 77K)

Food - red plums (NOT umeboshi), oranges, Gansito.
Drinks - Chinese "Feng Huang Dan Cong" tea of highest quality boiled with cleanest mineral water. *sips*

Attached: CREATURA.jpg (1479x651, 540K)

>It can't scale beyond a 4-wide front-end.
It can, it just takes more expensive engineering.

Again, I've said it has restrictions, but it's been rock-solid for several generations, so there's nothing wrong with it.

GCN has been absolute crap and a cancer upon the GPU industry. It runs too hot, consumes too much energy, and is most definitely limited in its architecture. AMD has been too cheap and too lazy to replace it, and it needs to go sooner rather than later.

Attached: 1503358109734.gif (695x392, 2.24M)

You must be at least 25 to post on Jow Forums, kid.

we know zen 2 is gonna be good, i'm really interested in its pricing tho, i wonder if amd will go full jew mode with it

The I/O die is a complete game changer. They can literally do anything with it at any scale. That's insane.

Not an argument, and the attitude that GCN is fine when it clearly is not won't make the market competitive again.

If it outperforms Intel's current overclocked housefires then it might. But the chiplets are incredibly tiny, and they can probably sell 7nm Ryzen for pretty cheap if their plan is to gain mindshare among consumers

>vietnamese accent

Attached: 325bd85f71246731e8450415f8566c493be9c823f99551f8814db4e74a7ff00c.jpg (618x530, 76K)

>we know zen 2 is gonna be good
Zen+ is good. Zen 2 is FUCKING AMAZING, PURR-FECT!

>He's a pure Scotsman

>More expensive engineering

AMD does not have the money nor the interest in making 600-700mm^2 dies with 20% yields per wafer. That's also, frankly, outright brain dead faggotry.

>rock solid for several generations
>several generations

GCN launched with the HD 7xxx series. It's been around for 8 generations: 7870, 8870, 280X, 290X, 380X, 480X, 580X, and RX Vega.

Nvidia in turn went from Kepler to Maxwell, which was a new uArch; Pascal was a die-shrunk Maxwell; and Turing is a brand new uArch again. So Nvidia has launched a total of THREE new uArches in the time that AMD has been sitting on GCN and doing incremental upgrades to it YoY.

It's completely maxed out in terms of capability. The back end of GCN is perfectly fine; it can do an immense amount of computational work, and the performance there is just fine. But the front end, when it comes to gaming, is shit. It's been shit for the last 4 years, and will continue to be shit forever because it can't scale worth dick. And the only way to solve it is by making a Turing-sized 700mm^2 die, which again, as I told the guy above, is fucking retarded.

GCN needs to go, and something completely new needs to take its place.

Lying this hard and pandering to internet racists to defend AMD superiority delusions. And what's supposed to be wrong with Vietnamese people, exactly?

>with 20% yields per wafer
TSMC's 7nm AMD chiplets are already at 90% PERFECT yields, did you even watch the fucking video you dumb cockmongling fuckshit?

Are you really this fucking retarded, or do you not understand how yields work?

>7nm
>90% yields
nah

Watch the video, cocksucking fuckshit.

Ok, so you don't understand yields, and also didn't read the whole paragraph.

700mm^2 dies
do not
have
90%
yields

Watch the video, fucking imbecile.

Extremely high yields only happen when the die size is tiny, you brain dead glue huffing piece of shit. 700mm^2 die sizes would drop that yield down to like 20-30%.

LEARN2READ YOU COCK GOBBLING FAGGOT.
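For what it's worth, both sets of numbers can be true at the same defect density. A minimal sketch with the textbook Poisson yield model; the defect density D0 here is an assumed, illustrative figure, not TSMC's actual number:

```python
import math

def poisson_yield(die_area_mm2: float, defect_density_per_cm2: float) -> float:
    """Fraction of fully working dies under the simple Poisson yield model
    Y = exp(-A * D0). Real fabs use Murphy / negative-binomial variants,
    but the trend versus die size is the same."""
    area_cm2 = die_area_mm2 / 100.0
    return math.exp(-area_cm2 * defect_density_per_cm2)

D0 = 0.2  # assumed defects per cm^2 for an early 7nm node -- illustrative only

for area_mm2 in (72, 180, 700):  # chiplet, midrange GPU, big monolithic GPU
    print(f"{area_mm2:>4} mm^2 -> {poisson_yield(area_mm2, D0):.0%} fully working")
```

At that assumed D0, a ~72mm^2 chiplet comes out around 87% while a 700mm^2 monolith lands around 25%, which is roughly the range the two posts are shouting at each other about.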

Seems like interesting stuff but I have to focus really hard to understand with the accent

Power usage and heat only matter in mobile

I have never had an issue with his accent and I am a southerner with a strong south England 'Queen's English' accent. It must be a burger issue. Maybe burgers only understand 'yee haw!' language or something.

He says multiple times in it that Navi won't be multi-die, numbnuts.

>Le 56 minute long video
Get this goblino outta here.

tldr navi is another budget chip, amd still can't compete with xx80 ti ever, only cares about console shit etc.

Probably because you're exposed to it a bit more than me

Watch to the end, you dumb cocksucking shit.

Will we be able to get Vega 64/1080-tier performance for $250 is all I want to know

The closest I have been exposed to it is Rab C. Nesbitt. See if you can follow all of this conversation.
youtu.be/DJBu4_LZUpk?t=273

one can only hope

Wrong. Navi is the entry level of tomorrow, but is expected to offer performance between Vega 56 and Vega 64 (or 1080/Ti) for ~$250.

>only cares about console shit

Because back in the day, when their cards were faster, had better tech, AND were more power efficient than Nvidia's, consumers STILL predominantly bought Nvidia GPUs over AMD. This happened for several generations, at which point leadership up top decided "well, fuck PC gamers, they're clearly fanboy faggots who love sucking Nvidia dick" and chose to pursue semi-custom, which guaranteed constant revenue YoY for 5-7 years without fail.

If I was running a business and saw the same shit happen, I'd abandon the high and ultra range GPU market too.

>offer performance between Vega 56 and Vega 64 (or 1080/Ti) for ~$250
[citation needed]

You forgot: "but is EXPECTED"

Stop cherry picking, asshole.

Better 4K 60 FPS console for $500 than $1300 GPJew from Novideo.

He's describing a hypothetical cheap ~200mm^2 PlayStation SoC at the end, idiot, not a giant desktop GPU to compete with Nvidia's top end.

>Better 'a'
Seriously though. For gaymen, 4K 60 FPS with HDR and graphics set to 'high' rather than Ultra (which nobody can tell the difference with anyhow) will be suitable for most people's needs. Only cawdoody and CS:Gay fanboys need over 60 FPS, and even that is debatable.

Intlel getting pulverized by a scottish youtuber LMAO

Attached: 1536694146733.png (1200x800, 164K)

4K60 @ High graphics settings on consoles with HDR is good enough. Especially if you're sitting 8-10 feet from the screen on a couch.

Even here, in some trolls' best case scenarios, a much larger GCN chip is barely beating Maxwell. AMD punted on making a new arch because the Maxwell and Pascal performance jumps were so huge. Adored is getting a hard-on because maybe Jewvidia overreached and went all in on RTX for a cycle, so AMD can catch up if/when its post-GCN arch is good, Turing doesn't show up, 7nm yields mature, etc.

>a much larger GCN chip is barely beating Maxwell

Attached: 674858568.jpg (1265x1416, 655K)

The fundamental flaw with GCN is that it can't scale beyond a 4 CE front-end configuration. It's how the uArch is designed: 4 CEs × 16 CUs each = 64 CUs, at 64 shaders per CU = 4096 shaders.
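Spelling that arithmetic out (the topology numbers come from AMD's public GCN block diagrams; this is just the multiplication, not anything from the video):

```python
# GCN front-end ceiling as described above ("CE" here = shader/compute engine).
ENGINES = 4          # GCN's workload distribution tops out at 4 engines
CUS_PER_ENGINE = 16  # 16 compute units per engine in the biggest parts (Fiji, Vega 10)
SHADERS_PER_CU = 64  # 64 stream processors per compute unit

print(ENGINES * CUS_PER_ENGINE * SHADERS_PER_CU)  # 4096 -- the hard shader cap
```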

Maxwell & Pascal jumps honestly are irrelevant, because the PC market at large told AMD "you could make a GPU that's 5000x superior to Nvidia's, and give it to us for free, and we'd still buy Nvidia anyway." It was an unwinnable situation. It's NO SURPRISE that AMD abandoned the high end to pursue markets where they can make money consistently, reliably, and for the long term.

And it's also why Lisa took the majority of the RTG engineers and focused them on semi-custom with Navi for Sony & the PS5, with the rest going back into Zen.

72mm^2 8-core Zen chiplets, that's even smaller than the usual 4-core Zen CCX

but real 4K only, not some ugly checkerboard shenanigans. 4K60 is plenty for something like RPGs, very slow-paced shooters, and literal walking simulators like Death Stranding (assuming it's a PS5 title desu) or Red Dead Redemption 2.
fingers crossed both Sony and AMD can pull it off. for all those sweet exclusives it might actually be worth buying a gayman console

>Vega56 after many many months of driver optimization vs Vega56 @ launch; anandtech.com/show/11717/the-amd-radeon-rx-vega-64-and-56-review/8

the other user made the point that 700mm^2 dies can never have a high/90% yield rate, salvageable or not.
chiplets on the other hand could have high (though I doubt >90%) yields

72mm^2 is roughly 50% the total size of the Zen1 die, if you exclude everything that would otherwise go into the I/O die.

Two Zen2 chiplets together are roughly the dimensions of a full 8c/16t Zen1 die. Which means, with 7nm chiplets, AMD could theoretically (if they decide to do this, and I think they should, to completely buttfuck Intel for generations to come) launch an R7 3800X that's a 16c/32t part at 3.8GHz base and can all-core turbo to 4.5GHz.
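For a sense of scale, a quick sketch with the standard dies-per-wafer approximation (300mm wafer; the die areas are the ones thrown around in this thread, the formula is the usual first-order one):

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """First-order approximation: gross wafer area over die area, minus an
    edge-loss term for the partial dies around the rim."""
    radius = wafer_diameter_mm / 2.0
    gross = math.pi * radius**2 / die_area_mm2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2)
    return int(gross - edge_loss)

print(dies_per_wafer(72))   # ~900 chiplet candidates per 300mm wafer
print(dies_per_wafer(700))  # ~75 big monolithic candidates, before any yield hit
```

Multiply those counts by a yield curve like the one sketched earlier and the economic gap between chiplets and monoliths gets even wider.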

Why do you retards just make shit up like this?
The 7nm chiplet is larger than a 14nm CCX, substantially.

A 70mm^2 die isn't small for the 7nm node either; 5 billion+ xtors could fit in a 70mm^2 die. TSMC's 7nm node has more mask layers, more quad patterning, and it hasn't been in volume production very long. I doubt Apple saw over 30% for their 7nm SoC.
It would be a miracle if AMD were seeing 50% good candidates per wafer mid next year. A miracle.

good thing AMD can bin/reuse dies with defective cores for the lower end. but i think the initial yield is very low for 7nm, as you said

Attached: 1541538120178_0.jpg (818x693, 91K)

>GCN needs to go, and something completely new needs to take its place.
It's an ISA.
Does x86 need to go?

Yeah, it needs to be redesigned from the ground up like Zen was. Next-gen GPU arch needs to be to GCN what Zen is to Bulldozer.

AMD need to change their marketing to something honest.
'AMD. They're boxy. But good.'

Zen is still x86.
Are you a dumbass or merely pretending?

Attached: 1.png (775x287, 15K)

Can anybody give a summary? I'm not watching almost an hour of that awful accent.

He doesn't understand the power of FINEWINE

>Intel this buttblasted.

That's not how memory PHYs work.

Zen's a new uArch, dipshit. GCN needs a redesign. Reading's clearly a gift not given in your family.

AMD products are always rough at launch.

AMD, with chiplets & the I/O die, can tackle Intel's 300M-chip fab capacity by using every last possible defective and partial die a wafer can produce, with near-100% utilization, across all product ranges, courtesy of speed binning and elder-god-level engineering, all focused on the idea of "we have X dollars to spend, how do we completely obliterate Intel on that budget?"

With chiplets & the I/O die they can, at yield, tackle Xbox Scarlett, PS5, enterprise, HEDT, desktop, mobile AND embedded markets using a single design. A level of reach that neither Intel nor Nvidia has.
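As a sketch of what "every last possible defective and partial die" means in practice (the thresholds and tier names here are hypothetical, purely to illustrate salvage binning, not AMD's real SKU ladder):

```python
# Salvage binning: map a chiplet's count of working cores to a sellable tier.
def bin_chiplet(working_cores: int) -> str:
    if working_cores == 8:
        return "full 8-core bin"
    if working_cores >= 6:
        return "6-core bin (defective cores fused off)"
    if working_cores >= 4:
        return "4-core budget bin"
    return "scrap"

for cores in (8, 7, 6, 5, 4, 2):
    print(f"{cores} working cores -> {bin_chiplet(cores)}")
```

The point is that a die the Poisson model counts as "defective" still sells in a lower bin, so effective revenue per wafer is much higher than the raw perfect-die yield suggests.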

All this time Intel and Nvidia have been jousting against each other on horseback. SUDDENLY, AMD enters the scene, with the largest bulge in the armor any fighter has ever seen in their entire existence; and they whip out a cock so colossal that the panties of every lady within 80,000 light years are flooding with an intensity so great it could fill oceans in minutes. Intel & Nvidia both turn to AMD, ready to drive their jousts into its body. AMD smirks, reaches behind itself and SUDDENLY, 7 different joust spears cover the entirety of AMD's attack surface.

Intel is shitting a fucking brick, you can smell it from across the colosseum. Nvidia is enraptured by 1 sample ppx ray tracing and the reflection of AMD's 7 joust spears across its own armor and its and Intel's own joust spear; completely oblivious to the sheer fucking destruction AMD is about to bring to the fight.

By 2020, the entire CPU and GPU market will have been turned on its head. ZEN, coming full circle; AMD returns to the top.

Vega is newer than a 1080 and a larger chip than a Titan X, shill. The AMD x90 series used to cost $500-700 and competed directly with Nvidia's x80 series; GCN got so BTFO by the Maxwell arch it's been reduced to matching up with Nvidia's 60 series in the $200-with-2-AAA-games-included bargain bin.

blaming the consumers is retarded, user. go run any business that way, stick your fingers in your ears, and keep repeating how good you know you are while the market tells you otherwise.

Retarded buttmad noVideot detected.

>Nvidia is enraptured by 1 sample ppx ray tracing and the reflection of AMD's 7 joust spears across its own armor and its and Intel's own joust spear
Kek

His own video speculates Navi is another same-old-shit GCN generation on 7nm. Pretty much zero chance it's some double performance leap if that's the case. Sony will hope you get hyped about "7 teraflops, wow!" marketing and forget the One X will have existed for 2 years at that point and cost $250.

>Vega is newer than a 1080 and a larger chip than a Titan X,
But on the Vega 56 only 87.5 percent of the shader and texture mapping units work. If I multiply the die size by 0.875 I get 425.25mm^2 :^)

>The AMD x90 series used to cost $500-700
Vega 64/56 are this gen's x90 and x80 (or xx70 and xx50) cards, from before they made their lineup confusing.

Except the 3870 & 4870 were better than their Nvidia counterparts. Ageia PhysX ran better on AMD cards. Nvidia sold more. 5870, same thing. 6870 and 7870, AGAIN same thing. R9 290X, SAME FUCKING THING. 390X, same fucking thing.

Nvidia sold better despite the competition having better GPUs. The True Scotsman has a 30 minute video detailing all that, and showing what happened with the market.

>blaming the consumers

AMD didn't blame the consumers; AMD stopped giving a shit when consumers wouldn't pick the better product because of Nvidia's marketing strategy. AMD played fair, Nvidia played dirty; Nvidia won. That's why AMD is not in the high end anymore. Finally, GPU margins on the high & ultra end are complete shit. Pouring all your R&D money into that is a waste.

The real money now is enterprise, low/mid-range, and consoles. Everything else is lip service. If you seriously believe that the 1080/Ti tier of GPUs is where all the money is made, you're fucking delusional.

The core configs didn't change; they just couldn't add any more when they started getting slower than CUDA cores 1:1. Not that that's a problem, since they still can't make a 4096-shader part that isn't a power-sucking housefire.

The ISA, you dumbfuck.
Tone down the stupid.

>Except the 3870 & 4870 were better than their Nvidia counterparts. Ageia PhysX ran better on AMD cards. Nvidia sold more. 5870, same thing. 6870 and 7870, AGAIN same thing. R9 290X, SAME FUCKING THING. 390X, same fucking thing.
>Nvidia sold better despite the competition having better GPUs. The True Scotsman has a 30 minute video detailing all that, and showing what happened with the market.
Even if AMD doesn't blame the consumers, I do. This is the future THEY chose by falling for marketing memes. I seriously hope neither AMD nor Intel ever caters to gamer shits again, just so that Nvidia can fuck them in the ass forever.