7nm gpu

>7nm gpu
FUUUUUUUUUUUUCK YASSAMD

>7nm gpu
why does that matter?

Moar Transistors

I hope they pull it off, but I really really doubt it will be everything they say.

A chip of X size always costs Y price.
The more shit you can cram into the same size, the more performance you get for the Y price.

And you can shove a lot more shit into the same space with 7nm compared to the previous 14nm.
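
The density claim above can be put into back-of-the-envelope numbers. A minimal sketch, assuming ideal geometric scaling; in practice the real 14nm-to-7nm density gain is closer to ~2x, and node names are half marketing anyway:

```python
# Back-of-the-envelope: how many more transistors fit in a fixed die
# area at a smaller node. Ideal area scaling goes with the square of
# the feature-size ratio; treat the result as an upper bound, since
# real-world density gains are smaller.

def density_gain(old_nm: float, new_nm: float) -> float:
    """Ideal transistor-density ratio between two nodes."""
    return (old_nm / new_nm) ** 2

gain = density_gain(14, 7)
print(gain)  # 4.0 ideal; ~2x in practice
```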

7nm gaming GPUs are coming, she said it at the end

Yeah, but it's locked to enterprise right now. They did mention working on something for the gaymers, so maybe H2 might be possible.
All I want is SR-IOV though.
>mfw it's never coming to consumers

>>mfw it's never coming to consumers
thank nvidia for that

This is the best analogy I've ever read on the subject. Basically this.

no, the cost of silicon will go down over time as they recoup the costs of building the fabs.

there will always be a minimum cost, but we aren't at minimum costs yet, 7nm will likely be another long 28nm node

>7nm will likely be another long 28nm node
Aren't they working on 5nm already?

Yes, there are those details etc., but the basic idea is "cram more shit into the same space, get more performance per price".

in 2019.
as Navi.

Not in 2018 at all.
No one is releasing gaming GPUs in 2018.

i'm literally in tears bros. what the fuck do we do now. what the FUCK do we do???

Buy whichever fits you best. As always.

It's over bros... As Americans we should unite, maybe we can do something... ANYTHING to help nvidia.

kys is the only option.

You invested in the wrong team.

Don't worry, it's still going to run hotter and worse than an nvidia card from 3 years ago.

7nm, what else? You're not Intel, who's locked to their own fabs.

It will be fucking dogshit, AMD (GPU or CPU) hasn't released a single good product since Tahiti.

You're not funny.

suck jensen's dick

THE MORE YOU BUY THE MORE YOU SAVE

Who actually cares about GPUs? Besides gaming, what do any of you do with your GPU?

call me when this happens.

Next year probably. They already confirmed using IF to have better communication between these Vega. It's only a matter of time.

>tfw there will be literal 225% performance improvements over the 1080 Ti and Vega 64
what a time to be alive.

Wow she, actually looks sorta cute when she's not wearing a blazer. She's got nice cuddly arms.

It won't anytime soon, the cooling would be a nightmare. It's only useful in mobile at low clocks.

We may actually have a GPU that'll run respectable fps on 4K by the decade's end...

At 120hz+ no less

you missed the most crucial part

they said they used IF to connect the cores
this shit is mcm

No, it's like NVLink.

>i'm literally in tears bros.
>what the fuck do we do now
Read comma.guide/vocative-comma/

>2018
>zen 2, whole new architecture at 7nm
>double the performance in gpu's
Are ya ready lads?

Oh wait, this is 2018. 2019 then.

comma.guide/vocative-comma/

NVLink connects cards, AMD specifically said CORES

It's the same shit, you can abstract the separate GPUs in a fully connected NVLink mesh, too.

they already had IF connecting physical cards
they literally talked about cores
i just can't dumb it down more than this

7nm MCM will be noice, 4K 250fps here we come

Maybe H1 2019 inshallah.

>PC show on E3
>Lisa Su on stage with the Lisa Su face

What if consoles get this technology? Wouldn't that be the death of PC gaming, because of console optimization and lower price?

PC is not about performance, it's about freedom.
You can't play Truxton on a PS4.
Also Vulkan pretty much brought console optimization to the PC.

>console optimization
lol

Low latency, close-to-metal access to the video hardware.
It was a problem on PC and it is not anymore.
All console developers can do to get an edge is analyze the games part by part and carefully balance the assets and effects to hold the target 30 fps frame rate and squeeze out a bit extra that way, but the days of "I can do a trillion draw calls and PC can't lol" are over.

Just fucking make 7nm polaris with IF and stack 2/4 of them on a single card. Why is it so hard?

>tfw still have an fx 8350 and r9 280x
How much longer do I wait? If some big shit will happen soon, I'm not upgrading.

>vega now uses 600W
No thanks. The vega series is trash.

>more efficiency means the chip uses more power
Intel logic.

It won't be that much more efficient

But if it's any more efficient it's still using less power? Are you retarded?

>freedom
still relying on GameWorks instead of Vulkan
still Nvidia tries to make an abstraction layer to shit on Vulkan

freedom

>if it's any more efficient it's still using less power
Are you retarded?

What the fuck are you even talking about? Efficiency=lower power consumption.

They demonstrated that they pushed silicon to its limit to improve it

Via-stacked processor dies are as old as the Raspberry Pi. The issue is heat dissipation on the bottom wafer, hence it's never worked on anything more power-hungry than an ARM chip.

tl;dr: it's been possible for years, still not practical, fake news

A tech wonder that the Based mommy (Lisa Su) delivered to us.

It wasn't practical and commercially viable...

...until now, thanks to AMD.

But where is Intel's 10nm?

I can't find any article claiming that AMD (TSMC, GloFo) have solved the heat dissipation problem

Lol like why not just make cubes instead of chips xD?

Oyyyyyy

She's gotten so thiccc
Scaling and yields
Active backplate cooling

35% more performance and 2x the perf/watt.
So we're looking at better than 1080Ti performance at GTX 1070 power consumption?

Question is if they'll have a prosumer model again and how high those'll clock.
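
The "+35% perf at 2x perf/watt" claim is easy to sanity-check. A rough sketch; the Vega 64 295W baseline is an assumption, since AMD didn't state the exact comparison point:

```python
# Rough check of "+35% performance at 2x perf/watt".
# Baseline figures are assumptions: Vega 64 at 295 W board power,
# with its performance normalized to 1.0.
base_power = 295.0   # watts (assumed Vega 64 baseline)
perf_gain = 1.35     # +35% performance
ppw_gain = 2.0       # 2x performance per watt

# power = performance / (performance per watt)
new_power = perf_gain / (ppw_gain * (1.0 / base_power))
print(round(new_power))  # ~199 W if both claims hold together
```

So if both numbers hold at once, you land around 200W, which is between a GTX 1070 (150W) and a 1080 Ti (250W), not quite 1070-class power draw.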

Hard to cool it down.
The same reason why they can't just make bigger CPUs: it's not that silicon is expensive, it's that the bigger it is, the more heat it produces, because (at least in my understanding) of the longer travel of electrons and thus higher resistance.
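
For what it's worth, the first-order story is dynamic switching power, P = a·C·V²·f: more silicon switching means more capacitance and more heat. A toy sketch with made-up numbers (none of these are real GPU figures):

```python
# First-order CMOS dynamic power: P = a * C * V^2 * f
# (activity factor, switched capacitance, voltage, frequency).
# All values below are illustrative, not real GPU figures.

def dynamic_power(alpha: float, cap_f: float, volts: float, freq_hz: float) -> float:
    """Dynamic power in watts for the given switching parameters."""
    return alpha * cap_f * volts ** 2 * freq_hz

p_small = dynamic_power(0.2, 1e-9, 1.0, 1.5e9)  # smaller chip
p_big = dynamic_power(0.2, 2e-9, 1.0, 1.5e9)    # double the switched capacitance
print(p_big / p_small)  # 2.0: twice the silicon switching, twice the heat
```

The V² term is also why lower clocks (and thus lower voltage) are so much easier to cool, as the mobile post above says.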

>Question is if they'll have a prosumer model again
They already confirmed that this will be an enterprise card. You have to wait for Navi.

They confirmed that this particular model unveiled with 32GB HBM2 is for enterprise only, and there is not a Radeon version coming.

That does not confirm there won't be another Frontier Edition for the prosumer market.

Don't get why they wouldn't release a cut-down version with 8/16GB of HBM2 and finally compete with the 1080 Ti

Limited capacity.
If AMD has Vega 20s they can't manage to sell, they'll make them into RX or Frontier Edition.

Don't think it's worth it. You can't just package the same chip with a different amount of RAM; because of HBM they'd need to manufacture a whole different line of chips, and Vega is not selling well enough for that. Unless the crypto shit is still going strong.

worksonmymachine.seal.jpg

Why don't they just stick gddr6 oc on there?

That will be happening with the next gen Nvidia

I guess because the structure of the chip can't use normal memory? It was designed in a way that the memory is on the chip. But it would be interesting to see; even GDDR5X would be a solid choice.

HBM is shit for gaming

christ almighty
could you go back to whatever shithole you came from?

0/10

Fury sucked.
Vega still sucks, and 8GB is fuck all with 4K and 8K VR on the way

what the fuck does any of that have to do with hbm, you fucking /v/edditor

You can have double the GPU on the same area, so it's pretty significant.
The only bad thing is that this confirms there won't be a 12nm Vega revision. The 7nm desktop part is a long way off.

Effective total bandwidth is what matters, and GDDR6 OC is faster than ancient HBM2 shit

after 3nm meaning 6 years.

the flip side is that working with increasingly shrinking components also becomes more difficult and R&D costs go up

and as it turns out engineers are more expensive than sand, which is why prices continue to creep upwards

Buy a Ryzen 2600X, it's the best time to upgrade your CPU.
Don't buy a new GPU though. Miners and RAM fucked up the prices.

2021-2023 would be my guess

AMD will skip 5nm; the next node they are targeting is 3nm.

Ryzen 1600-1800X are better value and very cheap used, I can't even sell mine
Then what, 1nm carbon graphene?

By the time they have a 1080 Ti killer, Nvidia will release the 1180 and then the 1180 Ti. AMD needs to dump that shit and get Navi out there on the 7nm node.

Probably. At 3nm they would pretty much be at one of the smallest silicon-based nodes possible, so they'd need to use graphene or other meme super-technologies to get that sweet performance uplift/power savings.

Does this mean there will finally be a low watt AMD budget GPU that's fine for the occasional vidya or are they going to focus on $600 mining stuff again?

Navi won't do shit. AMD needs a new architecture to replace the outdated GCN garbage.

Then what quantum magik?

Well it has the potential to be very efficient. That said as long as mining is a big thing they will be sold out or very expensive.

smaller gpu dies are actually worse because they can't fit as many transistors/circuit thingies