NAVI

Navi confirmed for 2070 tier performance.

youtube.com/watch?v=ckJIy0L7LHY

Attached: Rumour_+AMD+Radeon+RX+Navi+to+Launch+July+7+-+Flagship+GPU+on+Par+With+RTX+2080.jpg (596x224, 38K)

>confirmed
>analysis of '''leak'''
make up your mind

Keep watching.

Is navi Raja's work?

Dog shit then. You can already buy a 1080 or Vega 56 for $250 and get that performance. It's dead on arrival if it doesn't match the 1080 Ti at least.

The full fat Navi might match a 1080 Ti for a lot less, but if you're expecting a 2080 Ti killer, I think you're going to be disappointed.

He says it might be a dev board. So nothing too special other than GDDR6

Who else waiting for 2600X + Navi?

If you're on a budget, then sure why not. You can get that type of performance now but at a price.

You mean 3600X + Navi

Yeah, pretty sure AMD will push out something 20-30% weaker than Nvidia's top offering.

repeat after me 2070 level performance

me. I have a feeling 3000 series will be bad for gaming due to chiplet design

Wasn't this what literally everyone was expecting? 1070/2070 performance for less money and power

the realistic people, yes.

If the price is right it would be a huge win, 2070's are pretty powerful.

Why is AMD always playing catch-up with GPUs ever since the first Titan?

Diversity my friend

1080 performance for $300

They might make it 5-10% faster than a 2070 just to pressure nVidia

navi?
is this another lain thread?

Most evidence so far.

If the PCB componentry is anything close to what is speculated here, it's going to HAVE to run a blower to come even close to the expected $250.
That's just a somewhat educated guess, but 70A power stages are really really good stuff for a mid-range card. Or maybe the die will have to be Polaris-size or smaller to make the price point work.
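For reference, that per-stage rating turns into board-feeding capacity like this; the phase count and derating factor below are made-up illustration values, only the 70A stage rating comes from the post:

```python
# Hypothetical VRM headroom math; the phase count and derating margin are
# assumptions for illustration, only the 70 A per-stage rating is from the post.
def vrm_capacity_watts(phases, amps_per_stage=70, vcore=1.0, derate=0.8):
    """Continuous power the core VRM could feed, with a derating margin."""
    return phases * amps_per_stage * vcore * derate

# e.g. a 6-phase layout of 70 A stages, derated to 80%:
print(vrm_capacity_watts(6))  # 336.0 W at 1.0 V
```

Even a modest phase count of these stages comfortably covers a mid-range power target, which is why they look like overkill for a $250 card.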

>hurr first gen isn't killing nvidiots ;_;

Gen 2 navi will kill nvidia and their shills once and for all.

It's going to sound like a jet engine... I'm already turned off by this, and no high-performing card these days is going to have a blower on it at 300 watts...

Jesus dude.....

I personally don't care as long as the PCB is great. Coolers are a modular part and ridiculously good aftermarket ones that can be reused across multiple cards are $60.

Tldw?

256-bit bus and two 8-pins mean it'll likely be 2070-80 territory for 250-300W. And the mounting holes correspond to a blower cooler. Die size is impossible to tell for now.
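A minimal sketch of where that 250-300W estimate comes from, using the standard PCIe connector ratings (75W from the slot, 150W per 8-pin, 75W per 6-pin):

```python
# Back-of-the-envelope board power budget from the PCIe spec ratings:
# the x16 slot supplies up to 75 W, a 6-pin up to 75 W, an 8-pin up to 150 W.
SLOT_W = 75
CONN_W = {"6pin": 75, "8pin": 150}

def max_board_power(connectors):
    """Upper bound on sustained board power for a given connector loadout."""
    return SLOT_W + sum(CONN_W[c] for c in connectors)

print(max_board_power(["8pin", "8pin"]))  # 375
```

Two 8-pins give a 375W ceiling, so a 250-300W real-world draw sits comfortably inside it; nobody specs connectors they only barely need.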

They wish, but they can't. They're too far behind; I think it will be at best equal to a 2070 and will cost $400.

>the HD7970 is great, wait for the new drivers and it will age like fine wine just wait!
>the R9 290 is great, 95°C is totally good. Wait for the new drivers, it will age like fine wine just wait!
>the R9 Fury X and R9 Fury are great you can even unlock some cores in the Fury to get Fury X performance. They will age like fine wine just wait!
>Vega is great, don't worry about the hotspot temperature but undervolt the card and run multiple tests to make it sure it doesn't run too hot and underclocks. Also wait for the new drivers. It will age like fine wine just wait!
>Radeon VII isn't even a gaming card! But wait for the drivers for a fair comparison, just wait!

>Navi

By the release of Vega I was already done with the train ride. Fuck off.

It's like you hate fun.

Dude fuck off, gen 1 Navi will only be the introduction of the new architecture. It obviously won't compete with the high-end stuff; that's why you wait for gen 2 Navi.

Ty

>Navi confirmed for 2070 tier performance.
You mean exactly what AdoredTV claimed months ago? Why are people shocked that a chip blatantly designed to be a mid range chip gives (what amounts to) next gen mid range performance?

The VII will remain AMD's top performer for a while yet.

I lost money selling those cards to get their Nvidia counterpart.
GTX 980 being faster than Fury X or 290 heating my room in October is NOT fun

I'll sure be "waiting"

>GTX 980 being faster than Fury X

Attached: 1446667091224.png (196x171, 8K)

Yeah OCed 980 faster in DX11 never happened. Sure.

>gen 1 navi will only be the introduction of the new architecture
navi is still gcn

>Why are people shocked that a chip blatantly designed to be a mid range chip gives (what amounts to) next gen mid range performance?
because turdring was a disappointment

>buying AMD GPU in 2019 when they're still using GCN
Might as well wait until 2020.

Which could make the bus width slightly worrying given the expectations, assuming nothing was done to alleviate GCN's limitations on the memory side.

Wow, that'll be pretty sad. We've had this tier of performance since 2016, and it cost maybe $800 then. 3 years later and the same performance costs $400. Price/perf is moving at a snail's pace.

I'm not really surprised at this, since we already know that nothing Navi is bringing will pass the Radeon VII, which will still be AMD's top GPU. If only AMD could get their GPU lineup on par with their CPUs.

Is Pascal one of the best series of cards ever? By the time these are out it'll be almost 4 years later and new hardware is coming out that just competes with it at lower price.

It's reasonable to expect whatever is going to come after GCN will have been designed with Zen in mind. I don't think they can keep their graphics department underfunded now that Google Stadia on top of the existing semi-custom deals relies on it.

We already know that since it's the same shitty GCN.

GCN is trash because it does 4 triangles/clock. Nvidia does 6.
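Those per-clock setup rates only become comparable once you multiply by clock speed; the clocks below are illustrative values, not actual specs of either card:

```python
# Peak geometry rate = triangles set up per clock * clock speed.
# The 4/clock (GCN) and 6/clock (Nvidia) figures are from the post above;
# the clock speeds here are illustrative assumptions, not real specs.
def tri_throughput(tris_per_clock, clock_ghz):
    """Peak geometry rate in billions of triangles per second."""
    return tris_per_clock * clock_ghz

gcn = tri_throughput(4, 1.5)  # GCN at an assumed 1.5 GHz
nv = tri_throughput(6, 1.8)   # Nvidia at an assumed 1.8 GHz
print(gcn, nv)  # GCN ~6.0 Gtri/s vs Nvidia ~10.8 Gtri/s
```

The clock deficit compounds the per-clock deficit, which is why the gap in geometry-heavy workloads is wider than "4 vs 6" suggests.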

yes

Nvidia shills survived the GTX 4XX/5XX vs HD 5XXX/6XXX era. Don't think you're killing shit, especially not coming from that far behind, and with Ampere potentially delivering something as powerful as a 2080 Ti with a TDP in the ballpark of 150W.

Dude, the 7970 was the last card AMD made that was better than Nvidia in every way.
Tahiti was gold and it still runs every game at med-high settings, meanwhile the 670 aged like shit.
Even at the highest end, 7990 vs 690 was a close competition, and after that Nvidia made the Titan and completely destroyed AMD.

When that happens really depends on when they could have increased funding to the RTG. As we all know, a new arch takes four years if it's fresh, and I doubt it's that fresh, as there's still a lot of IP in GCN; it just needs a big overhaul instead of going really new. Just before the Polaris announcement the stock was ~$2, and I expect getting any funding anywhere was a real difficulty, and we have Raja's complaints that Vega resources were shifted forwards to Navi and Sony. So it looks like Navi is another safe bet on slightly better GCN, but Vega 20 was the 7nm pipecleaner, and hopefully Navi is optimised for the process at the very least.

>Titan and completely destroyed AMD.
Except everyone knows the 290x as the titan killer. So much so Nvidia had to release the full die in the form of the titan black which they were originally only going to keep for the quadros.

>2 8-pin power connectors

OH NO NO NO NO NO NO NO NO NO NO NO

AHAHAHAHAHAHAHAHAHAHAHAHAHAHA

AYYMD HOUSEFIRES

Navi is the last iteration of the GCN uarch, you moron, we've known this since mid-2018

how is an 8-core CCX worse for gaming than two 4-core CCXs? Cache is also doubled, so DDR latency shouldn't matter.

None of this is rocket science.
They either give you Vega performance at 50% lower power consumption.
OR they clock the same chip 30% higher and give you 30% more performance than Vega at the same power consumption.

Same memory bandwidth as Vega is plausible since there is supposed to be improved color compression in Navi.

Attached: 172884-mi60-vs-mi25-chart-1260x709_0.jpg (992x558, 82K)
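The clock/power tradeoff above can be sketched with the usual dynamic-power rule of thumb; the cubic exponent is a rough modelling assumption (P ∝ f·V², with V scaling roughly with f near the top of the curve), not an AMD figure:

```python
# Rough dynamic-power model: P ~ f * V^2, and voltage tends to rise roughly
# with frequency near the top of the V/f curve, so power grows ~cubically
# with clock. The exponent is a modelling assumption, not a measured value.
def relative_power(clock_scale, exponent=3.0):
    """Power relative to baseline when clocks are scaled by clock_scale."""
    return clock_scale ** exponent

# Option A: same clocks, pocket the node's efficiency gain as lower power.
# Option B: clock ~30% higher and hand much of the node gain back as heat.
print(relative_power(1.0))  # 1.0 (baseline)
print(relative_power(1.3))  # ~2.2x the dynamic power before node savings
```

That roughly cubic penalty is why "+30% clocks at the same power" and "same clocks at -50% power" can both be true descriptions of the same 7nm chip.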

>Confirmed
>It's just muh asshole said so

Attached: Laughing Whore.jpg (762x900, 161K)

They better have primitive shader support and variable rate shading on day 1 too, to save memory bandwidth. There's just so much shit Nvidia is doing to not be bandwidth constrained that AMD is not doing right now on Vega 7nm, and that's been the downfall of where GCN is right now.

Why is GCN still a thing?

I wish they'd take a middle ground approach: 75% power consumption with +15% perf.
Those who want that last +15% can raise the power limit 30% and OC their cards.

'cuz for 1+1 it is lightning fast. It's not so suited for rendering big anime tiddies.

>ampere
>an in-between because we have no competition architecture will be that good
sure thing buddy

r9 290 was amazing price/perf as well but AMD had to fuck up by having the worst reference cooler. And rightfully so, it started the housefire/leaf blower meme

because it was that good to begin with

>what is underclocking

I know it's great for compute, but after all this time it seems like the wrong approach at the wrong time. Almost feels like Bulldozer, as in where they scaled in cores instead of IPC.

It doesn't matter what you have to manually do. If it's loud and hot out of the box, reviewers and dumb gaymers would meme the hell out of it

I wouldn't say started it, but yes it was the most recent.

GCN is on its way out and this is its last hurrah.

Maybe they get another couple % improvement from the architecture but they are not going to magically find tons of spare performance in GCN one generation before replacing it.

ever since the Fury the cards have an insane amount of voltage and high clocks out of the box, just to compete
thank god for that

It's not the wrong approach for servers and datacentres - it's why both AMD and Nvidia have number-crunching monsters for that market. The difference is Nvidia had the funds to split their GPU architectures up and make one focused on pushing pixels - Maxwell is the first generation of this, which is why it was such a leap. AMD hasn't been able to do that, so to remain competitive (which they do at many levels despite Jow Forums's belief otherwise) they just hotclock their server chips and brute force performance.

GCN is effectively a V12 of gpu architectures - the more you slam power and clocks into it the faster it gets but boy does it drink fuel.

RTG just hasn't had the funding to make the improvements it needs. Look at Nvidia: they're still on basically Fermi, but each generation they fix something - taking out the hot hardware schedulers for Kepler, taking out any FP64 hardware and improving resource allocation/efficiency in Maxwell, Pascal had some tiny back-end stuff but was mainly a port to 16nm, and Turing seems less like gaming cards as they're focused on compute and AI but re-added decent scheduling support, with Volta being an FP64 monster. AMD's GCN, on the other hand, has only ever been minor changes from Tahiti - Hawaii, then Fiji cutting FP64, Polaris being a 14nm port for three years, and Vega being undersupported on the software side or broken hardware that was never fixed in a respin. The numbers really show in the comparison of the amount of dies designed and released: AMD launches about two a year and respins them (Polaris 10-20-30), while Nvidia managed to launch a full range on 16nm quickly and has pretty much done the same on 12nm, from ~800mm² parts down to whatever the 1650 is sized at.

Solid post user good shit

>NAVI CONFIRMED

youtube title
>BASELESS SPECULATION

Yeah but at this point, they need another Zen-like arch ground up redesign to compete in graphics and I don't know if they can pull it off.

github.com/llvm-mirror/llvm/commit/f43d543c4508e6857afadcf9f2381b4ea3380939?at=anzwix

> EF_AMDGPU_MACH_AMDGCN_LAST = EF_AMDGPU_MACH_AMDGCN_GFX1010

>GCN

INTO THE TRASH IT GOES

>pascal had some tiny back end stuff but was mainly porting to 16nm
Pascal itself was a major revision as well. They focused on patching up maxwell's poor async compute

>laughs in thermie

Blower in 2019? Seriously hope AMD doesn't do it. The sight of a blower cooler disgusts me. With two eight-pins, even worse.
Pretty much agree with Buildzoid that people will meme it as hot and loud and then that will be the end of it.

That's why it flopped so fucking hard in World War Z?
Nvidia simply can't do async at any meaningful level

No one cares about async, it's irrelevant

AYYMD IS SLOWEST GPU WITH HIGHEST POWER CONSUMPTION, FACT

When will they release these cards?

In Summer

Async compute is quite fascinating when you compare both AMD's and Nvidia's approach to it to attain throughput. A major issue is not many developers really tune their command streaming for GCN's strengths - it prefers a constant stream of instructions and the internal scheduler will handle it rather well (the enormous shader array here works as intended) whereas Nvidia prefers one big fuckoff batch of commands and their software scheduler will distribute it effectively. Anything tuned in Nvidia's favour will massively overwhelm GCN's scheduler and cause entire pipeline stalls.

In essence, Nvidia is fire-and-forget, whereas AMD requires a constant stream.
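A toy sketch of the two submission styles described above; nothing here is a real driver API, it just contrasts one big fire-and-forget batch against a steady trickle of small submissions:

```python
# Toy contrast of the two command-submission styles described above.
# The queue shapes and batch size are made-up illustration values,
# not any real driver or graphics API.
from collections import deque

def submit_batched(commands, batch_size=64):
    """Nvidia-style: group work into large batches, then fire and forget."""
    q = deque()
    for i in range(0, len(commands), batch_size):
        q.append(commands[i:i + batch_size])  # one large submission
    return q

def submit_streamed(commands):
    """GCN-style: a constant trickle so the hardware scheduler never starves."""
    q = deque()
    for cmd in commands:
        q.append([cmd])  # one small submission per command
    return q

work = list(range(256))
print(len(submit_batched(work)))   # 4 large submissions
print(len(submit_streamed(work)))  # 256 small submissions
```

Same total work either way; the difference is whether the scheduling burden lands in one big lump (which GCN's internal scheduler chokes on) or is spread out over time (which Nvidia's software scheduler doesn't need).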

>AMD is an autistic lainfag weeb company

based ayymd poster always BTFOing AMDrones
Intel/Nvidia4lyfe

Vega and Navi are like Bulldozer and Piledriver: Navi is going to be a bit better but will still suck major dicks, and we will need to wait 5 years for AMD to make a proper GPU

It was a huge step up from Maxwell's async compute, but it still fell short of GCN until Turing came about.

topkek, by then intel would have bought rtg and release an actual gaymen gpu

It's so the PS5 will be cheap to make and have easy backwards compatibility. Remember, Radeon doesn't design desktop GPUs first; they make custom architectures for a client and then pretend 3 months later that they have a high-end gaming GPU.

This will likely be Polaris 2.0. It will raise the bar to mitigate Nvidia jewry, but ultimately will not scale up to a big die well. The RX 590 was consuming ~240 W full load and RX 480 was ~150 W at launch, and like Buildzoid said, they're not launching a 3 slot, triple fan card with a 300 W tdp. 7nm should be slightly better clocks than 12/14, but it's still GCN so don't expect miracles.

Man, I've got a 1080 ti, but I've run radeon cards before. Anyone care to explain why people shill for graphics cards harder than console peasants shill for their brick of choice? Do some of you literally work for nvidia or amd?

Attached: 1555214969248.jpg (819x1428, 239K)

Are there any rumors about the number of compute units in the PS5, Navi 10 or Navi 20?

sunk-cost fallacy, these GPUs cost even more than consoles so people shill even harder to justify their purchase

>gen 1 navi will only be the introduction of the new architecture
It's only going to be a "fixed" version of Vega meaning it will be mediocre unless priced right.

>wait for vega
>wait for navi
>wait for 2nd gen navi
as always the story continues

>pic
First one is telling the truth. Got a source for which book that one and the dream one are from?

Even their 3 fan radeon 7 is loud. Just compare it to the 2080.
youtube.com/watch?v=5eMl4j_lkTg&feature=youtu.be&t=401

No, Nvidia's about to get BTFO, and it's their own fault for trying to force their own implementations instead of adopting better standards. No doubt their brass is markedly corrupt, given their scummy business practices. It's only a matter of time before they run it into the ground trying to satisfy their shareholders' unsustainable growth expectations.