AMD GPU SUPERPOWER 2020

>Faster than 2080
>250$
>150W
>STOCK
>7nm
How good will this OC?

Attached: fg.png (1920x1080, 756K)

Other urls found in this thread:

m.youtube.com/watch?v=ReYUJXHqESk
twitter.com/SFWRedditGifs

Nah, this isn't gonna happen. Given where the 590 was just positioned, I bet Navi is going to be a Vega replacement while Polaris holds the low end again.


Suppose Navi really is that efficient, though: how did AMD pull this off without redesigning the entire arch? I know Sony had input, but I don't buy it.

I have a 1080 Ti, call me when there's a $300 card that's at least 50% faster

Attached: hmm.jpg (768x768, 67K)

Well, the thing is, if they just die-shrink Vega it would bring the chip down to around 180-250mm², and die size matters most when it comes to price. The 7nm node then allows up to ~30% more performance, so it's entirely possible to hit the numbers in the leak. Nothing is shocking here; the only reason it looks too good to be true is that Nvidia keeps charging more and more each gen. Like it or not, the 1080 was a midrange die sold at high-end prices.

Well, 7nm is pretty much directly half of 14nm, so Vega goes from 486 to 243mm². Slightly more expensive than Polaris, cooler, and way better performance, sure.

But getting +15% performance at 150W is completely unreasonable. Vega's power at higher clocks cut in half is already ~170W, and 7nm is not even close to twice as efficient.
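To make that skepticism concrete, here is a back-of-envelope sketch of the power claim. All figures are assumptions for illustration, not measured data: Vega 64 stock board power of ~295W, and an optimistic ~45% power reduction at the same clocks from the 7nm shrink.

```python
# Rough power check for a die-shrunk Vega (hypothetical numbers).
vega64_power_w = 295     # assumed Vega 64 stock board power
power_scale_7nm = 0.55   # assume an optimistic ~45% power cut at same clocks

shrunk_power = vega64_power_w * power_scale_7nm
print(f"die-shrunk Vega at same clocks: ~{shrunk_power:.0f} W")
# Even under optimistic scaling the shrink alone lands around ~160 W,
# before any clock bump needed to actually beat a 2080.
```

Under these assumptions the shrink alone doesn't reach 150W, let alone 150W with higher performance.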

I don't buy it. I bet Navi is gonna be pretty good, but I think it's going to be a very gaming-oriented design. There's no reason Sony would help design the chip unless they specifically wanted better gaming performance; they'd have no other reason to collaborate.

m.youtube.com/watch?v=ReYUJXHqESk

Let me save you some time: High CU count GCN based GPUs are bottlenecked at the front end of their rendering pipelines due to being limited to processing 4 triangles per clock. This limitation hampers them severely in gaming, which is typically triangle heavy.

If AMD has finally unfucked GCN's front-end design, the new GPUs will be amazing; if not, they will basically be a die shrink of Vega with performance improved by however much the clocks improved.

This is and will be true of any 48+ CU GCN based card in gaming until and unless they revise the front end to handle more than 4 triangles per clock.
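The ceiling described above is easy to put numbers on. A quick sketch, with the 4 triangles/clock limit from the post and an assumed (hypothetical) boost clock:

```python
# Geometry throughput ceiling of GCN's fixed-function front end.
tris_per_clock = 4   # GCN limit, per the post above
clock_ghz = 1.5      # assumed boost clock for a hypothetical 7nm part

peak_gtris_per_s = tris_per_clock * clock_ghz
print(f"peak geometry throughput: {peak_gtris_per_s} Gtris/s")
# Adding more CUs raises shader throughput but leaves this ceiling
# untouched, which is why high-CU GCN cards stall in triangle-heavy games.
```

The only levers on that ceiling are clock speed and the triangles-per-clock figure itself, which is the point about revising the front end.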

>look guys, the site that was literally spamming fake leaks a few weeks ago to farm visits, and was proven to be literally spreading lies, now comes back with these totally legit """"""""""""leaks"""""""""""""" carefully stamped with the website's logo! this must be legit!

Attached: becca_face_unimpressed1a.jpg (539x558, 90K)

Delusion.

Yup...

I'm using two 1080s. Won't have a reason to change until like, Tes7 or something in 2040 at this point.

Tfw been crossfiring two Vapor-X 290Xs since they came out like 5 years ago and still not even tempted to change

fucking based
still rocking my 980

I'm running a single OC'd 7950 (a 280) and it's completely sufficient for me

fuck you i got cucked hard by nvidia and their 770

>GCN
GameCube?
What?

>adm radon 3069
>$1
>10029TB GDDR2000LP
> -10W TDP
se goyes it truv becas gren txt

Attached: 1529956771737.png (1000x976, 467K)

>vnidia rtx10^10^10+90
>$0.02
>99999YB WSXUGDDR10000
>actually generates enough power to run a small city

Attached: 1542144857034.jpg (552x548, 43K)

>tfw bought 2080 a month ago
someone kill me

Attached: 1508364938390.png (434x245, 189K)

OOF

Attached: oof.jpg (840x815, 251K)

What resolution do you all play at with a 1080 / Ti? I would consider it a waste of money if you didn't play at 1440p 144hz or 4K 60hz.

A 1080 can hardly run 1080p 144Hz at frames above 100
A 1080 Ti can't even TOUCH 1440p 144Hz, you would need SLI 2080 Tis for that minimum sadly

based retard poster

>intel will soon launch their discrete gpu line
>tfw Raja is the last nail in intel's coffin
the future looks bright Jow Forums bros

>not Jow Forumsoyim
You dun goofed

Mail it to me before

You need to lurk more to learn some things before you post again.

But the GameCube was amazing man. The quality and tech!

>1440p 144Hz
>you would need SLI 2080 Tis for that minimum sadly

Attached: 1541246183696.jpg (480x247, 12K)

Attached: amd waiting evolved.jpg (1449x1229, 217K)

can't wait for more housefires

> speculations
> How good will this OC?
Kill yourself.

Based 44CU Hawaii.

See this. Just thought I’d show you 1080Ti master race lads

Actually, a 1080 and 1080 Ti can easily do 144Hz at 1080p/1440p if you aren't a retard who sets everything to "Ultra".
Protip: Ultra settings are really just High settings minus optimization and smart LOD. That's why there's almost no difference between High and Ultra in terms of image quality. Ultra is a pure bragging-rights meme that GPU companies use to push frequent high-end GPU sales to idiots who don't know any better.

>7nm is pretty much directly half of 14 so vega goes from 486 to 243mm
That's not how that works.
A process number is a linear feature size, so halving it would quarter the area, not halve it. You're applying a one-dimensional reduction to a two-dimensional area; it's like shrinking a 10x10 square to 5x5 and calling that half the area when it's actually a quarter.
The fabs fudge the numbers here and there, so you shouldn't really go by the rated process name anyway. TSMC did release figures comparing their 7nm to their own previous nodes, though, showing a pretty massive area reduction from their 16nm. And TSMC's 16nm is pretty much the same as GloFo's 14nm: Vega 10 packs 12.5 billion transistors into 487mm² while the TSMC-fabbed GP102 has 11.8 billion in 471mm², which pans out to 25.7 million/mm² for GloFo vs 25.05 million/mm² for TSMC.
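Re-running the density arithmetic from that post (transistor counts and die sizes exactly as quoted there):

```python
# Transistor density from the quoted figures:
# Vega 10 on GloFo 14nm, GP102 fabbed on TSMC 16nm.
vega10_density = 12.5e9 / 487   # transistors per mm^2
gp102_density = 11.8e9 / 471

print(f"GloFo 14nm (Vega 10): {vega10_density / 1e6:.2f} Mtr/mm^2")
print(f"TSMC 16nm (GP102):    {gp102_density / 1e6:.2f} Mtr/mm^2")
# ~25.7 vs ~25.05 Mtr/mm^2 -- close enough to treat the two nodes as
# equivalent when projecting a 7nm shrink from TSMC's published scaling.
```

The two nodes land within ~3% of each other, which is why comparing GloFo-fabbed Vega against TSMC's 7nm scaling claims is a fair apples-to-apples starting point.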

Attached: ;jeg).png (692x275, 23K)

>2019
>no hardware ray tracing
nope

Current consoles have been out for like 5 years and PCs with the same performance are still much more expensive. What went so wrong?

hope this is true.
I have a 1060 right now and it sucks for 1440p.

>200019
>buying first gen meme ever
OH NO NO NO NO NO

Seems to work surprisingly well for a 1.0 product in all respects (hardware and software). For what it's good for, whole-scene reflections, it's a big step up over what's available now. The neural-network upscaling is also neat and works shockingly well.

I use my 1080ti at 1440p144hz and it's comfy. Want to get another 1440p screen in the future.

>1080ti

Enjoying your raytracing oh wait

Lmao.

>2020 still no gay tracing games.
>2020 nvidia release new gpus much faster than the 20xx. Claiming these new cards are the real gay tracing deal.
>2020 all rtx 20xx cucks call suicide hotlines.

1080×1200 each eye @ 90hz

Mining, plus some extra Nvidia price-gouging thanks to AMD having shitty leadership in charge of GPUs

I own AMD hardware you dummy, it's just these """""""""""""""""""leaks"""""""""""""""""""" that are so obviously website ads that piss me off. Mods won't do shit either. WTF

yes goy no kvetching, buy nvidia today

somebody call 911

did you even read my reply?

same.

>>Faster than 2080
The image literally says it's an RTX 2070 competitor

If AMD releases 7nm cards in Q2 2019 will nvidia really wait that long for a new series?

Yeah because new amd cards won't beat high end nvidia.

And remember that the 2080 is TU104, so it's really a midrange die.

They still have the 102 and 100 dies, their most expensive parts, which they can just shift down in price if AMD catches up.

Nvidia is like 2-3 years ahead of amd in GPU power. And they sure as Shit know it.

>Nvidia is like 2-3 years ahead of amd in GPU power. And they sure as Shit know it.

How did things get this bad? Did AMD move all their design resources to the CPU side to catch up with Intel?

>Did AMD move all their design resources to the CPU side to catch up with Intel?
No. Many say that AMD/mommy Su shifted most of RTG to Navi for Sony's PS5; that's why Vega was kind of a turd to begin with. The newest drivers make the best of the hardware, but it's still a turd at performance-per-watt.

So when is the next GPU family (that'll presumably be in common with the PS5) coming?

Is Microsoft sticking with AMD?

>Is Microsoft sticking with AMD?
100%

>(that'll presumably be in common with the PS5)
David Kanter, an analyst, predicts the PS5 will drop in 2020 with an announcement in summer 2019

>So when is the next GPU family coming?
I sure do hope this year, with "small" Navi (40 CU). All we have are unconfirmed AMDoredTV leaks, so we don't know much. What we DO know is that AMD trademarked Vega 2 / Vega II recently.

I'm at 3440x1440 @75hz.
Works great for Skyrim and fallout.
I just wish Todd Howard would get his shit together and purge the zenimax filth from his game studio.

>vega FE has a peak of 11/triangles

Attached: 08d8b25b5f9e3559889499c77ab21721f701ef5e.jpg (650x366, 55K)

where is this from? never seen that before

Exactly this. It's too based.

how does one divide 11 through triangles?

1024*768 csgo on a 144hz 1440p monitor, I get about 400fps

It's pretty well known that primitive shaders literally shit on Nvidia when it comes to workstation-level workloads.

But I doubt AMD has the manpower and the money to do the same in games, hence why they dropped the software support for it early on.

Attached: aHR0cDovL21lZGlhLmJlc3RvZm1pY3JvLmNvbS9SLzgvNjkzMDQ0L29yaWdpbmFsLzAzLUNyZW8tU2NvcmUucG5n.png (711x533, 37K)

Just curious, how long will my somewhat-overclocked GTX 1080 last for 1080p at 144Hz?

Raja never got the software development team to even *begin* trying to implement primitive shaders for gaming. He made up the idea it would be implemented automatically in the drivers as a handwave to cover up the fact he spent three years stroking his dick to accomplish the sum total of overclocking Fury.

The driver team was so goddamn far behind that Vega hard-launched on pre-alpha drivers that had been cobbled together based on Fury drivers in the 3 months prior to hardlaunch.

What a fucking joke. So glad they got rid of Raja.

>Nvidia is like 2-3 years ahead of amd in GPU power.
Only because GCN is front-end bottlenecked in gaming workloads. When AMD finally dedicates the resources necessary to revise GCN's front end, AMD will instantly pick up ~30% performance from fixing GCN, plus whatever they get from a node shrink to 7nm. Would make them competitive instantly, outside of gaytracing.

Yeah, it's GCN holding AMD back at the moment. Everything else in the GPU is just fine.

So hopefully Navi solves some of this. I would love to see the 2080 Ti drop to the $399 price point where it belongs.

It all depends on what you mean by "last".
There are already games where a 1080 can't hold 1080p 144Hz at high/very high settings, but if you don't care about that kind of poorly optimised game and can handle lowering settings, such games should stay rare until a year or two after Nvidia turns 1080-level performance into midrange.

I'd fucking love to see that happen because then I'd actually be able to fully utilize my Freesync monitor, but until then I've got my 1080 Ti.

>Faster than 2080
>Clearly says on the image that it competes with the 1080 and 2070
I'm excited for Navi too but you're a fucking retard.

Attached: 1416539066368.jpg (1199x1200, 180K)