NAVI WHEN BROS? IM TIRED OF WAITING

Attached: 04323.png (342x356, 136K)

Just wait

Navi will be good, but Navi 2 will truly end Nvidia and Intel. Just be patient.

You mean Navi 3+

Waiting for what? More midrange housefires at less-than-ideal prices because GCN dies are fuckhuge?

CES launch. 1080 performance for 200 bux. Just you... fucking wait.

I love how people are pretending this is something amazing.
Please understand how fucking old the 1080 is.

I'm tired of waiting. It's always "wait for the next iteration of an AMD product" instead of buying their shit now. When will this meme end???

I agree. Also, absolute high-end GPUs shouldn't exceed $500.

If you don't game at 4K/60 fps, get yourself a Vega 64 from Newegg at $399 with 3 free games. Stop being dense; AMD has always delivered a tier below the highest end.

>early to mid 2016 performance in early to mid 2019
Yes, very exciting. I wish AMD had a product for me. I've had GTX 1080 performance for ~2.5 years, I want something much faster after such a long time.

Same.
GPU tech has basically plateaued, tbqh. We won't see anything until 2.5D/3D stacking and MCM, and that's a long way off; it requires a ground-up architecture and driver redesign from scratch.

>CES launch
Announcement. I expect a CES announcement; I'll eat my shorts if it gets a CES launch. I give Adored's leak a 50/50 chance of being true, but a CES launch is completely out of the question. They just won't be ready yet. I expect a May launch.

They already ran a Ryzen 3 contest, so the possibility of a CES release is fairly high.

It was taken down as soon as AMD became aware of it.

The AdoredTV leaks were confirmed fake by several people.

Nobody confirmed them fake. People just disputed their plausibility.

It hasn't plateaued; a 2080 Ti is nearly 2x faster in traditional rendering than a GTX 1080. I (and I assume you too) just don't want to spend the asking price.

Nobody wants to spend 2x for a beta-test GPU.
It's over $1k USD, ffs, and everything under it is useless.

Sure, but that doesn't mean the tech has stagnated. The performance is right there; people just don't want to spend the money for it, but the tech obviously exists.

Radeon has much better support on Linux. I’m never buying Nvidia again.

and yet a 2070 is still $500

>Gpu tech has basically plateaued
No it hasn't. AMD's GCN architecture is front-end bottlenecked by being limited to four triangles per clock, which bottlenecks the shit out of high-CU-count GCN GPUs. If AMD were able to revise that front end and eliminate the bottleneck, there's around 30% of a performance gain there for the taking, on top of clock gains from moving to 7nm.

Nvidia has hit the upper limit of chip size on their existing node, but could probably start going up in die size again at 7nm once they can use it.
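
Back-of-the-envelope, if you want to see what that front-end cap means (just a sketch; the 4 tris/clock figure is from the post above, while the 1.5 GHz clock and the 6/clock comparison are made-up illustrative numbers, not real specs):

# Peak geometry throughput of a fixed-function front end.
# 4 tris/clock is the GCN limit cited above; the 1.5 GHz clock and the
# 6/clock "revised" front end are assumed illustrative values.
def peak_triangle_rate(tris_per_clock: int, clock_ghz: float) -> float:
    """Peak primitives/second = primitives/clock * clocks/second."""
    return tris_per_clock * clock_ghz * 1e9

gcn = peak_triangle_rate(4, 1.5)
revised = peak_triangle_rate(6, 1.5)
print(f"4/clk @ 1.5 GHz: {gcn / 1e9:.1f} Gtris/s")
print(f"6/clk @ 1.5 GHz: {revised / 1e9:.1f} Gtris/s ({revised / gcn - 1:.0%} more)")

The point being: adding more CUs doesn't move this number at all. Only a wider front end or higher clocks do, which is why the big GCN parts choke.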

It's only like 20-30% faster than a 1080 Ti, which is pretty garbage for being 100% more expensive.

>tfw Navi is still just a minor evolution of GCN, like Excavator before Zen.
You're not just waiting for Navi; you need to wait for post-Navi.

I know, but price and marketing strategy have nothing to do with the technology plateauing. In 2013 I bought a 290X, which was pretty much the fastest card on the market at the time. In 2016 I bought a GTX 1080, once again the fastest on the market and about 2x as fast as the 290X. Now in 2018 I could buy a 2080 Ti, which is yet again ~2x faster than 2016's GTX 1080. Performance doubled in 2-3 years in both cases; GPU tech has not plateaued at all, and the performance increase is still there.

The problem is price, which is entirely unrelated to this issue. That's about market strategy and a lack of competition in the high-end segment permitting NVIDIA to overprice the shit out of their products.
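
Quick compounding math on that cadence, for what it's worth (the ~2x-per-step figures are from the post above; the year spans are approximate, and the script is only illustrative):

# Compound annual growth implied by "performance doubles every 2-3 years".
# Speedups and year spans are taken from the post above (approximate).
steps = [
    ("290X -> GTX 1080", 2.0, 2016 - 2013),
    ("GTX 1080 -> 2080 Ti", 2.0, 2018 - 2016),
]
for name, speedup, years in steps:
    cagr = speedup ** (1 / years) - 1
    print(f"{name}: {speedup:.0f}x over {years} yr = {cagr:.0%}/yr")
# ~26%/yr and ~41%/yr respectively. Slower than the old cadence,
# but nothing like a plateau.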

You have to be some kind of idiot.

>Now in 2018 I could buy a 2080 Ti, which is yet again ~2x faster compared to 2016's GTX 1080
Is it?

Yeah, it pretty much is. Not quite 2x on average but very close.

Attached: relative-performance_3840-2160.png (500x930, 51K)

May, as this guy said.

So 1080 performance for 200 dollars instead of 500 dollars isn't a good thing?

>start going up in die size again at 7nm once they can use it.
I really fucking doubt it's feasible to make huge dies on 7nm yet. The 2080 Ti's die is almost 2x bigger than Vega 64's; imagine how much it would cost to make a full GV100/GT100-class die on 7nm.
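
For a feel of why huge dies on a fresh node hurt so much, here's the textbook Poisson yield model (a sketch only; the die areas are approximate public figures, and the defect density is a made-up illustrative value, not a real TSMC number):

import math

# Poisson yield model: fraction of defect-free dies = exp(-D0 * area).
D0 = 0.002  # defects per mm^2 -- assumed; new nodes start high and improve

def poisson_yield(area_mm2: float) -> float:
    return math.exp(-D0 * area_mm2)

for name, area in [("Vega 64, ~495 mm^2", 495),
                   ("2080 Ti (TU102), ~754 mm^2", 754)]:
    y = poisson_yield(area)
    print(f"{name}: yield ~{y:.0%}, cost per good die ~{1 / y:.1f}x")
# Yield falls exponentially with area, so the bigger die costs
# disproportionately more per good chip -- and worse on an immature node.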

>4k meme resolution
meanwhile, at the resolution 99.99% of people actually use

>Not quite 2x on average but very close.
>it's not even 50% better
Did you fail primary school math?

Attached: relative-performance_1920-1080.png (500x930, 49K)

No, but you certainly did.

Indeed, we should evaluate graphics card performance at 1080p so games hit CPU bottlenecks, that will surely produce correct results which reflect GPU performance.
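
The two charts don't actually contradict each other, by the way. Here's how relative-performance bars convert to speedups (a sketch; the percentages are illustrative stand-ins for the bars, not exact readings from the images):

# Converting "relative performance" bars into speedup factors.
# Percentages are illustrative stand-ins for the chart values
# (GTX 1080 as a % of the 2080 Ti), not exact readings.
def speedup(slower_pct_of_faster: float) -> float:
    """If card A scores X% of card B, then B is (100/X)x faster."""
    return 100.0 / slower_pct_of_faster

print(f"4K:    1080 at ~52% -> 2080 Ti is ~{speedup(52):.2f}x faster")
print(f"1080p: 1080 at ~69% -> 2080 Ti is ~{speedup(69):.2f}x faster")
# GPU-bound at 4K the gap is ~1.9x; at 1080p CPU limits compress it
# to ~1.45x, so both readings of the "same" data can look right.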

>No, but you certainly did.
Are you unironically telling me...

"Just wait" has been the AMD slogan since the FX days. They were only great when the 7970 released with more VRAM than Nvidia, and back when Phenom was competitive with Intel. AMD is cuck-tier, and what's cucking you guys is time itself.

Attached: F394E299-FC7A-4258-894B-F5B73FB13414.gif (276x260, 1.08M)

>don't evaluate things at resolutions that people actually use
I bet you're the same guy who thinks 240p CPU "benchmarks" are valid.

Attached: 1541158976326.jpg (800x420, 47K)

$1k for barely a 30% uplift over a 1080 Ti / liquid-cooled Vega 64 isn't worth a 2x price increase.
Even if Turing dropped back to sane price levels on 7nm or a mature 12nm node, its value is dubious because of how slow it is at actually ray tracing in games. RTX is slow as balls at reflections, shadows, AO, lighting, and God knows what else.
The only thing these glorified tech-demo cards are actually good at is DXR/Vulkan-RT-accelerated workstation programs (Photoshop, Vegas, Blender, RenderMan, SolidWorks, etc.) that lean heavily on their compute power, but those don't even support the 6-month-old cards yet, so at the moment, yes, they are useless, especially for gaming.
The only games they seem half decent in are async-compute, heavily shaded titles, which are rare right now, especially since most games are ironically optimized for GCN console hardware first, then Nvidia Pascal... not Turing.
This will change in the next couple of years, obviously, but right now these cards suck. I'm just gonna wait until 7nm drops and Nvidia has some competition from AMD again.
Pascal was good, but no way in fuck am I going anywhere near RTX while it's this immature. The cards offer no value: the 2060 and 2070 are shit, and the 2080 is within a stone's throw of a heavily overclocked 1080 Ti.
Not to mention anything below a 2080 Ti is useless at ray-traced effects in games unless you want to put up with 1080p at 30-60 fps. Fuck that, no thanks.
It's also bottlenecked by the DX11 and DX12 APIs and their implementations in games; Vulkan, Mantle, and consoles got around that, as did some good DX11/12 ports, but that's rare.
It won't be feasible either: as we've seen, yields on Turing are so bad that even RTX Titans and 2080 Tis are shipping bricked. It's Fermi all over again.
Nice shilling; here's your (you).

Nvidia, on either side of Pascal, has made hot garbage.
Intel has made absolute shit since the Lake era.
So what if it took AMD 5 years to unfuck Bulldozer and make Zen? It's still better than Intel using the same arch, unchanged, for a decade.

You spent all that time writing that and no one cares. You must be fun at parties. Oh wait, you're still a NEET virgin. Bye.

Still got a (you)

Attached: pepetodd1105006.jpg (459x271, 67K)

They were confirmed fake by no one. Benchmarking websites saying they disbelieve TSMC and AMD projections is not proof.

Attached: 000.png (540x641, 266K)

Y-y... yes they are.
Armchair engineers from websites that didn't even know what a VRM was until 6 months ago are surely more knowledgeable than the people actually building the hardware!