AMD GREAT AGAIN

wccftech.com/amd-confirms-new-7nm-radeon-graphics-cards-launching-in-2018
>x1.25 performance of top tier Turing RTX, but much smaller and cheaper
>meanwhile, Inturd's first GayPoo is going to be 16nm and literally no one wants nGreedia's PEDOFLOPPING Illuminati garbage

AMD saves gamers and professionals alike all over again!

Attached: HMMM.jpg (921x597, 329K)

I'll wait till I see benchmarks. AMD hasn't released anything good since the 7970.

>AMD hasn't released anything good since the 7970
7970 was garbage, what the fuck are you smoking you fucking retard. 270X was good, 280X was good, 380X was good, RX 480/580 is good.

As always, wait for benchmarks. Pre-release material is as reliable as Alex Jones.

>intlel's first gpu will be on 16nm while everyone else will be on 7nm

Attached: 1494016573545.jpg (332x298, 21K)

low quality post

>7970 was garbage, what the fuck are you smoking you fucking retard. 270X was good, 280X was good, 380X was good
Wow. They're the same shit.
P.S. 7870XT owner.

>Intlel's first GPU will be
>will be
Uh-uh-uh, kid.
You were clearly born in the 2000s.

Attached: 1998 i740 Intel discrete GPU.jpg (700x452, 85K)

After Vega, we all have reasons to only believe it when we see it.

The 270X is a better, very polished version of the HD 7870. It's not EXACTLY the same, as it clocks higher and uses better tweaks/optimizations. The original HD 7870 was basically a prototype, lousy garbage (the entire 7xxx line was); the R9 270X, however, is a very capable and solid card.

Attached: 5665465465465.png (1278x770, 569K)

>mommy has not stopped triggering corelets
and you neckbeards said having women in tech was a bad thing

>he doesn't blindly follow [Brand A], he must be a [Brand B] shill, there's no other explanation

1993 (゚ペ)

i love making fun of corelets, it's so easy to trigger them

The tech WAS invented by a woman, for fuck's sake.

>Vega 7nm
>AMD saves gamers
Nope
Vega 7nm will be a datacenter/professional GPU, no gayming
Navi will not be released for a while and focus on consoles
No one will save you

2300MHz core clock confirmed

|
|>
|
|
|

>wccftech.com/amd-confirms-new-7nm-radeon-graphics-cards-launching-in-2018
what the fuck
I only just got my vega

2450MHz core clock confirmed

AMD, make a 10-12GB GDDR6 version for consumers.

Can 7nm Navi become good or will it be shit because it will still be a GCN chip?

>GCN
>shit
You're drunk on kool-aid, faggot. There was never anything wrong with GCN, it's an amazing architecture.

Bingo. My 290x from 2013 is still very capable. Dirt cheap when I bought it too

>7nm in 2018

Attached: 1499963016439.jpg (691x771, 112K)

But I thought GCN can't scale anymore and it's basically a dead end

That's not the point. He said GCN is shit in general, and that is an utter fucking lie.

>he is posting in a pro AMD thread and gets triggered by an image

280x is a 7970 you mong.

>currytech

ill believe it when i see it

Mommy does it again

Attached: 1509289702964.jpg (778x512, 45K)

...

GCN has a front end geometry bottleneck that limits high end cards because they can only process 4 triangles per clock, preventing high end GCN cards from ever being able to saturate the rendering pipeline in gaming.

That being said, if 7nm Vega is hitting 2400MHz clocks, a ~50% increase over the existing Vega 64, then a 7nm Vega gaming card with 56-64 CUs should theoretically be ~40% faster than Vega 64 is now, since GCN's bottleneck is per clock.

GCN itself really only needs to be changed enough to enable processing more like 8 triangles per clock than four. Just enabling GCN to saturate the rendering pipeline would bring large performance gains outside of process node advancements.
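To put rough numbers on that, a back-of-the-envelope sketch, assuming the front end really is the hard cap (1546MHz is Vega 64's stock boost clock; 2400MHz is just the rumored figure from this thread):

# Rough front-end triangle throughput under GCN's 4-tris-per-clock limit.
vega64_mhz = 1546        # Vega 64 stock boost clock
vega_7nm_mhz = 2400      # rumored 7nm clock (unconfirmed)
tris_per_clock = 4       # GCN front-end geometry limit

v64 = vega64_mhz * 1e6 * tris_per_clock
v7nm = vega_7nm_mhz * 1e6 * tris_per_clock
print(f"Vega 64:  {v64 / 1e9:.2f} Gtris/s")   # ~6.18
print(f"7nm Vega: {v7nm / 1e9:.2f} Gtris/s")  # ~9.60
print(f"Ratio:    {v7nm / v64:.2f}x")         # ~1.55x

If geometry really is the wall, throughput tracks clock almost linearly, which is where the ~40-50% estimate comes from before real-world losses.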

This might be my upgrade from my GTX 980 Ti, then. The Vega 64 was tempting, but too little performance for its price, especially given Nvidia's recent price cuts on their GTX 1xxx series. Really, really hoping that this new line of GPUs seriously performs. Get me a 7nm GPU and I'll sit on a Ryzen/AMD GPU system until Ryzen gen 3 hits.

We saw what happened when Intel was finally properly challenged in the CPU market by AMD. I would love to see how Nvidia reacts and what kind of product they pump out if AMD becomes a proper threat.

It's not, you dumb fuck. It's better. They fixed the massive frame-pacing and timing issues which the original 7970 had. The 280X is actually a usable product, unlike the trash that is the 7xxx line.

I thought they were going to fix this problem with primitive shaders, but the Vega design was too broken to implement them via drivers. Did I understand it wrong?

Literally from Pajeet the tech news outlet.

>I only just got my vega

I hope you weren't retarded enough to buy it because of how hard their CPUs are being shilled.

JUST WAIT(TM)

VEGA WILL SHIT ALL OVER PASCAL!
I'm not falling for AMD's Jewish GPU tricks again.

Attached: 1435117762555.jpg (645x773, 56K)

african made devices

Do you niggers believe in this shit like the Vega founders edition 1teraflopgiganiggas bogaloo we had a while ago?

Some people claim that Vega hardware was broken and that's why primitive shaders weren't implemented, but my armchair assessment is simply that Raja was so far behind schedule that the software team never got the chance to even try to implement that feature in software.

From my admittedly limited understanding, primitive shaders were a software solution that would have bypassed much of Vega's fixed-function front end and done the work on the CUs, and as far as I understand, this solution doesn't require anything special hardware-wise vs standard GCN.

My understanding is that they canned working on prim shaders for Vega because it would have taken so long to dev it in software that Vega would have been supplanted by a new GPU generation before they finished implementing it.

I shouldn't have just bought a Ryzen CPU when it was released; I should also have bought some AMD shares.

We seriously have no clue. Vega got some fantastic features which will be in next-gen consoles; I guess that's the reason even Nvidia went for FP16 in RTX.

that's intel. AMD is BIGWangtech now.

I mean, they make good CPUs now, but we'll have to see if they can actually compete at making top-tier GPUs.

Stop samefagging that pasta you faggot.

> >x1.25 performance of top tier Turing RTX, but much smaller and cheaper
> But no consumer cards
I mean, lol.

???

From Raja:
>The new geometry pipeline in Vega was designed for higher throughput per clock cycle, through a combination of better load balancing between the engines and new primitive shaders for faster culling. As a programmer you shouldn't need to do anything special to take advantage of these improvements, but you're most likely to see the effects when rendering geometrically complex scenes that can really push the capabilities of the hardware.
This was before the launch.

If you fall for the eternal waitfag line of logic, you might as well still be on your T60.

GCN 1.0 (Tahiti) was faster than GK104 and eventually was on the heels of GK110 in gayming performance.

People still got Kepler over it for whatever reason (Still believing the stupid "poor" drivers and "mah power consumption" memes)

Maxwell is what really put Nvidia back on top.

Right, but Raja was so far behind on the hardware that the driver team essentially didn't even start developing the drivers until like 3 months before hard launch, and they had no time to even attempt to implement primitive shaders in software.

Lots of people assume that the hardware is broken on Vega for this feature, but since the idea for prim shaders basically just describes using the CUs to do geometry, bypassing the fixed-function hardware and early steps in the DX12 rendering pipeline, I don't see what special hardware beyond standard GCN would be necessary to implement the feature.

As I understood it, the problem was that re-writing much of the DX12 front end was a ton of work, not that Vega was broken in hardware in a way that made implementing that feature impossible.

No, they are the same silicon, just different binnings.

The whole initial 79xx issue was AIB partners putting in crappy VRMs that weren't up to the task. This was later addressed with the rebranded versions.

Nvidia AIB partners did similar shit with Fermi SKUs. At least it wasn't as bad as the infamous "Bumpgate" debacle.

They are going to Radeon Pro, Instinct, and Frontier only, though.

You might expect to see some defective 7nm Vegas eventually going down the pipe, but a 7nm Turing refresh and Navi will be out by then.

Well, Rys said prim shaders can be implemented in software by developers if they want to, but it wouldn't be in the drivers, which is what would make the process automatic for every piece of software that wants to use prim shaders.
Your logic fits that narrative. You're probably right.

Can confirm now: if you use Radeon GPU Profiler with a Vega card, it creates code for prim shaders.

Fucking Yes please

Attached: 1528924939101.gif (670x473, 189K)

ITS NOT FAIR

Attached: 1080fire.png (1280x720, 1.14M)

>RX 480/580 is good.
Worse power efficiency and/or performance than the equivalent nVidia cards. The RX 500 series is just a rebranded RX 400 series.
Don't get me wrong, I would never buy nVidia, but that doesn't mean AMD doesn't have disappointing cards.

even Radeon Pro should be cheaper than equivant novidia gaytracer so there's no reason to miss out on the 7nm GPU train

>equivant
equivalent
for fuck's sake

how are the 480/580 disappointing?

How is the 580 not disappointing? It's just an overclocked 480 with less than 10% improved performance and a higher TDP. That's like calling a 1080 Ti a 2080.

>AMD shills still don't learn
(lol

Nvidia has 7nm GPUs too, Turing is just a stopgap to force people who didn't buy the 10 series to upgrade

tfw the 280x IS the 7970

It's great for 1080p and adequate for 1440p, and it performs better than the 1060; I'm failing to see where the 580 is a disappointment.

Power efficiency maybe, but performance is pretty much the same as the cards they compete with.

>JUST GO OUT AND BUY THE RTX GOY
>DON'T QUESTION ME GOY, JUST GO SPEND $1000 ON THE CARD GOY
>R A Y T R A C I N G GOY

>Muh 7nm node
GloFo already quit, but Nvidia is fully capable of putting out 7nm GPUs any time because they can also use TSMC's fabs.
After the Vega fiasco I just got a 1080 and stopped waiting for eternity.

>15% faster than 1080ti

Attached: ev6FLSJ6R7u0vQB5aTObsOaYvF92IUyFnELfBghD6Cg.jpg?w=1024&s=a0b58b55194c90215a4f7f7f3fb8f12d.jpg (1024x572, 129K)

It's not the leap in performance which AMD needs to compete with Nvidia. Although maybe this mindset is exactly what caused them to sink billions into GPUs while losing the consumer market to Nvidia.

She's talking about an increase in compute performance, probably not much help for gaming at all.

>RX 480/580 is good.
RX 580 is decent, let's not kid ourselves. I would never buy Jewvidia and I use an RX 580 myself, but let's not pretend that it is particularly good compared to the competition.

That would be more in line with Vega rather than Polaris, but at least AMD cards get better through driver updates where Nvidia's get worse.
Also, you're suggesting that AMD was focusing solely on GPUs when its main focus was Zen and making semi-custom products (consoles) in order to stay alive.
Now that the street shitter is out of the company, they can actually get some work done on GPUs, especially since 7nm Navi is on the way

what are you talking about? the 580 outperforms the 1060, how is that not good against the competition? I doubt you own a 580 because you're talking bullshit

>failing to see where the 580 is a disappointment
See >It's just an overclocked 480
It's not a new card and it didn't bring better performance. 400 series performs better in some cases and the performance difference is nonexistent if you overclock it. The """500 series""" shouldn't have existed, it's just 400+. They misled consumers into thinking 500 is not 400.

GCN is triangle bottlenecked

Will Vega 7nm still be GFX9? If so, they already said native Prim Shaders won't be available to anyone outside pro devs.

There is literally nothing wrong with the Polaris arch for a midrange card except for Nvidiafags spouting non-arguments.

You do realize that both AMD GPUs and CPUs are the best at mining crypto, so the cryptocucks are going full AMD on GPU cards from now on, making them even more expensive than Njeewia cards.

>the 580 outperforms the 1060, how is that not good against the competition?
Sure, that IS good, but it's not the whole story. Consider power consumption and the story is different.

>I doubt you own a 580 because you're talking bullshit

Extended renderer info (GLX_MESA_query_renderer):
Vendor: X.Org (0x1002)
Device: Radeon RX 580 Series (POLARIS10, DRM 3.25.0, 4.17.0-1-amd64, LLVM 6.0.1) (0x67df)
Version: 18.1.6
Accelerated: yes
Video memory: 8192MB
Unified memory: no
Preferred profile: core (0x1)
Max core profile version: 4.5
Max compat profile version: 3.1
Max GLES1 profile version: 1.1
Max GLES[23] profile version: 3.1
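(That's trimmed glxinfo output, for reference; the GLX_MESA_query_renderer block is what Mesa reports for the active card.)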

do you live in a country where electricity is expensive? I was mining with my 580 24/7 and my bill never made any significant jump
All this talk about power consumption, when Polaris cards are way more power efficient compared to either Tahiti or Tonga, is bullshit unless you live in a 3rd world country where it's expensive

This has been proven blatantly false. Vega 56 was recently tested against a 1070 and 1070 Ti and came out worse overall a year after release. The problem lies in the fact that pretty much all games are aimed at Nvidia hardware, and Nvidia just gives better performance in DX11, with most devs just tacking on DX12 and sometimes Vulkan as an afterthought without putting in any effort to optimize them (DX11 in a DX12 wrapper is common). Without market share, game devs just can't be bothered. It's money and time they just cannot afford.

>do you live in a country where electricity is expensive? I was mining with my 580 24/7 and my bill never made any significant jump
No, electricity is quite cheap in my country (Norway). I don't have any problems paying my electricity bill due to this card, and I can't say I actually notice it.

However, when comparing Nvidia's and AMD's offering I am speaking generally, and power consumption is a real concern for a lot of people, even in Europe.

>is bullshit unless you live in a 3rd world country where it's expensive
You don't have to live in a 3rd world country for the energy bill to become a concern. Consider Central European countries like Germany, Belgium, Denmark etc.

You would only be paying a difference of MAYBE a couple of euros a month between a 1060 and a 580.
The power consumption argument is much ado about nothing that was started by the Nvidia marketing department as a selling point.
If you actually break down the power consumption between the two, you'll find there's a marginal difference.
if it's been proven blatantly false then surely you can provide some sourced examples proving your argument, yes?
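To put numbers on the couple-of-euros claim, a quick sketch; the board-power gap, daily hours, and kWh price are all assumptions, so adjust for your own country:

# Monthly electricity cost difference from the board-power gap, RX 580 vs GTX 1060.
# All inputs are assumptions: real draw depends on load, card model, and local rates.
watt_gap = 185 - 120       # typical board power: RX 580 ~185W, GTX 1060 ~120W
hours_per_day = 4          # assumed daily gaming time
eur_per_kwh = 0.30         # assumed price, roughly German household rates

kwh_per_month = watt_gap / 1000 * hours_per_day * 30
print(f"{kwh_per_month:.1f} kWh/month extra")            # 7.8 kWh
print(f"~{kwh_per_month * eur_per_kwh:.2f} EUR/month")   # ~2.34 EUR

Even doubling the hours or the price keeps it under a fiver a month, so the marginal-difference point above holds.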

This translates to "I wonder who is behind this post"?

So what's with posters being lazy and leaving out the hands on this meme now?
|
|>
|
|3
|

I just found this at IFA.

Attached: 63CA02DC-C02F-4CA8-AE5F-B3C9A8B1E2B4.jpg (4032x3024, 2.52M)

AHAHAHAHAHAHAHAHAHAHAHHAHAHA

big if true

Vega 56 owner here. But I flashed my reference card to the 64 BIOS and tweaked it up, so I get closer to GTX 1080 performance anyhow (it's still a bit juicy and hot on the electricity side, despite that).
youtube.com/watch?v=Vj9_-fO46bQ&t=557s

fucking epic

>RX 580 is decent, let's not kid ourselves. I would never buy Jewvidia and I use an RX 580 myself, but let's not pretend that it is particularly good compared to the competition.
I've been thinking about buying an RX 580, what's a better alternative?

If you only run at 1080p and can keep it at 60-144 FPS on a 144Hz monitor, a GTX 1060 6GB is fine. You won't notice any tearing with vsync off. The RX 580 is OK but uses more energy and produces more heat.

BTW, ultra settings are a meme. But you can choose ultra and tone down some stuff to stay above 60 FPS. Use some software that displays the FPS to check your minimum and maximum framerates.

Does it really matter what brand of GTX 1060 to get? I don't know about any brand war, but this one looks good: pccasegear.com/products/36449/asus-geforce-gtx-1060-dual-fan-oc-6gb

that's just sad

Wait, but what did Steve Jobs' wife say?

Could you pick a better video? All it is is the presenter telling you not to buy the card and saying that improvements haven't come via drivers, without actually showing proof.