Wasting perfectly good 7nm silicon supply on garbage 5th gen recycled architecture that can barely compete with 12nm...

>wasting perfectly good 7nm silicon supply on garbage 5th gen recycled architecture that can barely compete with 12nm GPUs from 2 years ago

Why did AMD even fucking bother?

Attached: pD7MvXSNEPnVRaxcuhRMXV.jpg (1920x1200, 402K)

To make something, I suppose. I honestly wish they had just kept up production on their Vega cards and tried to make them cheaper. They'd be stealing the show if the Vega 56 could be bought for $250 or less, or the 64 for $350. Phase out their current RX580/590 stuff. While doing that, they could have been working on a new proper card. Instead we get yet ANOTHER GCN-based shit pile. And now I'm hearing Navi is supposed to be GCN based too? For fuck's sake.

i-it's j-just a stop gap .... w-wait for n-navi

>they made a card that has similar performance to a card at the same price, plus use cases where it dominates it

What's with the butthurt?

How good is this shit for mining crypto?

recycled mi60s

I went and bought a Vega 64 and I like it. Three fans, so it's not hot (Sapphire). Here's hoping the FineWine™ will continue. I probably won't even need Navi unless it turns out to be awesome. I don't have any problems turning graphics down to get good performance. I'm a casual player.

Attached: 1534625333650.jpg (512x288, 22K)

This legitimately makes me sad

Nobody does that anymore.

It is perfectly fine for general compute purposes, you retard. Radeon VII is a steal for general compute hobbyists.

Gaming discrete GPUs are becoming more and more irrelevant. High-end "gaming" SKUs are just enterprise-tier rejects.

why does this piece of shit need 300w?

>Gaming discrete GPUs are becoming more and more irrelevant

lol no

if you want shit-tier APUs you might as well buy a console, which at least is sold at a loss

This kid is not entirely wrong. Top-shelf GPU chips are now oriented towards the much more profitable enterprise market, hence the slow progress on gaming performance. If you compare something like TensorFlow performance across the last three generations of GPUs, you'd be surprised how much better they've gotten.
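If anyone wants to eyeball that themselves, here's a minimal sketch, assuming a TensorFlow 2.x install that can actually see your GPU (the matrix size and fp32 math are just illustrative, and a single matmul is only a crude proxy for real training throughput):

    import time
    import tensorflow as tf  # assumes a TF 2.x build with GPU support

    n = 8192
    a = tf.random.normal([n, n])
    b = tf.random.normal([n, n])

    tf.matmul(a, b).numpy()                # warm-up so one-time setup cost isn't timed
    start = time.perf_counter()
    tf.matmul(a, b).numpy()                # .numpy() forces the result back to the host
    elapsed = time.perf_counter() - start
    print(f"~{2 * n**3 / elapsed / 1e12:.1f} TFLOPS fp32 (very rough)")

Run that on a Kepler, a Pascal, and a Turing card and the generational gap is hard to miss.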

Yes, they are becoming more irrelevant. 2020s is the decade where discrete gaming GPUs meet the same fate as discrete audio cards in the 2000s. iGPUs have already murdered the bargain-basement basic 2D/3D SKUs. Mainstream SKUs, which make up the bulk of discrete gaming revenue, are next on the chopping block.

The whole RTX push is Nvidia's gambit to keep discrete gaming GPUs relevant to the masses. So far it has had very little success.

Why do you think Nvidia has been trying to move away from gaming GPUs as their bread and butter since Fermi?

>/v/tard detected

For FP32/FP64 computing on a budget.

What are the concrete applications for that? I heard some guy saying that this card would be great for some kind of research applications, but I don't know what field.
Also, what about video editing? I guess only for freelancers if anything, since studios pay for pro cards and licenses.
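Not that anon, but the usual answer is anything leaning on double-precision linear algebra: CFD, finite-element solvers, various physics/engineering simulation. Consumer GeForce cards run FP64 at a small fraction of their FP32 rate (roughly 1:32), while Radeon VII keeps a much fatter ratio (roughly 1:4, if memory serves), hence the "budget FP64" angle. A rough sketch of the kind of kernel people mean, assuming a PyTorch build that can see the card (ROCm builds still expose it as "cuda"):

    import torch  # assumes a PyTorch build with GPU support; ROCm builds expose the card as "cuda"

    dev = "cuda" if torch.cuda.is_available() else "cpu"
    n = 4096
    A = torch.randn(n, n, dtype=torch.float64, device=dev)
    b = torch.randn(n, 1, dtype=torch.float64, device=dev)

    x = torch.linalg.solve(A, b)              # dense FP64 solve, bread and butter of scientific codes
    print((A @ x - b).abs().max().item())     # residual should be tiny in double precision

For video editing the FP64 part barely matters; that workload is more about FP32 throughput, the encode block, and the 16GB of HBM2.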

>2020s is the decade where discrete gaming GPUs meet the same fate as discrete audio cards in the 2000s.

Capped. See you on Jow Forums in 2030 faggot

>waiting for a midrange card

>if I keep spamming it it must be true!

Exactly. They’ve been saying that shit since 2004. As if we’d be in 1080p60 forever or something.

Short-term, the only thing AMD could do before the release of Navi is:

release a cut-down Radeon VII with 2 stacks of HBM2 (8GB of VRAM) and increase the memory speed of those stacks to 307GB/s per stack (the maximum in standard JEDEC).

That would be about 614+ GB/s total, which is enough for it to run significantly faster than Vega 64.

This is still kinda shit because it could only be priced at $550, and while it would beat out the 2070, it would get shit on for not having "RTX" or "DLSS", despite those "features" having been implemented awfully as of March 2019.
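The 307/614 figure is just the per-stack HBM2 arithmetic; quick sanity check, using the pin speed the post assumes as the JEDEC ceiling:

    stack_bus_bits = 1024                      # each HBM2 stack has a 1024-bit interface
    pin_speed_gtps = 2.4                       # GT/s, the ceiling assumed above
    per_stack = stack_bus_bits / 8 * pin_speed_gtps
    print(per_stack, 2 * per_stack)            # ~307.2 GB/s per stack, ~614.4 GB/s for two stacks

Vega 64 sits around 484 GB/s with its two slower stacks, so ~614 GB/s would be a real uplift, assuming the cut-down chip could actually make use of it.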

Metro has decent RT features but DLSS is trash on everything DESU

basically just a marketing trick to give an illusion of still being a relevant GPU manufacturer after years of releasing turds

because dies will keep shrinking and silicon budget is unlimited AMIRITE?

>Why did AMD even fucking bother?
They didn't put significant resources into making this. These are rejected enterprise dies. The only reason it exists at all is the Navi delay and Nvidia jacking prices up enough for this thing to break even.

Attached: 1457574344723.png (359x414, 150K)

The Navi arch has now been quoted as underpinning all future product releases, although the initial release will definitely be mid-range; can't piss off the loyal idiots who bought Radeon VII.

AMD could release a high-end GPU if they wanted to. If AMD has finally broken the 64 ROP limit, they could release a 500-600mm2 monster that could potentially obliterate the 2080 Ti by 30-40% while costing $200-300 less, albeit without raytracing.

That's extremely unlikely due to AMDs poor track record, though.

>AMD could release a high-end GPU if they wanted to. If AMD has finally broken the 64 ROP limit, they could release a 500-600mm2 monster that could potentially obliterate the 2080 Ti by 30-40% while costing $200-300 less, albeit without raytracing

if they could, they would.

Marketing. It's a paper launch while prepping for Navi.

>Marketing

Yeah, because it's such a good marketing idea for AMD to release a huge POS like the VII while people wait for a mid-range replacement in Navi.

Good VR (high resolution/framerate) will keep high end GPUs relevant

I doubt they could get the original Vega cards that much cheaper, since HBM costs so much.

VR will always be a niche like dedicated joysticks, flightsticks, driving wheels.

Normies don't care about VR. The novelty died within months.

just give me 2070 level of perf for 350 dollars

>Normies don't care about VR.

Yes they do. They just don't want to build a dedicated PC for it. Gear VR and PSVR did very well. Beat Saber is basically the ultimate normie workout game, like Wii Fit was.

>>wasting perfectly good 7nm silicon supply
But it wasn't, and that's the point. That silicon wasn't good enough for the datacenter, so not selling it to consumers would have been the actual waste.

Nope, the 2020s are when it begins to unfold.

Gaming audio-fags back in the 1990s would have laughed at your ass if you said discrete audio cards would be practically gone by the end of the 2000s.

It is already happening with discrete GPUs. Gaming consoles are what is driving it. They are setting the baseline for the majority of gaming titles. Normies are more than content with 720p/1080p gaming. They don't care for "Ultra" in-game details or AA/AF at 4K BS.

Keep trying to rationalize that $2K-3K rig, silly /v/tards. You can now get nearly the same gaming experience as a console or a $499-899 PC "toaster".

Many of the old-school hardcore PC gamers have already jumped ship and moved on to greener pastures.

Not going to happen, since Radeon VIIs are just rebranded Instincts that ate too much power and that enterprise/ISVs didn't want.

cope harder

>You can now get nearly the same gaming experience as a console or a $499-899 PC "toaster".

Had me going until that. 2/10

Wrong, kiddo, normies don't care about that shit. Only hobbyists care about VR. Most of the early adopters already sold off their headsets once the novelty wore off.

Sure, sure, bud. That's why VR sales are shrinking, right? Oh wait

Attached: Capture.png (1041x674, 411K)

/v/tard coping harder because console-fags and toasters are enjoying the same games without spending $2K-3K on a silly gayming rig for a marginally better experience, not for real work.

@69898404
Too obvious. Doesn't even deserve a (You)

Hobbyists, weebs, and porn-fags are the only people buying that VR shit.

That sub-60/30fps experience sure nearly feels the same. I feel cheated.

Attached: console competitive gaming.png (1559x826, 2.75M)

There has NEVER EVER been a sound card scene 1/10000 of the size of the graphics card scene, even if you try to adjust the numbers.

>they sold 12 of them this year instead of 6!

Attached: you.jpg (276x183, 11K)

>enjoying the same games
At 1080p@60fps, lowest settings, MSAA disabled, right?

>buy 600 dollar graphics card
>I don't have any problems turning graphics down to get good performance

Attached: 1381107641720.jpg (1239x813, 341K)

Yes, there was, back in the late 1980s and into the 1990s.

Digital audio processing and output was a massive game changer, just like dedicated 3D graphics processing was in the latter part of the 1990s.

The difference is that audio processing is much cheaper than 3D graphics processing. It didn't take nearly as much miniaturization to make discrete audio cards obsolete.

3D graphics processing is about to become small and cheap enough that you don't need huge-ass silicon and dedicated "daughterboards" anymore for mainstream gaming needs.

Miniaturization has been at the heart of digital computing. There's nothing special that makes discrete GPUs immune to its effects.

>FineWine
This means the initial drivers aren't capable of utilizing the hardware's full potential, right?

You have no

need to keep typing

like this, you can just

type like a normal person

regardless, my point is that no matter how

big the audio scene was back then, it still wasn't

1% of the size the GPU scene is today, even adjusted for the year

you can look this up

Progamers and FPS junkies already do that shit because "Mah framerate" and it removes unnecessary "visual noise".

You keep falling for the "Ultra detail" trap, which looks marginally better than high detail at a great performance cost.

If it made the other players more visible or got you a steady 100+ fps I could see doing it, but it usually just makes the game look like dog diarrhea and most people's monitors are locked at 60 Hz anyway.

Discrete audio was huge back then, idiot. The entire PC market was much smaller because it only existed in NA/proto-EU and was hobbyist-only. Creative used to be a monster in its heyday.

Normies didn't care about PCs until the 2000s, and by then discrete audio was on its way out. They make up the bulk of the gaming PC market right now. The new blood is coming entirely from untapped Chinese/Indian markets.

>It is already happening with discrete GPUs. Gaming consoles are what is driving it. They are setting the baseline for the majority of gaming titles. Normies are more than content with 720p/1080p gaming.
And yet even in consoles you're seeing revisions with nothing but beefed-up GPUs, for the sole purpose of playing at higher resolutions, inching towards an elusive 4K that they aren't even remotely close to yet.

>The new blood is coming entirely from untapped Chinese/Indian markets.
Neither of whom do the whole "buying and owning a computer" thing because they have PCbang equivalents

Cope harder :)

I think we're one good process leap (5nm) and one INSANE jump in memory technology away from that being a full reality.

It's entirely possible 7nm/10nm APUs/iGPUs are going to cut out the sub-$150 price range of cards.

There'll still be a massive number of people wanting $200+ GPUs. Unlike sound cards, which don't actually require much processing on a chip (probably less than 2mm² on 14nm), GPU rendering techniques are still far from being anywhere near "complete".

The ONLY way APUs or iGPUs are going to be competitive above the $150 price point is if newer chipsets start incorporating a graphics RAM slot, HBM2 on the die, etc.

Even at DDR5's maximum bandwidth of 230 GB/s, it will still be shared, and thus maybe on par with a base PS4/Xbone. And that's around the same time the PS5 and Xbox Scarlett are released, of course. No amount of ESRAM or eDRAM is going to suffice. We'll need a new memory standard before APUs get real.
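Napkin math for that, if anyone cares (the DDR5 speed bin below is a hypothetical, and the PS4 figure is its stock GDDR5 pool):

    # effective bandwidth = transfer rate (MT/s) x bus width (bytes) x channels
    def bandwidth_gb_s(mt_per_s, bus_bits, channels=1):
        return mt_per_s * (bus_bits / 8) * channels / 1000

    print(bandwidth_gb_s(5500, 256))       # base PS4 GDDR5 pool: ~176 GB/s, all of it for the APU
    print(bandwidth_gb_s(7200, 64, 2))     # hypothetical dual-channel DDR5-7200: ~115 GB/s, shared with the CPU

Even a generous desktop DDR5 setup lands below a 2013 console's dedicated pool once the CPU takes its cut, which is exactly the problem for big APUs.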


It also means that AMD continues to release drivers for their older cards, instead of just letting them melt via lack of proper driver support.

Just look at the 1080 Ti: superior to a 2080 in just about every way in games that don't use RTX, yet Nvidia hamstrings it by simply not making drivers that take full advantage of it. Of course, when you turn games up to 4K the 1080 Ti still stretches its legs, but still.

This. Unless you wanted to listen to the beepy PC speaker all day, you were lining up to buy a Sound Blaster or some other sound card. And they weren't just DACs back then; they had various incompatible interfaces for rendering MIDI-like music too.

>even adjusted for the year
You still need to address this. No, discrete audio may have been huge for its time, but not in the same proportion that GPUs are today. Regardless, consoles have always outsold GPUs, so your argument doesn't hold from that angle either. Integrated graphics in laptops matching consoles is not hot news; it's been happening forever. They've been able to match consoles since at least the PS3 and they still can, so your point is moot. This state of affairs isn't going anywhere. You're just speculating based on what happened to audio, whilst ignoring that graphics are much more complicated than sound.

/v/tard detected, anally devastated that console-fags get nearly the same experience on their "toaster-tier" hardware.

> has a 9700K and a Vega 64 and says that, aside from high framerates, there's no real difference between it and a gaming console for most games (because they are console ports!)

>implying 1080 Ti doesn't still get fixes and improvements for new games

They don't though. Console "4K" is just sad compared to a 2080 Ti.

Frame rate is not important.
And stuttering is fine.
Don't buy a dedicated graphics card.

Exactly, and good audio processing doesn't require nearly as much die space as good graphics processing does.

If there's a real, tangible, and large difference between an APU and a graphics card, people will be tempted to buy the more expensive, better-looking option.

Hypothetical: maybe there'll be some crazy advancement in GPU technology where all of a sudden we can get the same amount of rasterization in 1/5th the GPU die area, but even then, that will also translate into 5x the performance of a discrete card, which will require 5x the memory bandwidth to feed it.

Audio =/= Graphics

Graphics are just more expensive in computing resources, not more complicated than audio processing.

It is just taking a lot more miniaturization, but it'll certainly get there unless Nvidia's RTX gambit pays off.

2080 Ti with DLSS is Console "4K"

Don't fool yourself. Nvidia is pushing a console feature to sell their terrible generation of graphics cards, end of story.

AMD hasn't had any good offerings since the 7970. The 570/580, sort of, on pure budget merit.

The $99-249 discrete GPU segments are what the next generation of iGPUs will start eating away at. They make up the lion's share of discrete gaming revenue.

That's why Nvidia is really hoping that RTX takes off. Otherwise, they are going to experience the effects of demand destruction.

High-end discrete GPUs will continue to endure (they will just be enterprise-tier hardware "rejects").

The 290 absolutely destroyed the OG Titan for $600 less.

The problem is, you complete and utter giant fucking retards kept buying green even after the AIB cards hit the market. It was pretty clear to everyone Nvidia had lost multiple generations in a row at that point.

5000* vs 200 series
6000* vs 400 series
7000* vs 500 and 600 series
200* vs 700 series

It wasn't until Maxwell that Nvidia started to dominate.

Before that, people bought Nvidia based solely on superior marketing and ignorance.

Of course there's no real difference between this and a real gaming PC.

Attached: Apex Legends on a RYZEN 2200G.png (374x119, 110K)

>Nvidia RTX mode gambit pays off
Oh, I understand where your ignorance is coming from. Nigger, Nvidia didn't invent ray tracing. This is just ONE technique for tracing rays out of multiple. It is the future of graphics, and Nvidia is innovating by pushing this technology, regardless of how faulty the implementation is (speaking strictly from the point of view of innovation here). BFV was pitiful, but the lighting techniques in the new Metro game are indeed groundbreaking. Movies are all ray traced, dude; it was and is only a matter of time until it becomes more mainstream. And we're nowhere NEAR full-scene real-time ray tracing; we're only dealing with lighting and shadows at most, and most GPUs are already shitting the bed. It's this misconception that makes you think we're stagnating. Yes, rasterization gains have slowed down a lot, because we're reaching the limits of the technique. Not all graphics need to be rasterized, and for us to get better picture quality, they can't be.

Attached: 1548807208862.jpg (617x514, 68K)
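For anyone who hasn't touched this stuff: "tracing a ray" boils down to an intersection test repeated millions of times per frame, which is where the cost comes from. A toy sketch, purely illustrative and nothing like what the RTX hardware actually does internally:

    import math

    def ray_hits_sphere(origin, direction, center, radius):
        # solve |o + t*d - c|^2 = r^2 for t; a positive real root means the ray hits the sphere
        oc = [o - c for o, c in zip(origin, center)]
        a = sum(d * d for d in direction)
        b = 2.0 * sum(o * d for o, d in zip(oc, direction))
        c = sum(o * o for o in oc) - radius * radius
        disc = b * b - 4.0 * a * c
        if disc < 0:
            return None                        # ray misses
        t = (-b - math.sqrt(disc)) / (2.0 * a)
        return t if t > 0 else None            # distance to the nearest hit in front of the origin

    # one primary ray from the camera straight down -z at a unit sphere 5 units away:
    print(ray_hits_sphere((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0))  # ~4.0

Now multiply that by several rays per pixel, per light, per bounce, at high resolution, 60 times a second, and it's obvious why current cards only do lighting and shadows this way.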

>$99
Maybe

>$249
Lol no. Even with DDR5 RAM.

Wrong.

1. As soon as memory bandwidth becomes a limiting factor (currently around the sub-$100 dGPU level), APUs fall to shit.

2. Nvidia literally released RTX as essentially full-on theft. They didn't have any 7nm designs ready, so they called their TSMC 14nm cards "12nm" and threw on tensor cores and a RISC-V "RT" core cluster (completely open source, btw). 7nm Nvidia will come out just as soon as Navi finally gets released, and it will do +50% minimum over all existing RTX cards for the price.

3. Mid-range discrete GPUs, which consistently make up a larger and larger share of the overall dGPU market, will also endure.

For perspective, on the left we've got a really popular game that isn't pretty but isn't especially ugly either. On the right we've got a ray-traced movie... from 2005. We're nowhere near the second picture 13 years later.

Also, forgot to quote

Attached: Rasterization vs Raytracing.png (2684x755, 3.09M)

You idiot, RTX is just Nvidia's way of utilizing the tensor units on their silicon. Tensor cores are still fucking huge compared to shaders/TMUs/ROPs, a.k.a. they can't be thrown onto an iGPU.

They are hoping that their meme version of ray tracing will keep iGPUs from eating away at their discrete GPU market share.

Protip: real-time ray tracing will never become mainstream for 3D graphics without sacrificing resolution and model/texture quality (nobody wants to go back to 320x240/640x480 Quake 1/2-era models and textures). It is simply too expensive. It will remain professional-only.

OK Juan.

Going out on a limb here, but I would say we will never get fully ray-traced consumer products in our lifetimes. Currently we can path-trace (not just ray-trace) Quake 2.

People aren't ever going to switch to full ray tracing because it's simply too expensive on our hardware. Even if next-gen RTX cards are an additional 10x faster at raytracing, that still puts us about 50-100x short of running modern games, with all of their pretty textures, without any form of rasterization.

Consumers will continue to want raytracing, but they won't give up textures, geometry, or resolution to get it.

dGPUs aren't going anywhere

On one side we have your argument, which relies on believing in an unprovable Nvidia conspiracy and which makes even less sense when you look at the market (Intel just joined the dGPU race, you dunce); on the other we have a graphics technique from the '70s that has been used in movies since the '80s and seems like the natural evolution of graphics (be it ray or path tracing, as the other anon says).

Saying "we will never get X" is just a long-winded way of saying "I don't know." I don't either, but I won't say what will or won't happen, because technology as a "field of interest" is borderline unpredictable. The rest of your post I agree with. Also, I'd point out that my argument also works for path tracing, or any other form of rendering that doesn't rely on triangles.

You’re too dumb for this card. Go back to school.

We're nowhere near our limits on rasterization as it is. The general public has been misinformed by bad "diminishing returns" bullshit examples.

As soon as the hardware is available to the general public, designers will take advantage of whatever new technology is available.

There's literally no reason for raytracing not to go mainstream in 2020. Hence there is no reason APUs will be successful at $200+ in the next 5-10 years.

Why is raytracing going to become popular?

The answer might surprise you: it's actually the lowering of development costs. When shadows and lighting can be done realistically without the need for a dedicated art team, game designers will be able to save millions of dollars on new titles. That's the real benefit of ray tracing, at least for the next 5 years.

Intel is only joining the dGPU game because they are making prototypes for their next-generation iGPU platforms + dedicated GPGPU units.

It is a counter-attack against Nvidia's encroachment on the HPC market while simultaneously going after their lifeline (low-end and mid-range discrete GPUs).

>take defective Radeon chip, sell it for a couple hundred bucks less, and pass it off as a gayman card
it's literally that simple

To scam stock investors. They realistically don't have anything GPU-wise until they either launch a new architecture or just close up shop.

They're just rebranded Instincts that couldn't make the cut for the standard line for various reasons, and instead of straight up tossing them out, they're repurposed as gayming cards.

>A card is priced and performs similarly to one from Nvidia
>this is somehow bad

Yeah, stay mad, user. To quote Buildzoid, "AMD does not exist to make Nvidia and Intel products cheaper". OP is just a PCMR retard who gets salty that he's reaping the benefits of the last 5 years of blind fanboyism.

When the 290X killed the Titan it was "lol AMD is too hot and loud!!!!!"; when the Titan Z's pricing was announced it was "lol it's a one-off, Nvidia isn't really raising prices!"; when the 970 released it was "fuck the 390, power draw is THE ONE DEFINING METRIC of a GPU"; etc., etc. Nothing matters until Nvidia says it does, and then anons flock to it like flies to shit.

Now that AMD has a competitive product (and the previous Vega lineup was competitive too, though fair enough, availability was an issue), you're salty it's not faster than Nvidia for significantly less money.

Attached: apex-legends-multiplayer-benchmark_1080p.png (865x779, 52K)

Has it been confirmed yet whether or not SR-IOV is disabled?

Regardless, what exactly did they waste here? These are recycled MI50s, so nothing was wasted. If anything, it was a smart business decision.

Ask Nvidia to; they can sell the 2070 for $350 and still make a hell of a profit.

AMD’s drivers have largely improved over the past months, which is perhaps why it’s so disappointing that the Radeon VII drivers are so riddled with bugs. The company has worked hard to eradicate this perception of bad drivers, and has done well to fix its image and its driver packages, but botched the entire thing in one go with Radeon VII. Here’s a small list of what we encountered – we didn’t write all of them down:

Occasional black screen & restart issues (full stock, no OC applied). Suspected related to ASUS motherboards
Black screen / lock that requires hard shutdown (full stock, no OC applied)
Stock/auto/out-of-box crash event during benchmark triggered hard reset, ultimately killing the ability to open Radeon Settings on the system. DDU and AMD’s clean uninstaller did not remedy the issue. “Driver gremlins” left behind, post-crash, completely broke AMD drivers. We re-imaged the system to bypass the problem.
Some games hard crash, like Ghost Recon: Wildlands
Some crashes cause fans to lock to 100% fan speed until power button is held/system is cold booted
Manual overclocking seems to not do anything
Power offset sometimes does not work (validated with power meters and clamps)
Cannot adjust fan speed to 90%, but all other ranges work fine?
Fan speed sometimes gets stuck at 100% and cannot be lowered, could not determine root cause
Clock occasionally misreports, e.g. as “7800MHz”
Crashes during OC stability testing can sometimes wipe out drivers and require a clean reinstall, as Radeon Settings will stop opening
Performance monitor sometimes does not log for more than a few seconds on some installs (but works on others – root cause not found)
Stats read-out in Wattman sometimes completely disappears, seemingly without reason (even under stock/unchanged settings)
Fan options sometimes revert to old version (min/max RPMs rather than fan curve), seemingly without reason

lmao you wish. this is AMD we are talking about. last time they did anything close to that was the unlockable 7950

because you can't make anything better, faggot

*laughs in 4.5mil in PSVR sales*

The "fixed" 19.2.2 drivers wil lstill misreport VII's clock settings. That is unless my card really ran at 4.1ghz briefly.

It's time to abandon GCN cores and HBM for gaming. HBM is basically useless for games, and videogames still fail to utilise the compute potential of GCN. Why can't they just keep making Vega/Navi with HBM for workstation applications, and bring out a new generation of dedicated gaming cards that use GDDR6 and an arch similar to CUDA? That way we wouldn't have to overpay for expensive-ass HBM that has zero benefit in gayming, and for big GCN cores that will never be utilised properly.

>45% of nvidia's income is from gaming GPUs
>bbbbbut they're dying!!!!111!!!!one!!

Get a load of this idiot.

Wow, wow, wow, hold the fuck on there, user. I can understand why GCN should be abandoned, but keep your grubby smelly nigger paws off my HBM, you little shit.

What's stopping you from just buying a workstation card with HBM? Why should everyone overpay for an expensive gimmick that has been shown time and time again to have absolutely no benefit in gaming, just so user can circlejerk on the internet about his memory bandwidth?

No, it means that AMD doesn't intentionally sell you gimped products like the 3.5GB meme or the 6GB 2060, and their cards usually have more raw power. For example, the Vega 56 BTFOs a GTX 1080 in Vulkan games. There just aren't many Vulkan games yet.

...