Vega was either a lie or broken on day 1

Vega was either a lie or broken on day 1.

mail-archive.com/[email protected]/msg24458.html

t. Vega 56 owner

Attached: 13758984_f520[1].jpg (520x293, 7K)

Other urls found in this thread:

gpu.userbenchmark.com/Compare/AMD-RX-Vega-64-vs-Nvidia-GTX-1080/3933vs3603
anandtech.com/show/5775/amd-hd-2000-hd-3000-hd-4000-gpus-being-moved-to-legacy-status-in-may
anandtech.com/show/9815/amd-moves-pre-gcn-gpus-to-legacy
twitter.com/NSFWRedditVideo

I love my Vega 56 you're just a retard

So do I but it pisses me off that it could have been so much better with the full feature set.

blame raja, that fucker was the one who decided to ship shit that wasn't ready to ship (there are hardware faults that software can't overcome to add proper functionality)
i wonder how intel is going to keep him around given his track record, and how they're already doing so poorly on badly designed cpu security

And Nvidia just delivered their own Vega

Tbh I would pick a Vega 64 over a 1080 in a heartbeat if I could fucking get my hands on one.
t. GTX 1080 owner

His track record is solid besides Vega and R600.
That's not the point anyway.

no it isn't. He released garbage nonstop

just buy one..

>That's not the point anyway.
yes it is, he is the sole cause of day 1 vega being broken, as stated in the op.

No point in getting a Vega 64 if you already have a 1080, even if you have a freesync monitor.
t. Vega 64 owner.

This GPU gen is fucked.
Wait for 7nm.
Again, GPU fuckups are never the fault of one single dude.
nv30, R600, BDW and CNL GPUs happened because fuckups happen.

>just buy one..
Couldn't find one in this shit hole when my last card died. And I already sold my soul to the green devil.
Fuck the gsync tax though. I'm literally losing sleep thinking about my monitor upgrade.

My monitor, the LG 34UC89G, cost 550€ for the Freesync version and 950€ for the Gsync version.
I can't imagine what getting cucked by Njewdia must be like today, considering I got Vega 64 for MSRP.

>when the boss says "ship it, do it now!" it's the employee's fault
oh, c'mon man, the rest do as they're told and that's it, even neets aren't this stupid

Again, uArch fuckups are never the fault of a single man.
See R520 or R600.

>an advanced feature is coming to the pipeline
>being made by a small team

VEGA WAS BROKEN

fucking hell you are idiots

A 64 gets ass raped by a 1080 and is far more powerhungry, why the fuck would you buy a worse card?

They won't ever implement it for Vega (gfx9).
It's either broken or they're lazy.
Or both.

>Vega was either a lie or broken on day 1.
Correct. Raja made up Primitive Shaders out of thin air to pretend he had some magical solution for GCN's front-end geometry bottleneck, which he never even attempted to actually correct. Raja never allocated any software dev time whatsoever to the feature, and it never really existed, even as a prototype.

In fact, Raja used all the time since the release of Fury sitting around stroking his dick and collecting paychecks, and Vega turned out to just be overclocked Fury with no other meaningful changes.

Actually, if you look back at how rough the drivers were at launch, it's pretty clear they didn't even start working on the basic Vega drivers until absurdly late in the game, and that Vega was hardlaunched on pre-alpha tier drivers because Raja had been stroking his dick the whole time.

Good luck to Intel's GPU division. I hope they enjoy having Raja pull the classic Indian project lead or programmer move of desperately pretending everything is fine despite all internal evidence until hard-launch when the scale of the disaster is revealed

As for future AMD GPUs, AMD will never be able to have a high end card stretch its legs until they fix the front-end geometry bottleneck in GCN, which remains limited to 4 triangles per clock, and which hopelessly bottlenecks Vega.

With the coming node shrink to 7nm for Navi, you should assume Navi will be exactly as much faster than Vega as it clocks higher, and nothing more, because they aren't going to fix or replace GCN in time, so the same bottlenecks that hobbled Fiji and Vega will still exist.
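If that holds, the napkin math is simple. A rough sketch of the front-end-bound scaling model being described here (the clock numbers are placeholders, not real specs; the Navi clock is a pure guess):

# front-end-bound scaling: if the hard limit is a fixed 4 triangles
# per clock, peak geometry throughput scales with core clock and
# nothing else, so the expected speedup is just the clock ratio

TRIS_PER_CLOCK = 4  # GCN front-end limit cited in this thread

def peak_tris_per_sec(clock_mhz):
    # triangles/second = triangles/clock * clocks/second
    return TRIS_PER_CLOCK * clock_mhz * 1e6

vega_clock_mhz = 1546  # Vega 64 boost clock (approximate)
navi_clock_mhz = 1850  # hypothetical 7nm Navi clock (made up)

speedup = peak_tris_per_sec(navi_clock_mhz) / peak_tris_per_sec(vega_clock_mhz)
print(f"expected front-end-bound speedup: {speedup:.2f}x")  # ~1.20x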

Vega refreshes ought to be pretty good. But really I'm just waiting for the next AMD/Nvidia GPUs since this batch seems to be a dud.

>Believing the lies of AYYMD

TOP KEK

the Freesync version has a different panel and a 48-75Hz sync range lmfao

Vega literally feels like Fiji overclocked a little bit.

That's exactly what Vega is. It's just Fiji at higher clocks, because Raja didn't even *try* to fix the obvious glaring problem that bottlenecks all high-end GCN GPUs.

The entire performance improvement from Vega to Navi will basically be a function of how much higher Navi clocks on the 7nm process, because once again, they aren't fixing GCN's fundamental issues.

i have been interacting with gypsies for 35 years, everything a pajeet says is a lie, sometimes small sometimes big, but a lie nonetheless.

never trust a street-shitter, specially one so dodgy looking like raja poopduri

You are too pessimistic. Pascal is a very efficient design, but it can't be expanded without adding CUDA cores, and that's die space Nvidia does not have. Nobody will make 800mm² chips and expect profit. We are maxed out on the GPU side, hence the 20x0 went to meme tracing while we wait for die shrinks.

GCN is fundamentally a good uarch, but imbalanced in the direction of compute, which I blame Raja for. If AMD can balance the chip and pull an ebyn MCM maneuver on it, then they can make a killer chip. Essentially, make a GPU out of several dies MCM'd together, transparent to software, and you are limited by power consumption only. And it can be done, because 3D is so easily parallelizable.

It's not pessimism to point out that GCN will remain hobbled at the high end until they fix its front-end-geometry bottleneck of 4 triangles per clock.

It's more like optimism, really, because it recognizes that if AMD ever fixes this problem with GCN and finally allows their high-end GPUs to stretch their legs, they will gain a ton of performance.

It's just depressing to realize that Raja literally wasted all the potential development time that could have been spent on finally fixing GCN's front-end-geometry bottleneck between Fiji and Vega and ended up delivering nothing more than Fiji at higher clocks.

Buy the card for the performance it offers now.
The 56 was a good buy for the price.

I honestly can't wait for someone to come out a few years down the line and tell us what happened, in frank detail.

also, a good bit of my understanding is that vega was launched just to have a launch; mid-development, 2/3 to 3/4 of the engineering team was moved to navi.

where things are good or bad is very up in the air.
the 56 has the best balance.
the 64 pulls ahead sometimes but is largely bottlenecked.
primitive shaders SHOULD be in, but they're api calls which have to be programmed for; how often do devs do that for amd hardware?

either way, can't wait for navi, as i'm not paying money for sub-8gb of ram and i'm not paying nearly $1000 for a gpu.

Basically, if they could fix the 4 triangle per clock front end geometry limitation on GCN, they wouldn't even need to release new cards. There's a ton of performance on Vega that's not accessible in games now because of that front end bottleneck.

For Navi, the performance gains will basically come down entirely to how much faster they can clock it on the 7nm node, since the bottleneck is 4 triangles *per clock*, but it would be far better to fix the fundamental problem instead!

Sony paying for Poovi HOUSEFIRES garbage because AYYMD isn't willing to pay for its development, TOP KEK

What do we actually know about Navi?

>assraped
nope, it is on par or above.

we dont know jack shit about navi

if amd actually uses the ai to offload random stuff onto stream processors that aren't working, then we will probably see a massive gain too

>no it isn't. He released garbage non stop
Keep pretending that dumb faggot

I won't believe they will deliver anything more than a performance improvement approximately proportional to the increase in clock speed until they publicly demonstrate empirical proof that they have addressed the 4 triangle per clock GCN limitation.

it's like you niggas don't know this website exists.

gpu.userbenchmark.com/Compare/AMD-RX-Vega-64-vs-Nvidia-GTX-1080/3933vs3603

If Navi is really MCM, they don't really need to fix that; four chiplets with 1024 cores each would manage 16 triangles per clock.
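A minimal sketch of that chiplet arithmetic, assuming each chiplet keeps its own 4-tris-per-clock front end and scaling is perfect (a huge assumption, since nobody has made multi-die transparent to software yet; the clock is a placeholder):

# aggregate geometry throughput for a hypothetical MCM Navi:
# each chiplet brings its own front end, so the limits add up

TRIS_PER_CLOCK = 4   # per-chiplet GCN front-end limit
chiplets = 4         # hypothetical MCM configuration
clock_mhz = 1500     # placeholder clock

total_tris_per_clock = chiplets * TRIS_PER_CLOCK
tris_per_sec = total_tris_per_clock * clock_mhz * 1e6
print(f"{total_tris_per_clock} tris/clock, {tris_per_sec / 1e9:.0f}B tris/sec")
# 16 tris/clock, 24B tris/sec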

>Raja made up Primitive Shaders out of thin air to pretend
The fact that it exists and works in CAD software means it wasn't made out of thin air.
RTG just didn't devote resources to using it outside of a few workstation applications, not even after Raja left; guess Lisa knew they just didn't have the mindshare to make it worth it for gaming.
>and Vega turned out to just be overclocked Fury with no other meaningful changes
Packed math, HBCC and primitive shaders, all of them working in a few select applications, prove you wrong.
>Good luck to Intel's GPU division. I hope they enjoy having Raja pull the classic Indian project lead or programmer move of desperately pretending everything is fine despite all internal evidence until hard-launch when the scale of the disaster is revealed
Raja's track record is pretty impeccable outside of Vega; ArtX being acquired by ATi back in the day made ATi competitive against Nvidia.
Intel, with their non-Raja leads, has fucked up way more than anyone else in the industry ever did; when management is fucked there's no way around it, and it isn't exclusive to Raja.
He mostly delivered on workstation software.

The thing about primitive shaders that was originally groundbreaking is that they were supposed to work automatically. It should have been an out-of-the-box performance increase, free for everyone.
That never happened; then AMD said it would just be something that developers could enable.

Ok, lets summarize here.

Vega 56 and 64 should have had a 25% decrease in vcore, making them use much less power while losing only about 5% performance. The end result would have been a cool card that did not beat the 1080 but was close to it. At least then no one would have called them a fucking housefire.
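The first-order numbers back that up. A sketch using the textbook dynamic-power approximation P ∝ f·V² (real silicon also has leakage and a minimum stable voltage per clock, so treat the flat 25% cut as the idealized case):

# dynamic power scales with frequency times voltage squared,
# so a 25% vcore cut at ~5% lower clock roughly halves power draw

vcore_scale = 0.75   # 25% decrease in core voltage
clock_scale = 0.95   # ~5% performance/clock loss

power_scale = clock_scale * vcore_scale ** 2
print(f"relative power draw: {power_scale:.2f}")       # ~0.53
print(f"power saved: {(1 - power_scale) * 100:.0f}%")  # ~47%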

on the other end of the spectrum.

Nvidia should have released the 2080 Ti at the $999 price point and not the $1199 price point.

Now, if the rumors are true that the 2080 Ti is just 20% faster than a 1080 Ti, why would you even bother buying this card for twice the price of a 1080 Ti? For memerays? The BF5 demo was not that impressive. If they had released the 20xx series in a better price bracket, it would have totally fucked AMD over, but as it stands now, 1080 Ti, 1080 and Vega cards are still viable (Vega if you just want AMD because you hate Nvidia or whatever reason you have).


The point is Nvidia fucked up their pricing.

A reference 2080 Ti here in Norway costs $1550.
A non-reference card costs $1650.

While a used 1080 Ti costs $535 and a used Vega 64 costs $350. And this is with the mining craze ending, meaning the market will be flooded with 1080 Tis and Vegas in a month's time, making them even cheaper.

Attached: 63155e66b0191956bf2fa7bde3f20194_original.jpg (680x1350, 294K)

>trust pajeets
Found your problem.

for vega, we don't know how it was fucked, whether it was software-side or hardware. the one game that had the primitive shader api and all of amd's shit was far cry 5. if it was only the input geometry, vega 56 should be matching a 1080ti and the 64 should be above it by a bit.

Like I said, knowing where the shit falls will be interesting in the coming years.

as for navi, if amd did abandon vega before it was done, navi may have had everyone put on it early enough to deal with the bottlenecks at the very least. maybe something was fucked on silicon for vega and that's why it was underperforming.

at the very least, if navi is vega but on 7nm, it's likely going to get a 20-30% uptick in performance just by virtue of the die shrink.

He, and RTG under him, literally made no attempt whatsoever to implement a core feature of Vega, and the key one that would have been critical to Vega achieving anything like the performance it should have in gaming.

Hell, the pre-alpha state of the drivers Vega hardlaunched on suggests they didn't even start the basic drivers until absurdly late in the game.

Honestly, I have no remote fucking clue what the RTG software development team(s) spent the time between Fiji and Vega working on, considering they had to cobble together a Fiji-based driver a few months before hardlaunch to get Vega to run at all.

The bottom line is that RTG needs to fix GCN and eliminate that front end geometry bottleneck and let their cards stretch their legs in games if they ever want to have a shot at gaining mindshare back.

>for vega, we don't know how it was fucked,
Yes we do. They never implemented primitive shaders for games, and that would have been required to overcome GCN's 4 triangle per clock front end geometry limitation. That this 4 tris per clock limitation is *the* bottleneck in high end GCN based cards has been known since Fiji.

This is empirically observable from the fact that Vega is faster than Fiji by just about exactly the same proportion as its clock speed is higher, since that increases the number of triangles per second the front end can process.
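You can sanity-check that with public clock numbers. A sketch using approximate reference clocks (Fury X around 1050MHz, Vega 64 boost around 1546MHz); if the front-end story is right, the measured fps ratio in benchmarks should land near the clock ratio:

# under the same 4 tris/clock front end, the performance ratio
# between Fiji and Vega should track the clock ratio

fiji_clock_mhz = 1050  # Fury X core clock (approximate)
vega_clock_mhz = 1546  # Vega 64 boost clock (approximate)

predicted_speedup = vega_clock_mhz / fiji_clock_mhz
print(f"predicted speedup from clocks alone: {predicted_speedup:.2f}x")  # ~1.47x
# compare against measured average-fps ratios from reviews of your
# choice; a close match supports the front-end-bound explanation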

Yeah, but working, even if not as promised, isn't the same as being exactly Fiji, or made out of thin air.
It exists and works, even if it is meaningless for muh gaems
>He, and RTG under him, literally made no attempt whatsoever to implement a core feature of Vega
It's implemented on CAD software
>and the key one that would have been critical to Vega achieving anything like the performance it should have in gaming
Yeah, they dropped gaming from pretty much every Vega feature; guess they did that since pretty much no one buying Vega did it for gaming.
>Hell, the pre-alpha state of the drivers Vega hardlaunched on suggest they didn't even start the basic drivers until absurdly late in the game.
Primitive shaders and HBCC were working at launch on CAD software
>Honeslty, I have no remote fucking clue what the RTG software development team(s) spent the time between Fiji and Vega working on considering they had to cobble together a Fiji based driver a few months before hardlaunch to get Vega to run at all.
Vega was demonstrated running Doom on Fiji drivers 6 months before hardlaunch; the driver wasn't cobbled together a few months before, but at least two quarters before hardlaunch.
>The bottom line is that RTG needs to fix GCN and eliminate that front end geometry bottleneck and let their cards stretch their legs in games if they ever want to have a shot at gaining mindshare back.
TeraScale 3, Polaris and the first iterations of GCN were either dominating or incredibly competitive, and they didn't bring back enough mindshare. The enthusiast/gaming desktop market has been fucked for years now; it just isn't worth investing in.
Even Vega was competitive at launch and at MSRP, people gave no fucks

>Lead developer was let go quietly shortly after release of the product
Gee, I wonder why that happened.

>letting people go to bigger companies where they get entire divisions out of thin air
sure senpai, exactly the same as being let go

What the fuck makes you think they are going to pack four of those power hungry bitches in one fucking die?

Gotta compete with i9s

Raja lied to AMD, and because AMD didn't want to give refunds for the GPU, they let him go.
Because there wasn't an outcry from AMD, Intel thought he wasn't a garbage human bean.

There's a reason Raja bet so hard on HBM. He expected it to do all the work so he could sit back and do nothing for 4 years

>Because there wasn't outcry from AMD
You think you're smarter than Intel management? Well, tbqh I can't blame you with their current state
But anyone would have thought the same as you, even the utmost brainlet in Intel management, and even then they still gave him an entire division and the freedom to turn an HPC project into a consumer GPU
Are you retarded? Memory bandwidth has barely any influence on geometry throughput

You're thinking of HBCC. And him shilling "fine wine".

again, we do not know HOW it was fucked
are primitive shaders in the gpu, but they couldn't make the code transparent, so no backwards-compatible shaders? or were primitive shaders fucked at a hardware level, where no amount of code was getting that shit to work?

at a hardware level, we need a new gpu to possibly make it work; at a software level, they either need more time or more talent to get it working, and even then it possibly won't apply retroactively.

were the shaders even in far cry 5? it was a game that should have been made for amd's apis.

so much is a clusterfuck and we don't know where to start.

And when that huge ass power hungry die is released by Intel at 2000 dollars while Nvidia points and laughs, what's going to happen next?

Moving the goalposts my senpai?

What's the definition of vapor ware?

It doesn't matter how it's fucked up; the fact that the problem exists at all is a fundamental flaw in GPU design. It doesn't matter how forward-thinking it is. And game developer buy-in is never going to happen. Ever.

They could also chop Navi down so it's at the parity point of what the hardware can do, making the chip itself smaller and effectively giving us higher performance at a lower cost.

as it stands, even if it were just a clock-up, I would take a Navi cut down to the parity point as long as it has 8GB of RAM.
There are GPU effects I don't like, and I typically turn those settings off or down; benchmark-wise, a 56 scaled up would put me at around 60fps for 4K.

No. It's pointing out he is the most overrated snake oil salesman in the industry.

Yeah, pointing it out with things that haven't happened yet, against his proven track record at ArtX/ATi/Apple.

You're almost as good as raja at designing GPUs.

Are you just not understanding the point I'm making?
here let me point it out so you can see
WE DO NOT KNOW IF IT WAS FUCKED IN HARDWARE OR SOFTWARE
HARDWARE OR SOFTWARE
this means WE DO NOT KNOW THE IMPLICATIONS OF SAID SHADER
If it was fucked in hardware, LIKE FAR CRY 5 THAT SHOULD HAVE HAD IT BUT PROPORTIONALLY DOES NOT BENEFIT FROM IT, then it's a hardware fix going forward
this also has the implication of NO MATTER WHAT THEY WERE DOING IT WAS FUCKED ON HARDWARE SO A SOFTWARE PATCH WOULD HAVE NEVER WORKED
Its entirely possible that once fixed in future hardware revisions, if this is the case, it will work on all games

IF IT WAS FUCKED IN SOFTWARE THEY MAY TAKE A DIFFERENT APPROACH GOING FORWARD THAT IS API AGNOSTIC

There are tons of things we don't know about the approach they need to take. I honestly trust AMD's hardware side far more than their software at this point, despite FAR preferring AMD's drivers to Nvidia's.

LOL. You mean his shit work that had to be saved by Jim Keller and Apple's OpenCL developers to make it function? And what did he do at ATI? Made a decent chip in the early aughts. Fantastic.

He is a snake oil salesman who tanked RTG.

If the problem is software, I would release the hardware with hopes of the software catching up by release.
If the problem is hardware, I would release a cut-down GPU to waste as little space as possible, clock it to hell and back, and throw it at the 2000 series, because honestly, $800-1200 is a fucking joke. If I put out a GPU as powerful as a 1080 Ti, which is possible with just a clock boost from 7nm, and priced it in the $200-300 range, which the 14nm-to-7nm shrink may make possible, you have a user base who will use your hardware.

It doesn't matter if it's fucked in hardware or software. It's fucked period. That's what you're not getting.

If it's not functioning it might as well not exist.

>But it exists in CAD!
Fantastic. It works in a specific program so that it's not a total pile of shit in that one program.

in 5-10 years, when enough disgruntled people have left amd's gpu division, we will get the full details on what went down.

and yet again you don't understand the implications of either one, congratulations, you're retarded.

Kyle Bennett already detailed what went down.
From ATI to AMD back to ATI? A Journey in Futility.

Ignore all the shit he says about Su, who is obviously the most competent executive in the semiconductor industry next to Huang, and read about how much of a power-tripping bitch Koduri is. How he felt so entitled about having a job waiting for him at Intel. How he kept publicly bitching about not being able to attract engineers to AMD.

If this were a sports team, he would be the diva player airing the team's dirty laundry because he wasn't getting the ball enough, and who, when given the ball, would do absolutely fuck all with it.

Actually, that RTG managed to successfully implement primitive shaders for some professional applications suggests pretty strongly that the problem is a software issue, not a hardware issue.

And actually, given what primitive shaders were supposed to do in the first place, the issue should by definition be a software issue, particularly since they never wrote the software to implement it in an API-agnostic way for games.

Thanks for being the mongoloid inbred fuck you are. For trying to assign the importance of hardware or software failure in a post mortem about a pile of shit chip designed by a snake oil salesman on specific features that don't exist outside of specific use cases.

>Vega was either a lie or broken on day 1

Hmm I wonder how that could've happened.

Attached: radeon vegana.jpg (1280x450, 26K)

show bobs and vegana 64 sir

>Yes, hello sirs
>I am today presenting proud the Radeon RX Vegan
>We dutifully have supporting open source community
>With RX Vegana all proprietary blobs will be extinct as moving forward we have implementing revolution Radeon RX OpenBobs technology.
>Don't settle for proprietary blobs, ask for OpenBobs

>Here, take this over volted pile of consumer shit.

Attached: polaris_unveiled.jpg (600x450, 91K)

>Jim Keller and Apples OpenCL developers to make his shit functioning?
[citation needed]
Also, nice way to show your little understanding of software and hardware
>And what did he do at ATI?
TeraScale was great, dumb newfag
Keep being retarded and pretending it doesn't exist when it clearly exists, because
>muh gaems

I'm sorry if you are underage as well as a mongoloid and haven't been following the industry as long as I have.

And also thank you for not understanding that it took a combined hardware solution and software solution to make his shit work.

what he wrote was that Raja wanted to spin RTG off of AMD, and that's about it. or is there a different article?

It details how Raja viewed himself and why he tanked RTG.

I don't think we know anything. At least I haven't heard any technical details. If I missed something could someone link me to the information? There seems to be a lot of shitflinging about what Navi will and won't do as usual but I'll just wait until we actually get some information.

You can't fix fundamentally broken hardware with software, subhuman faggot
And TeraScale was beating Nvidia most of the time, and also brought a lot of innovations back then. Keep pretending you have followed the industry while it's obvious you started shitposting back around the Hawaii launch
Kyle had literally zero sources, zero understanding of AMD's corporate structure (it's simply impossible to sell RTG), and time proved him wrong

>Terascale garbage beating Nvidia

NOPE.AVI

Not to mention the fact it was abandoned almost immediately while Nvidia GPUs got long support

anandtech.com/show/5775/amd-hd-2000-hd-3000-hd-4000-gpus-being-moved-to-legacy-status-in-may

anandtech.com/show/9815/amd-moves-pre-gcn-gpus-to-legacy

Holy fuck... Thanks for being illiterate subhuman garbage and wasting my time.

Jesus. What a great engineer. Lol

>NOPE.AVI
Yes, dumb nigger, TeraScale was way better at gaming while sacrificing GPGPU performance
>Not to mention the fact it was abandoned almost immediately while Nvidia GPUs got long support
You know they moved them to legacy 4+ years after they launched, right? Back when AMD was cash-strapped and their shares were at $2; of course they couldn't keep supporting them
Thanks for proving you know jack shit about GPUs

They moved them to legacy because maintaining a separate VLIW5/4 driver stack is just expensive.

Yeah, AMD just didn't have the resources to keep supporting them

>NOPE.AVI
Yes.avi

Terascale GPUs are garbage and don't even get WDDM 2.3 or OpenGL 4.6 like Fermi

Stay mad, faggot that knows jack shit about GPUs

Yeah, nice argument there faggot

>Terascale GPUs are garbage
Objectively wrong. The HD 5870 was a monster that obliterated Nvidia on all fronts; outside of the HD 2xxx and HD 3xxx series, all of the TeraScale cards were great
>don't even get WDDM 2.3 or OpenGL 4.6 like Fermi
They couldn't, due to the lack of FP64 support, which blocks OpenGL 4.5 and 4.6 and which is being implemented on Mesa through emulation
FP64 was just useless back then for ATi/AMD: zero use in games, and ATi pretty much abandoned their GPGPU efforts early on
Fermi didn't get Vulkan like Nvidia promised, and got DX12 years later than Nvidia promised, after backlash from the community

>Not to mention the fact it was abandoned almost immediately while Nvidia GPUs got long support

It was abandoned by a then money-bleeding AMD to cut back on expenses. For their time, TeraScale cards were pretty good.

>losing sleep over my monitor upgrade
Dude, any high-end GPU running games with triple-buffered vsync will look fine. Why do you even need G-Sync?

I want a video of Ninja playing on an upper-mid-tier setup (120Hz+ with regular triple-buffered vsync) vs a fully loaded setup (with G-Sync and all the bells and whistles), just so I can put all this BS to rest. I guarantee even he wouldn't notice, despite his stupid-fast reflexes.

Never run G-Sync or FreeSync when running high-FPS games; you don't want the input lag that comes with it. The only reason for the sync technology is when your games consistently run below your refresh rate.

Did you even look at the specs instead of the upvotes?

VSync input lag is very noticeable in certain games. It's mostly in FPS games that it feels like garbage, like you're dragging the camera through molasses.
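The math behind that molasses feeling is simple. A sketch assuming a naive queue model where every buffered frame adds one full refresh interval of delay (real pipelines and drivers vary, so treat this as the worst case):

# worst-case added display latency from buffering under vsync:
# each queued frame waits one refresh interval before being shown

refresh_hz = 60
frame_time_ms = 1000 / refresh_hz  # ~16.7 ms per refresh at 60Hz

for buffered_frames in (1, 2, 3):  # double buffering, triple buffering, etc.
    lag_ms = buffered_frames * frame_time_ms
    print(f"{buffered_frames} buffered frame(s): ~{lag_ms:.1f} ms added lag")
# 1 -> ~16.7 ms, 2 -> ~33.3 ms, 3 -> ~50.0 ms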

>If not, is there something missing in gfx9 hardware? Are you allowed to say?
>GFX9 will not be supported.
This is pretty puzzling and/or damning. He could have said that it's not supported in hardware, and probably would have if that were the case. Instead it's just: screw you, early adopters, you can't have this feature; we're not about to let you have access to this technology.

I glanced briefly at the stubs in the code and it really does look like they were planning on implementing quite a few features and later changed their mind for some reason.

>reddit spacing
Fuck you, I'm not reading that shit.

>I glanced briefly at the stubs in the code and it really does look like they were planning on implementing quite a few features and later changed their mind for some reason.
It doesn't even seem like Vega had any spins.
Looks like Raja got butthurt over Lisa micromanaging his teams for semi-custom's sake, so he left, and then Lisa canned everything Vega, just like she did with K12/Skybridge.
Good shit Jensen decided to shotgun himself on stage for the sake of his niece.

>Vega was either a lie or broken on day 1.

Why not both?
How's that fine wine primitive shader going? Is it vinegar yet?