What lessons could AMD learn from Radeon VII?

Attached: AMD-Radeon-VII-1--1-pcgh.jpg (1920x1080, 220K)

wccftech.com/amd-rx-vega-64-outperforms-nvidia-rtx-2080-by-14-in-vulkan-enabled-world-war-z/

Stop wasting money designing better cards, spend more money marketing your products to developers, and beat Nvidia at the dev studio, not on the desktop.

This 100%

This. When you see an Nvidia logo pop up for a game, you're being told optimization was done with Nvidia in mind, and by default it won't have the same level of support/optimization for AMD cards.

While AMD cards technically may not be as good as Nvidia's, the lack of optimization for AMD makes the gap look bigger than it really is.

The tide is turning. 7nm HPC is the real deal, and with Zen 2 and Navi in the next generation of consoles I just hope AMD gets closer with development teams so the PC versions of games get the performance that hardware is actually capable of.

Just wait till HBM4 meets 3.5nm

IIRC the biggest thing that cucked consoles this generation was the utterly shit CPUs they both had. They literally bottlenecked the GPUs not even half a year into the cycle. Hopefully this time the CPU and GPU in them will be able to push each other to their peak performance.

Nothing compared to the gen before

Attached: ppe.png (817x1372, 120K)

Make it in Israel.

None, it was a paper release and a way to reuse professional dies.
Worked exactly as it was supposed to.

Work on being optimized for games. A VII will absolutely beat a 2080, maybe even a 2080 Ti, in raw compute numbers, but most of it goes to waste when a game isn't optimized for it.

Attached: 34342333.png (982x295, 18K)

You realize Cell was designed by Lisa's team? Why would you betray mommy AMD like this?

>a single garbage game that's like L4D2, but literally awful is the defining factor

Stick to iGPUs before Intel kicks them out of that market like Nvidia did to AMD

This, and stop dicking around and make Vulkan more usable to game devs. The higher FP32 number crunching on Vega is absolutely devastating.

Attached: World-War-Z-1920x1080-Vulkan.png (805x935, 61K)

That's something they were already doing back when they had the resources to do so.

Attached: serveimage.jpg (259x194, 6K)

Get proper drivers to reviewers before release
Find the sweet spot between performance and power and stick to it; power users can overclock if they want

Do people who write this shit think the engineers didn't know all this? Man, some people are ignorant/arrogant.
Cell and PS3 development got fucked mid-flight thanks to external factors; the PS3 was actually supposed to have two Cells, one working as the CPU and the other as the GPU.
Second, most developers, especially ones who never even developed for the thing, had no idea how to utilize the Cell itself. Even with its shortcomings, good developers managed to push out things like The Last of Us or Uncharted. It wasn't as intuitive as it was supposed to be, but in terms of raw power it was still massive for certain workloads that didn't get hindered by all the shortcomings.

Run more batches of 5000 cards with huge margins that sell out instantly.

Having a Cell work as a GPU sounds like exactly the kind of awful idea that would arise from the mid-2000s. Just like the programmable SIMD add-in boards made to complement GPUs, like the PhysX cards.

PS3 was never supposed to have two Cell chips. It was always advertised as one and then RSX was slapped on.

Both the 360 and PS3 had absolute shit CPUs that had no business being in a gaming console. Literally the reason we have the phrase "real world performance" when talking about CPUs today is those two consoles. Both companies chased "theoretical maximum performance" and in practice neither was very useful.

FYI, both of the games you mentioned had longer development times and bigger budgets than most games get, because they exist to sell consoles first and make a profit second. We're also talking about a developer that had access to PS3 devkits from day one, while other developers had to wait at least a year to get substantial time with the hardware.

Don't forget that neither of those games does anything impressive either, just high-detail characters close to the camera with a few dumb bad guys in the scene at a time. I know a couple of Supreme Commander games launched that gen, but I never bothered to see how they run.

>While AMD cards technically may not be as good as Nvidia
They might be as good as Nvidia with 7nm, HBM2, and their incredible compute performance. There's just literally no software optimized for AMD hardware. Especially with the amount of stuff that uses CUDA instead of OpenCL or DX11 instead of Vulkan.
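
To make the lock-in concrete, here's a bare-bones sketch (hypothetical code, not from any real game or app) of what CUDA-only software looks like; nothing in it builds or runs on a Radeon without a port to OpenCL or HIP and a re-tune:

// Hypothetical CUDA vector add -- illustration only, not anyone's real code.
// cudaMallocManaged and the <<<...>>> launch syntax are Nvidia-only;
// an AMD card can't run any of this without a port.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void vec_add(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    float *a, *b, *c;
    cudaMallocManaged(&a, n * sizeof(float));
    cudaMallocManaged(&b, n * sizeof(float));
    cudaMallocManaged(&c, n * sizeof(float));
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    vec_add<<<(n + 255) / 256, 256>>>(a, b, c, n);  // block size picked around Nvidia warps
    cudaDeviceSynchronize();

    printf("c[0] = %.1f\n", c[0]);  // expect 3.0
    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}

Multiply that by every compute library and middleware that shipped CUDA-first and you get the software gap this thread keeps coming back to.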

normie plebs and tech reviewers don't appreciate stuff like 16 GB of HBM2 memory and a TB/s of bandwidth; they only like muh games performance despite that being a software and not a hardware issue
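
For reference, that terabyte-per-second number isn't marketing fluff; it falls straight out of the memory config. A quick back-of-the-envelope check, assuming the commonly quoted Radeon VII figures of a 4096-bit HBM2 bus at roughly 2 Gbps per pin:

// Back-of-the-envelope memory bandwidth for Radeon VII.
// Assumed specs (widely quoted, not measured here): 4096-bit HBM2 bus, ~2.0 Gbps per pin.
#include <cstdio>

int main() {
    const double bus_width_bits = 4096.0;
    const double pin_rate_gbps  = 2.0;                                   // effective data rate per pin
    const double bandwidth_gbs  = bus_width_bits * pin_rate_gbps / 8.0;  // bits -> bytes
    printf("%.0f GB/s\n", bandwidth_gbs);                                // prints 1024 GB/s, i.e. ~1 TB/s
    return 0;
}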

"Don't release a bad product."
It's so simple. I can't understand how they missed it.

dead MI25
noisy
dedicated GPU for Dirt and Vulkan
no thanks

If you know your product isn't up to par and just a rebranded and gimped datacenter card, at least add a limited version of a datacenter card feature like SR-IOV.
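
For anyone who hasn't run into it, SR-IOV lets one physical card present itself as several virtual functions that can be handed straight to VMs, and on Linux it's normally poked through sysfs. A rough sketch of checking for it (the PCI address is made up, and consumer Radeons don't currently expose the capability, which is the complaint here):

// Rough sketch: check and enable SR-IOV virtual functions through Linux sysfs.
// The PCI address below is hypothetical -- find yours with lspci.
#include <fstream>
#include <iostream>
#include <string>

int main() {
    const std::string dev = "/sys/bus/pci/devices/0000:03:00.0/";

    // sriov_totalvfs only appears when the device advertises the SR-IOV
    // capability and the driver exposes it.
    std::ifstream total(dev + "sriov_totalvfs");
    if (!total) {
        std::cout << "No SR-IOV capability exposed for this device\n";
        return 0;
    }
    int max_vfs = 0;
    total >> max_vfs;
    std::cout << "Device supports up to " << max_vfs << " virtual functions\n";

    // Enabling VFs needs root: write the desired count into sriov_numvfs.
    std::ofstream numvfs(dev + "sriov_numvfs");
    numvfs << 2;  // ask the driver to spawn 2 VFs
    return 0;
}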

Has AMD found a way to keep them from just about catching fire or causing blackouts yet?

>design better cards

The cards are top notch; they are way more advanced than anything Nvidia can offer.

The problem is that AMD doesn't have the software ecosystem to support them.

>What lessons could AMD learn from Radeon VII?
maybe stop using so much flipping power

But other than that there is nothing wrong

300 W is the same as a Fury X.
I think you're thinking of an old card, the Radeon R9 295X2, at 500 W.
Yeah, I think it's time you stopped living under a rock, mate.

Do you understand that "optimize for games" actually means "rewrite the shader code that incompetent game developers wrote"?

And? Did its development being fucked make anything stated in the post less true?
>utilizing the cell
The post was referring to the PPE specifically.

There's nothing to learn from it. AMD wasn't even planning to release this, but Nvidia's offering is so weak that they actually did. AMD just noticed that they could rebrand an Instinct and provide competition for the 2080. AMD has nothing to lose; the card was already built for a different purpose.

based

It is a stopgap that is far more suited to compute loads.
And even then it still fares fine against the equally priced 2080.

wew, FPBP

This