Async-compute primitive culling is enabled on Linux

This patch series uses async compute to do primitive culling before
the vertex shader. It significantly improves performance for applications
that use a lot of geometry that is invisible because primitives don't
intersect sample points or there are a lot of back faces, etc.

It passes 99.9999% of all tests (GL CTS, dEQP, piglit) and is 100% stable.
It supports all chips all the way from Sea Islands to Radeon VII.
lists.freedesktop.org/archives/mesa-dev/2019-February/215085.html

Attached: prim-discard-cs-results.png (600x500, 8K)
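For anyone wondering what the compute pass actually checks per triangle, here's a rough C sketch of the two tests the announcement describes (back-face, and "covers no sample point"). This is just my illustration of the idea, not the actual radeonsi code, and all the names are made up:

/* Illustration only: roughly the per-triangle tests a primitive-discard
 * compute pass performs, given already-projected screen-space positions
 * (units = pixels, sample points at pixel centers x+0.5, y+0.5). */
#include <math.h>
#include <stdbool.h>

struct vec2 { float x, y; };

/* Back-face test: the sign of the triangle's screen-space area gives its
 * winding; negative area (clockwise) is a back face, assuming front faces
 * are counter-clockwise in these coordinates. */
static bool is_back_face(struct vec2 a, struct vec2 b, struct vec2 c)
{
    float area = (b.x - a.x) * (c.y - a.y) - (c.x - a.x) * (b.y - a.y);
    return area <= 0.0f;
}

/* Small-primitive test: if the triangle's bounding box rounds to the same
 * value on both sides in either axis, it sits between sample points and
 * can never produce a fragment. */
static bool misses_all_samples(struct vec2 a, struct vec2 b, struct vec2 c)
{
    float minx = fminf(a.x, fminf(b.x, c.x)), maxx = fmaxf(a.x, fmaxf(b.x, c.x));
    float miny = fminf(a.y, fminf(b.y, c.y)), maxy = fmaxf(a.y, fmaxf(b.y, c.y));
    return roundf(minx) == roundf(maxx) || roundf(miny) == roundf(maxy);
}

/* Keep the triangle only if it's front-facing and can cover a sample. */
static bool keep_triangle(struct vec2 a, struct vec2 b, struct vec2 c)
{
    return !is_back_face(a, b, c) && !misses_all_samples(a, b, c);
}

Per the announcement, the actual patch runs this kind of test on async compute before the vertex shader; the snippet above is just the math.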

So is this possible to do with actual games?
Why hasn't AMD just done this on Windows, then?

CULLING is finally HERE?

Does this imply better performance on linux than windows?

sounds like he hasn't done a lot of testing with it

Attached: ss+(2019-02-14+at+12.43.35).png (613x627, 37K)

>Eventually we might also enable this on consumer graphics cards for those
>games that benefit. It might decrease performance if there is not enough
>invisible geometry.

That benchmark is cherry-picked to make a point. In games it'll make little difference, and in decently optimized games, none. (Well-coded games hide geometry that is out of view.)

>Well-coded games hide geometry that is out of view

I would like to let you know that lots of modern games are very poorly made, especially the FLAVOUR OF THE MONTH BATTLE ROYALES

Those often use standard engines though (Unreal)... and they're not benchmarked for reviews either.

When you've got basic Fortnite-level graphics, spending 10% of your budget on culling unseen geometry is not financially viable.

>loonix is suddenly better at games performance than windows
REEEEEEEE

>decently geometry culling optimized games
There are none

massive if verified
keep me posted

That wouldn't be enough to turn the tide, but it's a big win nevertheless.

anyone itt tried it yet?

It has been for years now. Problem is, nobody develops games for Linux.

>Jow Forums
>actually using linux

huh? i'm cloning it right now, i just have slow internet

But the patch looks to be driver-level, so if it actually works as theorized, it would work even on older games, automagically.

Windows compatibility when

When AMD decides to publish it.

Right after Bloodborne gets released on PC.

Various types of geometry culling are bog-standard features in a modern game engine. There may be some hand-tuning required, but usually there's some form of geometry culling going on already.

>3x the speed of a 1080 ti
wew lad
also, fury above vega, what?

Above the Vega 56, yes. More shaders/CUs.

bumping for any game benchmarks

this

tried a couple games, and i couldn't measure a difference
i'd love to see if it makes a difference in solidworks... but it doesn't have a linux version

what games did you try

OpenGL games?

Unigine Heaven (not a game, but eh)
Left 4 Dead 2
7 Days to Die

yes
i'd try things in Wine, but i can't be bothered building the 32-bit version

Crysis 2 would be super interesting since Nvidia paid the developers to put huge swaths of tessellated water under each map.

yea, well i don't have many modern games, and my internet connection isn't fast enough to grab one while this thread is alive
i'll upload a binary Arch package of it in a moment if someone else wants to test it

How does this amdtard myth persist in 2019? That water is just how Crytek's level creation tool works out of the box: the default terrain is a small island surrounded by water, and terrain is controlled by a height map that you simply raise above the water. The only way to even see it in Crysis 2 is to enable a debug render mode, so it's not even clear whether the GPU is given any instructions to process those vertices under normal circumstances.

More importantly, Nvidia didn't do jack shit for the base game. They sponsored the DX11 patch that came months later; the water was always there to begin with.

They advertised this a year ago when Vega came out. What took them so long?

>How does this amdtard myth persist in 2019?
Because turning down tessellation in AMD drivers will literally double your framerate in this game?
>More importantly, Nvidia didn't do jack shit for the base game. They sponsored the DX11 patch that came months later; the water was always there to begin with
Geez, I wonder why an Nvidia-sponsored patch would completely overuse features that bring no visual benefit but cause performance penalties on the competition.

The fact that Nvidia fanboys still defend this shit makes me sick.

Attached: barrier-dx11-mesh[1].jpg (620x600, 69K)

>implying it was just the water

Attached: crysis2-barrier-mesh.jpg (620x800, 352K)

Tessellation is already driver-optimized, you retard, and AMD had the ability to manually set tessellation levels in their drivers by the time the patch came out. Maybe they should have done a better job optimizing tessellation like Nvidia did, or made hardware that actually works.

>muh immersion barriers
fuck off nvidiot shill

Imagine if AMD helped with a title that swapped textures around and required a minimum of 800 GB/s of bandwidth for no reason at all other than to say their hardware is better than Nvidia's.

>Geez, I wonder why an Nvidia-sponsored patch would completely overuse features that bring no visual benefit but cause performance penalties on the competition

Nvidia-sponsored and AMD-approved, lmao. For a company that was riding the coattails of Nvidia's investments, it sure is funny that their fanboys throw a hissy fit when it's revealed how shit GCN always was. AMD didn't lift a finger other than to say the DX11 patch was a good thing, right up until they had an opportunity to jump on the outrage bandwagon; meanwhile, without Nvidia the patch wouldn't have existed at all. The biggest example of a freeloading rat.

They would still lose, you retard. Even Pascal had much higher effective bandwidth than Vega, despite Vega's HBM2, because AMD's memory controllers have historically been shit. The VII would get its asshole caved in by the 2080 Ti.

>And in decently optimized games, none. (Well-coded games hide geometry that is out of view.)
>well-optimized games carry out a function that AMD had to create custom hardware and custom drivers to actually do
Son, are you fucking retarded?

my.mixtape.moe/xujcaa.tar

It's a fact that while Nvidia and AMD can both do tessellation at x64, in reality, due to AMD's old CEO and his stupidity in not agreeing to do culling, AMD simply had to brute-force their way through.
Nvidia, on the other hand, with their culling, it's almost as if the game runs at x32.

0% performance gain in games. AMDtards BTFO.

0% performance gains in windows games :^)

>well-optimized games carry out a function that AMD had to create custom hardware and custom drivers to actually do
It's not fucking rocket science. It just takes some work. developer.valvesoftware.com/wiki/Visibility_optimization
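It really is basic. As an example, here's a minimal C sketch of bounding-sphere frustum culling, about the simplest way to skip drawing geometry that is out of view. (The Valve wiki page is mostly about Source's precomputed visibility, which is a step up from this; the sketch and its names are mine, not from any engine.)

/* Illustration only: skip drawing any object whose bounding sphere lies
 * entirely outside one of the six view-frustum planes. Plane normals
 * point into the frustum. */
#include <stdbool.h>

struct plane  { float nx, ny, nz, d; };   /* n.p + d >= 0 means inside */
struct sphere { float x, y, z, radius; };

static bool sphere_visible(const struct plane frustum[6], struct sphere s)
{
    for (int i = 0; i < 6; i++) {
        float dist = frustum[i].nx * s.x + frustum[i].ny * s.y +
                     frustum[i].nz * s.z + frustum[i].d;
        if (dist < -s.radius)
            return false;   /* completely behind this plane: cull it */
    }
    return true;            /* potentially visible: submit the draw call */
}

Engines layer fancier stuff on top of this (precomputed visibility, portals, occlusion culling), but the basic idea is that simple.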

yeah but plenty of games don't do it, esp. lazy ones made in game editors

>shit devs are holding back games from running as well as they could because they don't know how to code
I miss Carmack

Wintoddlers BTFO.

NGREEDIAFAGS BTFO AHAHAHAHAHAHA WILL THEY EVER RECOVER

>primitive cucking is enabled on linux
It's been a while

Except the game engines already do it, you tard. Notice how the damn feature wasn't even tested in games yet, and you retards are already making shit up.

THE WAIT IS ALMOST OVER