AV1 DECODER DAV1D 0.2 IS OUT! IT JUST KEEPS GETTING FASTER!!

>dav1d is really ready for production,
>dav1d has impressive benchmarks on ARM devices,
>dav1d is now fast on 32-bit desktop processors (SSSE3).

jbkempf.com/blog/post/2019/dav1d-shifts-up-a-gear-0.2-is-out!

Attached: 1200px-AV1-logo.png (1200x554, 56K)


> (((david)))

>t. Goliath

kek

>jewish fairy tales

Attached: it-was-real-in-my-mind.jpg (480x360, 12K)

nice

How long before you can play 1080p 30 fps on a Broadwell or newer laptop without dropping frames or the laptop going into housefire mode?

>(((HEVC)))

>How long before you can play 1080p 30 fps on a Broadwell or newer laptop without dropping frames or the laptop going into housefire mode?
That was possible with dav1d 0.1 you pleb.

I do that on my laptop with only 5% CPU on 0.2.1

What about encoding though? As long as this isn't an encoding option for streamers, nobody gives a shit. I don't see AV1 support in any GPU encoders either.

I just googled and found this. This has to be a joke?

reddit.com/r/AV1/comments/ah6x11/3_weeks_to_encode_a_short_5min_ultrahd_4k_video/

>What about encoding though?
It will be at x265 speed or better at some point. It just needs time to mature. Stop being one of those retards that thinks that just because it was slow last year means it's going to be slow forever. They already made a huge speed increase a month ago, and they have AMD, ARM, Intel, and Nvidia backing the codec as we speak.

streamingmedia.com/Articles/ReadArticle.aspx?ArticleID=130284

So yeah, quit being a fucking retard on Jow Forums

remind me again when it's faster than HEVC Nvenc please

>at some point
into the trash

>Stop being one of those retards that thinks that just because it was slow last year means it's going to be slow forever.
Yeah, just like how Google spent all that time improving the vp9 encoder instead of just throwing massive amounts of datacenters at it.

zoomer retard who can't remember how slow x264 was when it came out

This is more than just google now and they gave up on VP9 to do VP10 which became part of AV1.

You people are morons.

And I'm sure they won't give up on AV1 to make AV2.

>imagine being an AV1 shill

>let me just encode a 5min 4K video single-threaded with the reference encoder 6 months after the bitstream was frozen
Yes, it is.

And hell, I'm absolutely certain they wouldn't drop support for "legacy" codecs like H264 and VP8 in Chrome so they can restrict media on the internet to behemoths with enough CPU time to throw at encoding.

VP9 only had one FOSS encoder until recently. We already have three for AV1 now: aomenc, rav1e, and SVT-AV1.

listen. All I know is my Turing GPU can encode H.264 and HEVC quick af and the latest gen of noovidia's encoder is actually on par with CPU fast presets. Would I welcome AV1? Yes. Does my GPU support it? No.

I was worried about AV1, but after testing YouTube videos on my T410 with the Intel GPU I didn't notice any issues with 1080p videos.

Attached: 1549704717051.jpg (2900x4095, 782K)

Well, we can agree then that it's too soon to switch to AV1 encoding as a consumer. Someone should've told the guy who spent 3 weeks encoding one video.

>I'm absolutely certain they wouldn't drop support for "legacy" codecs like H264 and VP8 in Chrome
Nobody says they would. It's always going to be supported until every piece of hardware has AV1 support and it becomes the most used codec like h264 is right now.

There's no reason to dump AV1 until it has exhausted all potential. The goal is to make all video encoding open source. AV2 will come when AV1 reaches its limit.

>Nobody says they would.
Why wouldn't Google want to strengthen their online near-monopoly on video content by killing everything except YouTube?
I'll believe your bullshit about AV1 when I see it.

>another retard who thinks AV1 will always be slow and that development doesn't improve speeds

Just look up the history of x264 and x265. Same shit except this is the only codec that is based on a fully open source standard so it gets way more attention from Jow Forums

>imagine being this retarded
You don't get my point now, do you? What the fuck am I supposed to do with your shilling information if I can't encode to AV1 yet?

>Why wouldn't Google want to strengthen their online near-monopoly on video content by killing everything except YouTube?
They already have a monopoly and they can't monopolize encoding. Idiot.

Go make some shitty x265 encodes, pajeet.

>What the fuck am I supposed to do with your shilling information if I can't encode to AV1 yet?
Nobody is saying you can encode to AV1. You have to wait for the encoders to mature. How old are you? Let me guess, too young to remember how slow current encoders were.

Are you seriously telling me I can stream to AV1?

What does this even mean?

>my Turing GPU can encode H.264 and HEVC quick af
>hardware encoding
if you don't give a flying fuck about quality or efficiency, why are you even in an AV1 thread?

Depends on the resolution. With my Ryzen 7 2700x and SVT-AV1's currently fastest speed setting, I'm just shy of realtime encoding for 720p (avg. 21 fps).

AV1 = AVI

Think about it.

How much better would dedicated hardware be compared to improved software?

stop. Not funny; not interesting.

What's the difference between them?

Is there any chance of an NVENC version of an AV1 encoder?
If it's a smaller filesize than HEVC, I would love to re-encode the footage I post on YouTube into that format instead (if it saves me more time uploading vs re-encoding, that is).
The source I'm using is 4:4:4 lossless H.264, so it's as good a source to re-encode from as it gets.

aomenc is the reference encoder provided with libaom. They do work on improving its performance, but pure encoding speed will never be its strength. Right now it provides the best possible quality though.
rav1e is Mozilla's AV1 encoder written in Rust. They achieved 5 fps for 480p around last October or November (I think). Until recently it was the fastest AV1 encoder, but at the same encoding speed it produces slightly worse quality than aomenc (at least it did last time I tested it, at the end of November).
SVT-AV1 is one of the new Intel encoders (next to SVT-HEVC and SVT-VP9) and currently the fastest AV1 encoder. Very interesting and active project, but the devs also have a lot of work ahead of them. It's basically in alpha and still very unstable. Interestingly enough, they already provide an ffmpeg plugin, unlike rav1e.
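If you want a feel for how much the speed presets matter, here's a minimal sketch that times ffmpeg's libaom-av1 wrapper at a few --cpu-used values. It assumes an ffmpeg build with libaom enabled and a short test clip named clip.y4m (both are assumptions, not anything from the projects above); older ffmpeg builds also gate AV1 behind -strict experimental. I'm using the libaom wrapper rather than the SVT-AV1 plugin only because those are the flags I'm sure of.

```python
#!/usr/bin/env python3
# Rough sketch: time ffmpeg's libaom-av1 at a few speed presets to see how
# much --cpu-used matters. Assumes ffmpeg with libaom enabled and a short
# test clip "clip.y4m" (hypothetical name); adjust to taste.
import subprocess
import time

CLIP = "clip.y4m"  # hypothetical test clip

for cpu_used in (2, 4, 8):
    cmd = [
        "ffmpeg", "-y", "-i", CLIP,
        "-c:v", "libaom-av1",
        "-crf", "30", "-b:v", "0",      # constant-quality mode
        "-cpu-used", str(cpu_used),      # libaom speed preset (higher = faster)
        "-strict", "experimental",       # older ffmpeg builds gate AV1 behind this
        f"aom_cpu{cpu_used}.mkv",
    ]
    start = time.time()
    subprocess.run(cmd, check=True, capture_output=True)
    print(f"cpu-used={cpu_used}: {time.time() - start:.1f} s")
```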

Cool, now what about a fast encoder so we get something to decode?

Decoders are one thing.
Encoders, though...

Both are improving, user

don't care until anime is encoded in av1

Why do you people make a big deal out of decoders? I recall there was a huge hooray over FFVP9 too.
Decoders are a boring, mandatory thing, like having an AC-to-PSU cable for your PC.

Wake me up when AV1 has usable encoders (it doesn't) and can actually be used for anything. Freetards give these things way too much slack just because muh ope fight the eval powah delusions.
I want real improvements in quality and compression, no "good enough already" "but its open" "people can't afford proprietary codecs" BS.

Attached: 0012.png (671x489, 92K)

update the Fedora package to 0.2 reeeeee

None of them is ready or good for anything serious.
Good encoders don't happen in half a year; you need 3-4 years. Also, for libvpx (so libaom too, I fear), substitute 3-4 with never. VP8 and VP9 encoding in Google's lib never became good.

>so it gets way more attention from Jow Forums
more like, way more hype and undeserved goodwill

>Decoders are a boring, mandatory thing
The existence of one is mandatory, the existence of a properly optimized one isn't. Sure, hardware decoding is ultimately more important for mobile devices, but even on desktops you'll notice a big difference between decoding with libaom and dav1d.
>Freetards give these things way too much slack just because muh ope fight the eval powah delusions.
True, but it's also illusory to believe any new codec will be good 9 months after its bitstream got frozen.
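If you want to see the libaom vs dav1d gap yourself, a null decode through ffmpeg is the quickest way. Rough sketch below, assuming an ffmpeg build with both --enable-libaom and --enable-libdav1d and some AV1 sample named video.mkv (hypothetical name); the -benchmark flag also prints CPU time and peak memory in the log.

```python
#!/usr/bin/env python3
# Rough sketch: compare wall-clock decode time of libaom vs dav1d on the same
# AV1 file by doing a null decode through ffmpeg. Assumes ffmpeg built with
# both --enable-libaom and --enable-libdav1d.
import subprocess
import time

SAMPLE = "video.mkv"  # hypothetical AV1 sample

for decoder in ("libaom-av1", "libdav1d"):
    start = time.time()
    subprocess.run(
        ["ffmpeg", "-benchmark", "-c:v", decoder, "-i", SAMPLE, "-f", "null", "-"],
        check=True, capture_output=True,
    )
    print(f"{decoder}: {time.time() - start:.1f} s")
```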

I'm not denying that it still requires a lot of work. However, nobody really gave a shit about VP8 and VP9 which is why we only got libvpx and EVE (if you didn't mind paying for a fucking VP9 encoder) for the last few years. The existence of several FOSS alternatives is a good first sign.

Threadly reminder that AV1's handling of analog noise is defective by design and that it will never be useful for ripping film/tape/etc.-sourced videos.

>The existence of one is mandatory
Yeah, it's homework, not some grand victory.
FFmpeg's HEVC software decoder never really got stellar and we live with it; and though libaom/libvpx was quite bad with its lack of threading, it would still somehow work.

Here, though, it's also the lack of much content besides YouTube demos, plus the lack of mature encoders. We pretty much have 2 years before AV1 playback support becomes really needed, so it doesn't really matter whether we get a decently fast decoder in early 2019, late 2019, or 2020.

wow, that's a lot of edit fuckups and typos. Sorry. Next time I won't post while working and eating.

Where they denoise, compress, and apply algorithmic noise after the fact?

Doesn't sound too dumb to me actually, as long as the denoising filter used isn't too strong.

>as long as the denoising filter used isn't too strong
Yeah that is going to be a huge problem, denoisers will eat detail if you want them to subtract all the noise.

The second problem is to actually parametrise the noise you remove like this. It hasn't been solved and the spec doesn't give any hints. When they added the feature to the spec, it was tested manually with noise parameters injected by hand. And I doubt they really managed to get correct similarity anyway.

It's a potentially good thing for lower-bitrate encoding but fraught with problems. After all, AVC and HEVC also had this but nobody managed to use it.
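For anyone wondering what "parametrise the noise" even means in practice: the rough idea is to denoise, look at what was removed, and describe it with a handful of numbers the decoder can use to synthesize similar grain. Below is a toy numpy/scipy sketch of just the first part, estimating noise strength as a function of brightness, which is roughly the role of the grain scaling function; the actual AV1 model also fits autoregressive coefficients for the grain pattern, which this skips entirely, so treat it as an illustration, not the spec.

```python
#!/usr/bin/env python3
# Toy sketch (not the actual AV1 grain model): illustrate the "denoise, then
# parametrise what you removed" idea. We estimate noise strength per
# brightness bucket; the real spec also fits AR coefficients for the grain
# pattern, which this skips.
import numpy as np
from scipy.ndimage import gaussian_filter

def estimate_grain_scaling(frame: np.ndarray, n_bins: int = 16):
    """frame: 2-D float array in [0, 255]. Returns (bin_centers, noise_sigma)."""
    denoised = gaussian_filter(frame, sigma=1.5)   # stand-in denoiser
    noise = frame - denoised                        # what the encoder would drop
    bins = np.linspace(0, 255, n_bins + 1)
    centers, sigmas = [], []
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (denoised >= lo) & (denoised < hi)
        if mask.sum() > 100:                        # skip nearly-empty bins
            centers.append((lo + hi) / 2)
            sigmas.append(noise[mask].std())
    return np.array(centers), np.array(sigmas)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    clean = np.tile(np.linspace(16, 235, 640), (360, 1))        # synthetic ramp
    noisy = clean + rng.normal(0, 4, clean.shape) * (clean / 255 + 0.5)
    centers, sigmas = estimate_grain_scaling(noisy)
    for c, s in zip(centers, sigmas):
        print(f"luma ~{c:5.1f}: grain sigma ~{s:.2f}")
```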

It sure isn't google-optimal. I recall it wasn't that easy to google up whether there was any news about it, back then.

Incidentally, how does it look with the compatibility-breaking AV1.1.0 revision, does anybody know?

Attached: sophie questions.jpg (401x500, 33K)

>the other is called rav1e
Ok

I just find it retarded that we have three major encoder projects, dav1d, rav1e and libaom, when in reality we'll be using just one.

Dav1d is just a decoder (its author will make a closed, non-public encoder for Netflix and such pigs), Google's libs were always crap encoders, and libaom is built on that exact epic-cruft codebase that's completely unsalvageable.

So Rav1e is pretty much the only hope. Success isn't guaranteed either; it seems the people mostly have either FFmpeg or Theora experience, so it's more like their first adult encoder.

They also complicated their work by using Rust instead of C or C++.

Ironically Rust was not used for the decoder where it would make more sense, since it would be prone to malformed-file/stream attacks.

Attached: 1483798966731.jpg (569x715, 242K)

So we need an autistic weeb like x264 had to make things work?

Possibly. But x264 (AVC) also had fewer tools to tune and rein in, so it might be a much tougher task to make AV1 encode with great visual quality (no smoothing, blocking/banding etc).

So when can we expect hardware support for AV1?
I'm tempted just to use HEVC because it has hardware encoding support now.

2020

What on(in)?

>blocking/banding
Isn't this "fixed" by using 10-bit and higher bitrate?
I thought this was the real reason why Main10 is the industry standard (for HEVC), and "1 billion colors" was just a marketing meme because the crowd in general doesn't want to know the technical details.

HEVC also has a usable encoder; there is no reason to use anything other than that and x264 until AV1 encoders are actually good.

It is somewhat mitigated, but only to a degree, when your source is 8-bit.
And even if you won't actually see the ugly tone borders, there is still the underlying problem, which is detail and texture smoothing and removal.
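To see why the extra bits only help to a degree, here's a tiny sketch: quantise a very gentle luma ramp to 8 and 10 bits and count how many distinct steps survive. More steps means smaller jumps between bands. This only models source precision, not what the encoder's quantiser or any smoothing does afterwards.

```python
#!/usr/bin/env python3
# Quick sketch of why bit depth matters for banding: quantise a slow, smooth
# luma ramp to 8 and 10 bits and count how many distinct levels survive.
import numpy as np

ramp = np.linspace(0.40, 0.45, 1920)  # very gentle gradient across a 1920-px row

for bits in (8, 10):
    levels = (1 << bits) - 1
    quantised = np.round(ramp * levels)
    steps = len(np.unique(quantised))
    print(f"{bits}-bit: {steps} distinct levels across the ramp "
          f"(step size ~{1 / levels:.5f} of full range)")
```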

>So when can we expect hardware support for AV1?
Hardware manufacturers that are AOM members, like AMD and NVIDIA, committed to releasing the first commercially available hardware-decode-capable devices within 1 year of the official release. I bet AMD is going to do it with Navi very soon, though it may or may not be in its entire lineup.
I actually don't know how things work for ARM since they don't ship products themselves, but it should be in the Snapdragon 865 next year, as the 855 doesn't come with it. Android Q already has the support, so that much is confirmed. This is why I think it's probably going to be widely available in less than 5 years.
>first year - high end
>2-3 years - mid-range
>4-5 years - everyone who buys mid-range should've replaced their devices by then

Twitch.tv chose vp9 for real-time encoding over x265.

>there is still the underlying problem, which is detail and texture smoothing and removal.
Oh, now I understand what you said by smoothing.
But I think they're going to ignore this for quite some time. The major players get to "rule" here, and their audience mostly consumes on smartphones and TVs: the former have high-DPI screens that can "hide" fine grain detail, and modern TVs all apply a noise/grain-reduction filter by default. That makes me think they won't bother with this too much, since the final image people actually see is "fixed" in the end anyway.

Were? Every stream I watch is h264.

>Hardware manufacturers that are AOM members, like AMD and NVIDIA, committed to releasing the first commercially available hardware-decode-capable devices within 1 year of the official release.
I have no idea where you got that from. Google's roadmap shows H2 2020 for first silicon - and those will be Android ARM processors.

Also, that was before this SNAFU with AV1.1.0 happened. It's quite possible this revision pushed implementations farther into the future.

stream.twitch.tv/encoding/
only 264 here

I hear Twitch uses FPGAs or something. It screamed batshit crazy idea that will simply crash and be silently buried at some point; who knows if it even works.

Oh, yeah.
blog.twitch.tv/how-does-vp9-deliver-value-for-twitchs-esports-live-streaming-35db26f6322f

Attached: Screenshot_2019-03-14 How VP9 delivers value for Twitch’s esports live streaming.png (638x400, 24K)

blog.twitch.tv/how-does-vp9-deliver-value-for-twitchs-esports-live-streaming-35db26f6322f

I've heard some Japanese company even sells FPGA encoders for AV1, of all things.

startups peddle a lot of snake oil, and not just startups

It really is hard for me to believe that people don't understand this. Even if you say you're not familiar with video codecs specifically, that doesn't help you, since fucking everything to do with encoding and decoding in software works this way:
>hashing something
>encrypting it
>compressing it
>on image, audio, and generic data
Until people figure out the optimal paths and then go to extremes hand-rewriting parts in ASM for every platform, that's just how it is.

Speed isn't your main problem. Getting the encoder to make the right decisions to give good quality, that is the HARD part. And that is where the 3-4 years that are needed come from.

>based and weeb-pilled

>still using 264 for encoding like a bunch of luddites

I'd say luddites encode in inferior encoders that give shit quality but "it's completely libre and unencumbered!!!one"
x264 still beats AV1 encoders if you go for high quality.

>Why do you people make a big deal out of decoders? I recall there was a huge hooray over FFVP9 too.
Fast decoders mean higher resolutions becoming playable, lower power consumption, lower CPU usage, etc.

Having to max all your cores to play a 1080p or 4k video on a desktop is not nice, which is still the case with libaom but not with dav1d.

>So Rav1e is pretty much the only hope. Success isn't guaranteed either; it seems the people mostly have either FFmpeg or Theora experience
It's mostly Xiph, with one or two ffmpeg/libav devs. So basically, Theora/Daala experience.

You think companies like Netflix, Twitch and Amazon chose vp9/av1 over proprietary codecs because they believe in muh free software ideology?

>remind me when software is faster than dedicated hardware
Are you fucking retarded?

Legit question: is there a single firsthand content publisher (not release / re-release group) that uses AV1? Or HEVC for that matter?

Attached: hevc_logo-1ea277eafbc26d1976d5724eb7ab1e51fe1fdcd4060578a88be010885dc6ef64.png (300x191, 4K)

They want to save a pretty penny. For Google, it's also about NIH, lock-in, and trying to control most of the internet.

my comment was not about megacorps but about end users that like to fanboy over free codecs, despite having no benefits from them

The main problem with your argument is that those companies don't give a damn about quality or user value. That's why Google was able to field VP9 right after it was finished despite the gross video quality the encoder produced. Or despite users having trouble playing it due to their shit non-multithreading decoder.

Youtube does experimental AV1 encodes on some videos, does that count? 4K blu ray uses HEVC, also.

How would you explain that wmaf (which wasn't made by google) shows that in tests vp9 and av1 are on par or better than their counterparts? Maybe encoding is slower but quality-wise they're pretty decent.
As for the average user, I agree that caring about encoders is highly autistic but it's important to have a free alternative. Monopolies tend to be bad for everyone. Besides, it's not like Youtube is forcing you to use vp9 anywhere else. You can even watch it in 264 if you really want.

2021

I see Amazon webrips that use HEVC sometimes.

264 just werks, senpai. Barely any anime has UHD Blu-ray sources, and if all else fails just get a remux or jack up the bitrate (10-20 Mb/s) until there's an imperceptible difference between it and the newer codecs.

next year

>How would you explain that wmaf (which wasn't made by google) shows that in tests vp9 and av1 are on par or better than their counterparts?
Because such benchmarks usually compare codecs for low to moderate bitrates. That's where each new codec beats the old ones. High bitrates or even lossless compression rarely get tested.
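If you want to check that yourself instead of trusting published graphs, something like the sketch below works: encode the same reference at several bitrates, including a high rung, with x264 and libaom, then score each encode with ffmpeg's libvmaf filter. It assumes an ffmpeg build with libx264, libaom, and libvmaf, and a reference clip named ref.y4m (hypothetical); the exact "VMAF score" log line can also differ between ffmpeg/libvmaf versions, so the parsing is best-effort.

```python
#!/usr/bin/env python3
# Rough sketch of testing codecs across a whole bitrate ladder instead of only
# the low end. The VMAF score is printed in ffmpeg's log; this just surfaces
# the relevant line.
import subprocess

REF = "ref.y4m"  # hypothetical reference clip

def encode(codec: str, kbps: int, out: str) -> None:
    subprocess.run(
        ["ffmpeg", "-y", "-i", REF, "-c:v", codec,
         "-b:v", f"{kbps}k", "-strict", "experimental", out],
        check=True, capture_output=True)

def vmaf(distorted: str) -> str:
    # first input = distorted, second input = reference; both must share
    # resolution and frame rate for the comparison to be meaningful
    proc = subprocess.run(
        ["ffmpeg", "-i", distorted, "-i", REF, "-lavfi", "libvmaf", "-f", "null", "-"],
        capture_output=True, text=True)
    return next((l for l in proc.stderr.splitlines() if "VMAF score" in l),
                "no score line found")

for kbps in (1000, 4000, 14000):          # include a high rung, not just the low end
    for codec in ("libx264", "libaom-av1"):
        out = f"{codec}_{kbps}k.mkv"
        encode(codec, kbps, out)
        print(f"{codec} @ {kbps} kb/s -> {vmaf(out)}")
```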

this

>try latest vlc
>using that one anime AV1 encode on nyaa (F/SN Heaven's Feel 1080p 10-bit)
>still eats complete ass on the scene at 25:00
>aom is fine

VMAF you mean? It's just a metric, not visual testing. And metrics usually mean crap.

Note that at least AV1 and possibly VP9 have been tuned to give better VMAF numbers; x265 wasn't and x264 definitely wasn't, since VMAF didn't exist in its active development era.

I don't know what exactly they want to achieve with VMAF, but it didn't result in the encoders being able to produce transparent quality at higher bitrates. Seems it's as always: the new format only has a window of opportunity at shit and low bitrates where everybody suffers, and newer formats have some edge there because older ones fall apart sooner. Once you give enough bits, it's been shown that the new encoders can't use them effectively and produce subpar quality.

For example: forum.doom9.org/showpost.php?p=1868150&postcount=1531
That's AV1 losing to x264 completely at its slowest setting, at an uber bitrate of 14 megabits for not even 1080p resolution.
Not the fault of the format, the fault of shitty encoders. VMAF and friends tell you that the format and/or encoders are oh so superior. BS. The format might be, but you gotta wait 3 years and see if they manage to do anything useful with the tools.

No

fpgas make sense though because you're just converting the algorithm to logic gates. It'd be faster than any homo SSE vector optimizations you could make, and it's not like ASICs that become useless one-offs because the logic gates are burned in and can never be updated to use better encoders for better formats

>Isn't this "fixed" by using 10-bit and higher bitrate?
lolno, you actually believed that shit? banding can only be fixed by using debanding filters, which also kill details. the hi10p meme is the homeopathy of video encoding.

>But I think they're going to ignore this for quite some time. The major players get to "rule" here, and their audience mostly consumes on smartphones and TVs: the former have high-DPI screens that can "hide" fine grain detail, and modern TVs all apply a noise/grain-reduction filter by default. That makes me think they won't bother with this too much, since the final image people actually see is "fixed" in the end anyway.
Could you rewrite this post in english?