Dav1d 0.3.0 is out and it's faster on every platform! Fuck yeah AV1!

>The open-source AV1 decoder dav1d was updated yesterday to version 0.3.0. With the third release, new assembly code provides some serious performance gains on both the PC and mobile platforms.

>On the x86 side, this release mostly improves the SSSE3 performance of dav1d. Xuefeng Jiang contributed chroma-from-luma prediction and Paeth intra prediction functions, delivering 0.8% and 0.4% improved global performance.

>Liwei Wang continued his work on inverse transforms with larger 8x32, 32x16, 32x32 and up to 64x64 blocks, providing the largest speedup of this release, well over 10% on some videos.

>dav1d 0.3.0 also introduces the first SSE4.1 assembly. In most cases the added SSE4.1 instructions aren't useful on top of SSSE3, but Victorien Le Couviour--Tuffet found a use case where they are. He optimized the CDEF filter, resulting in a 1.15x speedup at the module level and around 1.5% overall.

>Meanwhile Henrik Gramner wrote some very clever SSE2 code to speed up entropy decoding/bitstream reading, which had started to eat up a large proportion of decode time, especially on AVX2. The assembly code resulted in a speedup on all 64-bit x86 platforms, measured at around 4% for AVX2 and 2% for SSSE3 and SSE4.1.

>Overall these commits make dav1d 0.3.0 around 24% faster on SSSE3, 26% faster on SSE4.1 and 4% faster on AVX2 CPUs.

>While single-threaded aomdec is still quite strong, with multiple threads dav1d 0.3.0 is making libaom an even smaller spot in the rear-view mirror.

>Martin Storsjö delivered two very nice commits speeding up the loopfilter and self-guided loop restoration with NEON assembly code. Both functions were sped up by about 3x, resulting in performance gains anywhere from 7% to 36%. Not only does this allow for higher resolutions, frame rates and bitrates, it also brings down power consumption on identical content.
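
For anyone curious what "using dav1d" actually looks like from an application's side, here is a minimal sketch of the decode loop against dav1d's C API. It's illustrative only: get_next_obu_chunk() is a hypothetical stand-in for a demuxer (which dav1d doesn't provide), and settings/field details can differ between dav1d versions.

    #include <dav1d/dav1d.h>
    #include <errno.h>
    #include <stddef.h>
    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    /* Hypothetical demuxer callback: returns the size of the next chunk of
     * AV1 OBU data (from IVF/Matroska/etc.), or 0 when the stream ends. */
    extern size_t get_next_obu_chunk(const uint8_t **buf);

    int decode_stream(void)
    {
        Dav1dSettings s;
        dav1d_default_settings(&s);   /* defaults; thread-count fields vary by version */

        Dav1dContext *ctx = NULL;
        if (dav1d_open(&ctx, &s) < 0)
            return -1;

        const uint8_t *chunk;
        size_t sz;
        while ((sz = get_next_obu_chunk(&chunk)) > 0) {
            Dav1dData data = { 0 };
            uint8_t *p = dav1d_data_create(&data, sz);   /* allocate a refcounted buffer */
            if (!p)
                break;
            memcpy(p, chunk, sz);

            do {
                int res = dav1d_send_data(ctx, &data);
                if (res < 0 && res != DAV1D_ERR(EAGAIN)) {
                    dav1d_data_unref(&data);
                    dav1d_close(&ctx);
                    return -1;
                }
                /* Drain every picture the decoder has ready. */
                Dav1dPicture pic = { 0 };
                while (dav1d_get_picture(ctx, &pic) == 0) {
                    printf("decoded %dx%d frame\n", pic.p.w, pic.p.h);
                    dav1d_picture_unref(&pic);
                }
            } while (data.sz > 0);   /* on EAGAIN, resend the unconsumed bytes */
        }

        dav1d_close(&ctx);   /* a real player would also drain remaining pictures first */
        return 0;
    }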

medium.com/@ewoutterhoeven/dav1d-0-3-0-sailfish-armed-to-the-teeth-af5bbf845a16

Attached: 1*F9mKLWzXRJ5ct_9AwuJYjA.png (1600x449, 66K)

When do we get hardware codecs? 2021?

They are planned for 2020. Some people even believe at the end of 2019, but that's probably too optimistic.

>needs a data center botnet to encode 1 minute video
into the trash

My guess is Navi 20/Ampere

Check out SVT-AV1. With modern CPUs you can already achieve realtime encoding for 720p footage.

If I remember correctly SVT-AV1 would need between 24-32GB of RAM for encoding

ramlet pls

dav1d is only a decoder; there are currently 3 (public) AV1 encoders:

aom reference encoder (Alliance for Open Media (mainly Google))
rav1e (Xiph.org, but very diverse contributor base)
SVT-AV1 (Intel, with a little Netflix)

All three are under heavy development. aom currently delivers the best quality but is very slow, SVT-AV1 scales very well over many cores and rav1e is a clean Rust implementation that currently delivers nice results on small systems but should also be very scalable in the future.
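
For a feel of what driving one of these encoders looks like in code, here's a rough sketch against libaom's C encoding API (the libvpx-style interface aomenc itself is built on). It's a simplified illustration under assumptions, not the reference tool: fill_frame() is a hypothetical callback that loads raw I420 data, the config values are just examples, and container writing (IVF/WebM) is left out.

    #include <aom/aom_encoder.h>
    #include <aom/aomcx.h>
    #include <stdio.h>

    /* Hypothetical callback: fills img with the next raw I420 frame,
     * returning 0 when there are no more frames. */
    extern int fill_frame(aom_image_t *img, int frame_index);

    int encode_av1(int w, int h, FILE *out)
    {
        aom_codec_iface_t *iface = aom_codec_av1_cx();

        aom_codec_enc_cfg_t cfg;
        aom_codec_enc_config_default(iface, &cfg, 0);   /* 0 = good-quality usage */
        cfg.g_w = w;
        cfg.g_h = h;
        cfg.g_timebase.num = 1;
        cfg.g_timebase.den = 30;          /* 30 fps timebase, example value */
        cfg.rc_target_bitrate = 1000;     /* kbps, example value */

        aom_codec_ctx_t ctx;
        if (aom_codec_enc_init(&ctx, iface, &cfg, 0))
            return -1;
        aom_codec_control(&ctx, AOME_SET_CPUUSED, 4);   /* speed vs. quality knob */

        aom_image_t img;
        aom_img_alloc(&img, AOM_IMG_FMT_I420, w, h, 1);

        int frame = 0, done = 0;
        while (!done) {
            /* Pass NULL once input is exhausted to flush the encoder
             * (a real program keeps flushing until no packets remain). */
            aom_image_t *src = fill_frame(&img, frame) ? &img : NULL;
            done = (src == NULL);
            aom_codec_encode(&ctx, src, frame++, 1, 0);

            aom_codec_iter_t iter = NULL;
            const aom_codec_cx_pkt_t *pkt;
            while ((pkt = aom_codec_get_cx_data(&ctx, &iter)) != NULL) {
                if (pkt->kind == AOM_CODEC_CX_FRAME_PKT)
                    fwrite(pkt->data.frame.buf, 1, pkt->data.frame.sz, out);
            }
        }

        aom_img_free(&img);
        aom_codec_destroy(&ctx);
        return 0;
    }

rav1e and SVT-AV1 each expose their own APIs and command-line tools, but the feed-frames/collect-packets pattern is the same general shape.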

>between 24-32GB of RAM needed

Yeah, so?

Attached: YeahSo.png (1249x866, 42K)

It uses more RAM than other AV1 encoders, and peaks around 24GB do happen, but they are not the rule.
Resolution | 8-vCPU commit size (GB) | 40-vCPU commit size (GB)
-----------|-------------------------|-------------------------
4K         | 14                      | 24
1080p      | 6                       | 10
720p       | 4                       | 7
480p       | 3                       | 5

Yes, I'm aware of this, I did read the docs 'cause you know I'm like intelligent and not a basement feeding troll or something.

I just did a test encode on the same hardware, about 5 fps on average; it went from a 1.44GB YUV to 13.2MB and it looks pretty much the same to my eyes, so what do I care.

In time we'll see better compression and more power to do it, but for now this kind of stuff isn't worth the amount of actual power consumed to do it nor the time for 10 fucking seconds of video.

x264 is still the best fucking compression, seriously, it's perfect, it works, it's established, it's fine.

That's not as bad as I imagined, RAM-wise.

H264 isn't an open/free codec though.

>Windows is still the best fucking OS, seriously, it's perfect, it works, it's established, it's fine.
>iPhone is still the best fucking phone, seriously, it's perfect, it works, it's established, it's fine.
>Facebook is still the best fucking social network, seriously, it's perfect, it works, it's established, it's fine.
>Youtube is still the best fucking video site, seriously, it's perfect, it works, it's established, it's fine.
Buy a Jow Forums pass btw.

>I did read the docs 'cause you know I'm like intelligent and not a basement feeding troll or something.
It does make you more intelligent than 90% of the people who are usually in these threads.
>x264 is still the best fucking compression, seriously, it's perfect, it works, it's established, it's fine.
True. I doubt any encoder will manage to come close to it, especially with the current trend to focus more on metrics than perceived quality. I just hope we'll get at least one AV1 encoder that focuses on psychovisual tuning.

Only hardcore freetards and companies care about whether a standard is free or patent encumbered.

>Only hardcore freetards and companies care about whether a standard is free or patent encumbered.
Everybody should care. The only people who profit from this shit are the MPEGLA.

x265 masterrace

fuck freetards

HEVC, user. You're talking about HEVC.
x265 is a FOSS HEVC encoder.

Reminder that major companies want AV1 to succeed

engadget.com/2018/03/28/google-apple-intel-av1-netflix-amazon/

Attached: AOM.png (854x8000, 3.47M)

This is one of those rare circumstances where the interests of large companies actually align with the best thing for consumers.

>Victorien LE COUVIOUR--TUFFET
Lmao, what a name this guy has

Attached: 1263232606995_Reaction_faces_part_1s350x345151481580.jpg (350x345, 33K)

>(((streaming))) media
>still looks worse than a Blu-ray
The resolution war is dead. 4K media adoption is sluggish and most media will never be in 4K. HDR only matters if it's applied during the initial creation, and the vast majority of media will never get a proper HDR grading.

Oh yeah? Technology will never improve, huh? Ok, user.

first time i've heard about this, im a brainlet in video codecs

ok reading the thread, seems like it achieves better results than webm at the cost of needing more processing power

what else am i missing? what did i misunderstand? how does it benchmark against webm in quality?

Attached: 1518358490219.jpg (699x485, 31K)

Video codec technology over the past nearly two decades has been about improving perceived quality for consumer media consumption while continuously dropping the bandwidth requirements. Consumer audio-visual recording technology like smartphone cameras has focused on improvements in post-processing while continuously reducing the size of the sensor or lens. There haven't really been many quality improvements at the source; digital cameras are just now approaching the fidelity of 35mm and 70mm film stock. The way media is recorded hasn't resulted in much additional quality; we can just cram more of that quality into less space more intelligently at the expense of more and more processing power, and we're clearly hitting the point of diminishing returns, with untouched Blu-ray on H.264 still the definitive source for 1080p video (by all metrics) even with HEVC+ codecs and streaming on the horizon.

Is that right? Well maybe people should just stop developing. We've hit perfection, right?

Nice snark. Video codec development after H.264 has been mostly about reducing the bandwidth and storage requirements of the video, not improving perceived quality at the high-end, which goes hand in hand with content providers (all of which are on the AOM alliance) charging you for renting content to consume over a network instead of selling you a hard copy of it (physical media).

H.264 at Blu-ray bitrates: looks fantastic
HEVC at Blu-ray bitrates: doesn't look any better than H.264
AV1 at Blu-ray bitrates: doesn't look any better than HEVC

H.264 at say 5000 kbps: Looks pretty bad
HEVC at say 5000 kbps: Looks better than H.264
AV1 at say 5000 kbps: Looks better than HEVC, but not by much

H.264 at say 1000 kbps: Looks terrible
HEVC at say 1000 kbps: Looks bad, but better than H.264 definitely
AV1 at say 1000 kbps: Looks decent

It's all about improving the bottom line and reducing bandwidth costs when storage costs are cheaper and cheaper and bandwidth is becoming more and more plentiful.

few but not uncommon

google has people working on the rust project
and on debian maintenance;

it's also a core contributor to the apache project and RISC-V

free and open source software translates to multimillion-dollar cuts in expenses for corporations

Oh, ok. So AV1 is bad for consumers. That sucks.

Bandwidth reductions are good for the consumer too, you know.
Many, many internet plans are still capped on data, which is especially a big deal for mobile.

Cont. Eventually the same thing will happen to video as happened to audio. MP3 was superseded by Vorbis, AAC, and Opus - all of which are much more efficient codecs. However, literally no one cares, since 320 kbps MP3 is trivial to stream and it's indistinguishable from any of its successors. So what if Opus is transparent at 100 kbps instead of MP3 at 320 kbps? Also, hard drives are large enough that storing lossless audio is quite practical, which makes obsessing over improvements in lossy codecs even more pointless, except for companies. Discord and Skype use Opus because it saves on bandwidth costs when facilitating voice chat at huge scale, but literally no consumer gives a fuck about anything other than MP3, or FLAC if they want lossless.

The scale of the problem is so different between video and audio that it's a completely unfair comparison.
We've hit "max" quality for audio ages ago, so the only place to go is smaller.
However, video is still getting bigger and bigger. These kinds of new codecs need to handle 4K very well and even look ahead to 8K and larger. Eventually there will be a "cap" to video resolution and frame rate, where any gains become so inconsequential that there is no point, but we haven't reached it yet; after that, the focus will also shift to making things smaller.

Basically, you have a shitty attitude and seem to think that progress and incremental improvements are bad. You also don't seem to have any perspective other than your own and that of ultra-large corporations.
What about someone doing archival? Being able to store all of their 50TB collection in 40TB is a huge fucking saving.
And by lowering bandwidth requirements, it makes it easier for smaller players to get started in media distribution.

We haven't actually hit peak audio.
Atmos is throwing that out the window.

source

I'm talking about actual encoding. Human hearing is already capped at ~20 kHz, and we can already losslessly capture and reproduce sound up to that, so there are no improvements to be made there.

There are still interesting things you can do with audio and how it's used, and there can still be hardware improvements, but in terms of the "hard mathematics" of it, we've basically achieved perfection.

If you honestly think there's nothing to improve, you're seriously dumb. It's not about being able to do it losslessly; we can always do that. It's about how many bits we need to achieve it, and how lossy we can be while maintaining perceived quality.

Video isn't getting bigger and bigger. Technophiles like to think we need 4K and 8K, but the reality is we already hit diminishing returns hard with 1080p and 4K.

The jump from VHS to DVD was extremely noticeable and consumers welcomed it with open arms, with DVDs quickly outselling VHS.

The jump from DVD to Blu-ray is, imo, quite noticeable, but a significant proportion of the consumer population doesn't agree - DVDs still outsell or barely undersell Blu-rays. At 480p, if you can get around the inherent imperfections of MPEG-2 encoding like macroblocking and dot crawl, the peak MPEG-2 bitrate on disc of around 10000 kbps is actually pretty transparent, just inefficient by today's standards.

The jump from Blu-ray to UHD Blu-ray is barely noticeable compared to the other jumps, with HDR being the only defining factor that makes UHD Blu-ray "pop" compared to Blu-ray. 1080p HDR on Blu-ray disc is intentionally not done because companies know most consumers cannot see the increase in color space from BT.709 to Rec. 2020 or any real benefit in going from Blu-ray to UHD Blu-ray if HDR were backported onto Blu-ray. UHD Blu-ray adoption is basically nonexistent compared to current Blu-ray adoption.

Streaming providers intentionally gimp 1080p video to make their 4K offerings look better. Just look at the bitrate and codec choices streaming providers like YouTube and Netflix use for 1080p vs 4K - 4K uses better codecs and massive bitrates compared to bit-starved 1080p still sitting in that 5000 kbps H.264 range.

I saw a test clip of an HEVC encoded 8K NHK broadcast rip that topped out at 80 Mbps. We get that on UHD Blu-ray already with the same codec with higher peaks (UHD Blu-ray caps at ~120 Mbps). There's no more detail to resolve in the video than what the lens that recorded it can resolve - and state of the art lenses used for film cap at around 5K. 8K is pointless as fuck right now, with even 70mm film resolving less real detail.

>I'm talking about actual encoding. Human hearing is already capped at ~20 kHz, and we can already losslessly capture and reproduce sound up to that, so there are no improvements to be made there.
I don't disagree with that.
But audio encoding has also been stuck on 'channels' for far too long; Atmos is just discrete sounds with a point in 3D space they should be emitting from.
See above?

I don't care. As soon as you start saying it's good just because it's freetard stuff, I tune out. Frankly I know that's all you guys care about, and you usually lie about the rest because that's how you are.

Made by BASED VideoLan.
Gotta love that traffic cone.

Attached: file.png (194x259, 42K)

Cont. The main point is that consumers consume media for the content. There's plenty of media out there that's only available via an LD, DVR recording, VHS or DVD rip that people might find genuinely enjoyable. Quite a lot of it will never be remastered (if it's from film) or cannot feasibly be remastered (if it's shot digitally). Some great content was shot on cheap camcorders, like some HK movies, and looks terrible from a technical perspective. Sometimes the masters are lost and the only copy sitting around is an nth-generation bootleg DVD. While remasters are definitely preferable to the janky sources these types of content currently sit around in, and eventually technology will advance to the point where 8K or VR will dominate, 1080p and 4K and the inferior audio and video codecs they're currently encoded in are more than enough to create content consumers will pay for/enjoy, and increasing the resolution or using a new codec won't suddenly turn a turd into anything more than a polished turd.

Reality is what you're missing. MP4 and other videos are decoded in hardware, not software, and it will be years before that happens for AV1. You're also listening to marketing saying it's better, when it's hand-picked, controlled tests that have no practical application to real-world usage. I say this because, if you remember, there was the same style of hype about WebM's VP8/VP9, showing it was better in their tests. But for every YouTube video I download, the MP4 is always the smaller file size for the same resolution, always.
It could be good once hardware support happens, idk. Just remember it's freetards, and they are not going to tell the truth if something non-freetard is better.

Really, with the list of supporters and how hardware intensive this is, it just looks like another way for google/netflix to shift processing onto consumer devices so they can save money.

YouTube's AVC offerings have lower filesize on purpose. You can only get it if you lack VP9 support, and virtually all desktops and laptops do.

AVC is there for tablets and phones that can't hardware decode VP9; there's no reason for it to be high bitrate. And bitrate is entirely controllable.

Do have it*

bitmovin.com/demos/av1

Attached: AV1.jpg (1080x1752, 431K)

hevc 10bit at blu-ray bitrates means there will be no banding and it is hardware decodable on my ayyymd poolaris.

1080p H.264 already looks fine at Blu-ray bitrates. Banding at 35-45 Mb/s at 1080p is a master problem, not a bitrate, bit depth, or a codec problem. The use of HEVC 10-bit or AVC 10-bit at that bitrate wouldn't make a difference, besides the fact there's no hardware decode support for AVC 10-bit, but that's just because there's no money in it. And if you're going to be using a desktop GPU software decoding 1080p AVC 10-bit is still easier on the CPU than hardware decoding HEVC 10-bit. HEVC is way harder to decode than any form of AVC, and offloading that shit to the GPU means you can't run as many meme shaders on the GPU.

Is VVC dead? I know it's unscientific, but comparing their wikipedia articles makes it seem like there's not much interest surrounding it, despite tests showing it ahead of AV1 by an impressive amount.

Attached: dc567y2-c5236632-994d-471b-9436-754e288f1f13.jpg (400x400, 29K)

Attached: asdsad.gif (360x150, 1.82M)

av1 can achieve same picture quality as hevc or avc.
companies are working on both more efficient encoders and decoders.
it will take a while for av1 to become widespread but some people speculate that av1 hardware decoding is coming in 2020 or so. what this means for you is that you shouldn't buy any new tech until then and be up to date with this if you watch a lot of videos and shit.

>av1 can achieve same picture quality as hevc or avc
while using less bandwidth

>data caps
stop living in a third world shithole

Is the encoder called dav1e?

I guess this is nice, another format added to vlc so it keeps the "just works" thing going.

i only watch chinese cartoons.
bluray is avc 8bit.
so yea.

hevc 10bit works fine with madvr + amd fluid motion video.

I don't have a data cap for my home connection and haven't for a long time. However, it's still prevalent for mobile.

t. NEET

>faster on every platform

"Decode faster" doesn't really mean much to me when everything is the same speed, ie "instant." Video only plays at one speed, unless you want to watch it a 2x or whatever.

>seems like it achieves better results than webm
WebM is a container format. AV1 can be used within WebMs.
>at the cost of needing more processing power
Yes. Every new video standard generation is more complex than the prior one. HEVC is more complex than AVC and AV1 is more complex than HEVC (and VVC/AV2 will be more complex than AV1). Each new generation uses more resources to encode in order to provide a better compression (not necessarily on the whole bitrate spectrum though; newer codecs usually focus on low to moderate bitrates).

VVC is still in development, user. Not much of a point comparing them now.

Decoder doesn't decode fast enough = dropped frames
aomdec drops frames for high resolution footage even on high end hardware.
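
To spell out why "decode speed" matters even though playback speed is fixed: every frame has a presentation deadline, and if the decoder hasn't produced a frame by its deadline, the player drops it to keep audio and video in sync. A generic sketch of that logic (all helper names here are hypothetical, not from any real player):

    #include <stdbool.h>
    #include <stdint.h>

    extern int64_t now_us(void);                /* hypothetical monotonic clock (microseconds) */
    extern bool decode_next_frame(void *dec);   /* hypothetical: decode one frame */
    extern void display_frame(void *dec);       /* hypothetical: present the decoded frame */

    void playback_loop(void *dec, int64_t start_us, int64_t frame_duration_us, int n_frames)
    {
        for (int i = 0; i < n_frames; i++) {
            int64_t deadline = start_us + (int64_t)(i + 1) * frame_duration_us;
            if (!decode_next_frame(dec))
                break;
            if (now_us() <= deadline)
                display_frame(dec);   /* decoded in time: show it */
            /* else: the frame is late, so it gets dropped and playback stutters */
        }
    }

So the faster dav1d gets, the more frames make their deadlines on a given CPU, and the less clock speed (and power) it takes to hit them all.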

Is it worth it?
My nine (9) year old camcorder records HD with MPEG-2 at 50Mbps with 4:2:2 8-bit colorspace and it still looks pretty decent. Lots of cable TV is in MPEG-2 too, at around 15-18 Mbps.

Are you a retard? Decoding faster consumes less power, and while we don't have hardware decoders this is crucial.

I'm good with HEVC, thanks.

Let me guess, it's one of those codecs shilled by anime encoders that you need a watercooled pc to decode?

Well, you guessed wrong.

>it's faster
Don't care. It's still way too slow, slower by orders of magnitude than x264 and even VP9 - which is itself slow compared to x264 and HEVC.

I've tested SVT-AV1 and it will have problems at 1080p with 16 GB RAM on most videos. It's fine if you have 32 GB.

>H264 isn't an open/free
It's not, nor is HEVC. But it works. And you can wait an hour instead of a week for a small videoclip to be encoded.

In all seriousness, I'm all for AV1. It would be great to have a working, usable alternative that's better than VP9, which is the best free codec there is as of right now.

AV1 just isn't there with any encoder, not with dav1d or Intel's. VP9 is usable; it's slower, but not magnitudes slower than HEVC. Improvements of 10% in dav1d may sound great, but if it's an hour vs two days of encoding time to begin with, then taking 10% off that two-day encoding time means nothing.

>I've tested SVT-AV1 and it will have problems at 1080p with 16 GB RAM on most videos.
How many threads did you use? I didn't come close to using 16GB while encoding 1080p with 16 threads.
>AV1 just isn't there with any encoder, not with dav1d or Intel's.
dav1d is a decoder. aomenc (libaom), rav1e and SVT-AV1 are the encoders currently available. More proprietary encoders are supposed to follow.

When a Core2 @ 2ghz can decode 1080p on Kodi is when AV1 will be ready.

>rav1e and SVT-AV1
I'm guessing rav1e was the other I tested. It was extremely slow.

I think I just used 12 threads with SVT-AV1 when I tested it. It did eat over 16 GB easily.

>wen muh neetputer can do latest codec it will b ready

No, it's when my shitty HTPC in the lounge can play it.

>While single-threaded


call me when they actually fucking commit themselves and go multithreaded or even better use the fucking gpu properly

so before it took an eternity
but now it takes 24% less on a single thread, but it's still an eternity all the same

good to know that they are doing what they can

Now go ahead and read the rest of the fucking sentence.

Why is Qualcomm not behind this shit?
Probably purely because they don't want their investors to suddenly expect implementation of an AV1 circuit on their SoCs, while both the hardware and software are far from ready to leave R&D?
Thoughts?

David will be a real star once AV1 becomes mainstream.