AV1 CODEC ENCODING TIMES DROP TO NEAR-REASONABLE LEVELS. ANOTHER STEP TOWARDS THE DEATH OF x265

streamingmedia.com/Articles/ReadArticle.aspx?ArticleID=130284

RIP in piece x265

Your days are numbered.

Attached: 1200px-AV1-logo.png (1200x554, 56K)


Won't matter too much until new ARM SoCs ship with AV1 support and devices start getting replaced

>"If you use this same exact string with the current version of FFmpeg (I tested version N-93083-g8522d219ce), the encoding time drops from 226,080 seconds (45K times real-time) to 18,196 seconds, or about 3,639 times real-time, a speedup of about 12x. Still about 63 times slower than x265 and 80 times slower than LibVPx"
LMAO, encoding 10-bit 1080p video on my ryzen 2700 on a custom slow preset gets me 20-30 FPS so about real time for 99% of video content.
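For what it's worth, the quoted numbers are internally consistent; a quick sanity check (the ~5 second clip length is inferred from the quoted multipliers, it isn't stated outright in the article):

```python
# Sanity-check the quoted libaom benchmark numbers.
old_encode_s = 226_080   # encoding time before the ffmpeg update
new_encode_s = 18_196    # encoding time after
realtime_mult = 3_639    # quoted "times real-time" for the new run

# The quoted multipliers imply a source clip of roughly this length:
clip_s = new_encode_s / realtime_mult     # ~5 seconds
speedup = old_encode_s / new_encode_s     # ~12.4x, matching the "about 12x" claim

print(round(clip_s), round(speedup, 1))
```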

>x265 encodings: movies that people watch, world classics on torrents, new releases appear every week
>AV1 encodings: 1.5 niggers seed their anime shit with gay cocks after 1 month of encoding

...and 4channel will support webm and gif only until 2155.

THE TIME HAS (ALMOST) COME AND SO HAVE I
ah fuck I can't wait to convert everything to AV1 and FLIF

Attached: 1539280213768.png (271x269, 106K)

Also Navi is probably going to release with AV1 HWenc/dec support as part of their mandatory contribution to the AOM.

>encode libaom-av1 on 6c12thread sandybridge xeon
>get ~0.5 fps encoding speed
NOPE

>Let’s get practical. Most codecs have presets that let you trade off encoding time for quality. For example, with x264 and x265, the presets have names like slow, very slow, fast, very fast, and placebo. With AV1, the presets are controlled via the cpu-used switch, and you can see in the batch above that I used cpu-used 8 in pass 1 and cpu-used 0 in pass 2.
>slow, very slow, fast, very fast, and placebo
>cpu-used 8 in pass 1 and cpu-used 0 in pass 2
maybe the software could be superior but boy that's awful nomenclature
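For reference, the two-pass setup the article describes looks roughly like this; only the -cpu-used and -pass values come from the quote, the filenames and the 2M target bitrate are placeholders:

```python
# Sketch of the two-pass libaom-av1 ffmpeg invocation described in the article.
# "in.mkv", "out.mkv" and the 2M bitrate are placeholder values.
common = ["ffmpeg", "-i", "in.mkv", "-c:v", "libaom-av1", "-b:v", "2M"]

# Pass 1: fast analysis pass (cpu-used 8), video output discarded.
pass1 = common + ["-cpu-used", "8", "-pass", "1", "-an", "-f", "null", "/dev/null"]
# Pass 2: slow, high-quality pass (cpu-used 0) using the pass-1 stats.
pass2 = common + ["-cpu-used", "0", "-pass", "2", "out.mkv"]

print(" ".join(pass1))
print(" ".join(pass2))
```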

And he uses libaom for his tests. Imagine what SVT-AV1 would be able to achieve.

Google is already using AV1 on Youtube. Once that shit is optimised all content above 1080p will probably be AV1.
Netflix will also be following suit.

Ah never mind you have to enable AV1 on youtube.com/testtube

youtube.com/playlist?list=PLyqf6gJt7KuHBmeVzZteZUlNUQAVLwrZS

wow

(X)doubt

using firefox quantum

Attached: 1545155301158.png (945x283, 22K)

does youtube use h265?

Do you have av1 enabled in about:config?

no

No, they have been using VP9 for content above 1080p.
Everything else is x264.

what makes av1 better than x265?

yeah, why is it not working?

Attached: 1530055902640.png (787x114, 3K)

nothing, it's just newer so autistic weebs want to jump on its bandwagon

Better compression.
Not patent/royalty encumbered.

thanks, but how can I encode my raw videos using AV1? Handbrake doesn't support it

right now, AV1 is MANY times slower than h265. if I were you, I'd use handbrake and encode my raw videos to H265 instead.

Not much as of yet, because the gains in compression efficiency are largely outweighed by mammoth encoding times and CPU usage. But that's because the codec is new and still needs lots of work done to it.
I'm not a fan of the Google botnet, but the fact that they have a vested interest in turning AV1 into the best codec on the market makes me confident they will produce something pretty good.

You need to do two things: change the setting AND enable AV1 with your youtube channel by using this link:
youtube.com/testtube

You don't want to yet. It is prohibitively slow.

someone can explain pic related graph?
Does it mean the more cpu threads used the lower the quality of the video file do to how it works with p and b-frames?

Attached: 1522893096077.png (700x432, 73K)

>wow it's now 2 FPS instead of 1, amazing!
Meanwhile I can encode using x265 at 30 FPS

I just want to test it out for fun. Also does it support 10-bit like h265 does?

Awesome, the smaller the files the better. I have PURE x265 locally for my plex set up. Once AV1 comes out, I'll replace everything.

Better compression than anything. Smaller files.

No royalty fees. 100% FOSS FREE AS IN GREEN PEEPER

Who gives a fuck if it encodes slow as long as it decodes fast enough?

Can't wait for FLIF, AV1 and Opus to be the only compressed media formats I'll have to work with.

No you won't. We have affordable 8K TVs/monitors coming out soon (maybe 2-3 years) and content to go with them as well. Maybe if you enjoy watching 240p video on black and white CRT TVs or Walmart phones, but that's about the only reason I'd see any sane person using AV1 for personal use.

Also got media.mediasource.experimental.enabled?

see you in 10 years

>You need to do two things: change the setting AND enable AV1 with your youtube channel by using this link:
yeah but it won't let me enable it for youtube, even though it was already enabled in about:config

>Who gives a fuck if it encodes slow as long as it decodes fast enough?
piracy

didn't help either. fuck it.

media.mediasource.experimental.enabled was not necessary, but it was a known bug:
bugzilla.mozilla.org/show_bug.cgi?id=1490877
I updated my FF and it works now

-cpu-used=0 is able to saturate multi-core CPUs better than faster (cpu-used=8) speed settings.
thread usage is controlled with the -threads option

Attached: cgchghjd.png (788x407, 77K)

is this AV1?

Attached: 1522148531896.png (675x232, 82K)

GOD FUCKING DAMMIT

This is like trying to get QEMU MTTCG to work for x86 emulation on ARM chips, isn't it? How many decades before AV1 gets down to within 200% of x265's compute cost for the same quality?

>inb4 lul JUST use HW encoding
No, fuck you. I don't want 50mbps 1080p video.

no its h.264/mpeg4avc

I enabled it and opened a very recent video and it still doesn't use it? how the fuck do I force it?

Isn't dav1d 2.0 supposed to double decoding speed too? On 1.0 I still get dropped frames on 1080p video on an i5 4570.

its only on certain videos since it takes a long ass time to encode in av1

It is faster to download than to encode.

LOL!!!!!!!!!!!!

Does that mean av1 videos would play longer and with more detail than webm?

here's a playlist of av1 videos released by the youtube devs.

youtube.com/playlist?list=PLyqf6gJt7KuHBmeVzZteZUlNUQAVLwrZS

make sure it's set to always prefer av1

Netflix, Amazon, Apple and Bluray are already using HEVC on HD and UHD content.

AV1 is DOA

Attached: hevc-hevcadvance-p13918404zo.png (582x900, 102K)

>It is faster to download than to encode.
that's every video, assclown

nigger they are going to end up switching to av1 to avoid royalties and licensing issues. Av1 is going to supersede it.

is my thinkpad gonna be able to play these encodes okay

Attached: 1485333021068.jpg (2335x2507, 685K)

is it just me or is AV1 on youtube worse looking? it seems instead of upping the quality a bit while still saving on bandwidth, they went with saving maximum bandwidth for slightly lower quality footage.

nope.

Attached: 1535728677673.jpg (960x536, 44K)

KEK

>set it to always prefer AV1
>can't remove the preference or disable it anymore
>can just set it to auto
fuck I want it disabled.

idk but my i5-4590 is playing it fine
a 1080p av1 stream is using about 20%

Will companies switch to a cheaper, faster, better codec? Nah, you're right, they'll keep paying royalties for an inferior product.

It's not that simple m8. Using AV1 makes the electricity bill skyrocket and they'd need many times more computing resources than they already have. Almost like paying the hevc royalties for 100 years desu.

yeah but streaming the same video happens a lot more often than encoding it. you pay the encode cost once and save bandwidth on every single view, and bandwidth is the most expensive thing they have to worry about.
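This trade-off is easy to put numbers on; a toy break-even model (every figure below is an invented placeholder, just to show the shape of the calculation):

```python
# Toy break-even model: one-time extra encode cost vs per-view bandwidth savings.
# All numbers are invented placeholders, not real streaming-service figures.
extra_encode_cost = 50.0   # $ of extra CPU time to encode one title with the slow codec
gb_saved_per_view = 0.5    # GB saved per stream vs the older codec
cost_per_gb = 0.01         # $ CDN bandwidth cost per GB delivered

# After this many views, the slower encode has paid for itself;
# every view beyond that is pure savings.
views_to_break_even = extra_encode_cost / (gb_saved_per_view * cost_per_gb)
print(int(views_to_break_even))
```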

Billions of HEVC devices have already shipped and VVC is expected to be finalized by the end of 2020 so AV1 is pointless

only till it matures and software optimizations improve performance and av1 hardware encoders/decoders are released in the next few years

it is not pointless
neck yourself.

20-30FPS on slow preset wtf? I get that on fucking x264
you are smoking crack if av1 is that fast for you..

custom slow, means he fucked with it

see you when I'm dead

>libaom
It could be as fast as libvpx and it would still be shit.

SVT-AV1, rav1e and EVE are the only hope for this codec.

sounds like it's shit then

agreed
even vp8 encodes like fucking shit. x264 and x265 are far nicer and since I don't host any content I can use them for free :^)
hopefully av1 becomes usable

AV1 is just a meme now. HEVC (h265) is already legit.

yea, my shitty old arm boards only decode h264 so that's all I encode still LOL

based and redpilled

Nice
Just tested the newest shinchiro build of mpv and I can smoothly play 1080p

Attached: Dua Lipa - New Rules (Official Music Video)-k2qgadSvNyU_00_00_30_0001.jpg (1920x1080, 503K)

It still matters, else we would have seen HUGE mass adoption of VP8 amongst websites other than 4chan.

nah, VP8 is still not even as legit as h265 is right now.

I knew some one would post a screen of that video from the playlist

too old

Attached: 1502024360526-2.png (499x396, 246K)

>Netflix, Amazon, Apple
Those three are part of AOM and support AV1's development.

>Mac and iOS (40% of united states) still doesn't support webm/webp
>thinking they'll bother supporting av1 or flif
Hahaha

To be fair a third world country like the US doesn't really matter.

>8K
Cool placebo

Whoops sorry Jow Forumsoys, meant to say I got that encoding x265. AV1 of similar params would probably get me 0.1 fps best case scenario.

NOPE. 8K video is really only true 4K once you account for chroma sub-sampling: the chroma planes are stored at half resolution in each dimension. For true 8K on an 8K display you'd actually need 16K video. sad desu

no

Yup, it's true. Find me the 4:4:4 4K masters of movies. You'll only find 4:2:0 encodings even in blu-ray discs.

>protip: you can't

There's a reason 4:2:0 is popular: it's virtually indistinguishable from 4:4:4.

hahaha

Patents, royalties

>no hardware decode or transcode
I don't think you really grasp what drives adoption of new standards

Those big players can do it, but (pic related) no-one else is realistically going to be able to negotiate and maintain this number of licensing deals (the pools, but then also the individual patent holders)

Attached: hevc-patent-holders-2018-03.jpg (970x516, 65K)

humans are better at seeing brightness than color, so why not save a hell of a lot of bandwidth by tossing shit that wouldn't make a difference if it was included?

No, there are plenty of chroma sub-sampling artifacts that can easily be spotted, and the video as a whole looks blurrier when compared side by side with video that has no chroma sub-sampling. It only gets away with this because nobody knows or cares that chroma sub-sampling is used at all. It's made WORSE by the fact that the chroma upscaling is usually done with shitty HW-accelerated bilinear upscalers, and you have to force your media player to use high-quality bicubic upscalers, which tank video decoding performance (not sure if this is even possible on phones/tablets).

There are only 2 ways to avoid all of this malarkey:
1.) Watch "4K" video at 50% zoom and force your media player to use a bicubic downscaler to properly supersample the luma. This isn't possible on phones/tablets/TVs.
2.) Encode "4K" video to 4:4:4 1080p; at least with ffmpeg bicubic downscaling is enabled by default. The resulting video can be played anywhere.
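The sample counts back this up: 4:2:0 stores each chroma plane at half resolution in both dimensions, so a 3840x2160 4:2:0 frame has exactly the chroma resolution of a 1920x1080 4:4:4 frame. A quick sketch:

```python
# Per-plane sample counts for 4:4:4 vs 4:2:0 (one luma plane Y, two chroma planes U, V).
def samples(width, height, subsampled):
    luma = width * height
    # 4:2:0 halves each chroma plane in both dimensions (1/4 the samples per plane).
    chroma = (width // 2) * (height // 2) if subsampled else width * height
    return luma, chroma

y4k, c4k = samples(3840, 2160, subsampled=True)       # "4K" 4:2:0
y1080, c1080 = samples(1920, 1080, subsampled=False)  # 1080p 4:4:4

assert c4k == y1080               # 4K 4:2:0 chroma plane == 1080p full resolution
# Total storage: 4:2:0 = Y + 2*(Y/4) = 1.5*Y, i.e. half of 4:4:4's 3*Y per frame.
assert y4k + 2 * c4k == int(1.5 * y4k)
print(c4k, y1080)
```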

that's clearly coming with all the companies that backed the codec, dumbass.

and by that time VVC will be out and will smoke AV1

>10-bit
lol people actually fell for this shit?

>10-bit
snakeoil

slower is better