Are there any reasons why meme-encoders stick with x264 10 bit while x265 10 bit is hardware accelerated, besides fucking with people?

Attached: tumblr_np6kxjweRU1r63wjwo1_500.gif (500x658, 943K)

Other urls found in this thread:

developer.nvidia.com/nvidia-video-codec-sdk
anandtech.com/show/12835/arm-announces-maliv76-video-processor-planning-for-the-8k-video-future
itu.int/rec/T-REC-H.264-201610-S/en
avisynth.nl/index.php/External_filters#Debanding
en.wikipedia.org/wiki/MPEG-4_Part_2#Criticisms
en.wikipedia.org/wiki/H.264/MPEG-4_AVC
matrox.com/video/en/products/developer/hardware/m264/
google.com
builds.x265.eu
bitbucket.org/multicoreware/x265/src/2aa737a99f5148f11031e764fd1bc57bfd04fd8b/doc/reST/releasenotes.rst?at=stable&fileviewer=file-view-default
my.mixtape.moe/ntmbwp.7z
my.mixtape.moe/ewpxnj.zip
twitter.com/NSFWRedditVideo

Wider support. Plus anything with hardware-accelerated x265 decoding supports x264 anyway.
No reason to support only the newer one and cut out the older but still widely adopted one.

GPU hardware acceleration is for 8-bit x264 because that's the Blu-ray/HDTV standard; 10-bit x264 is CPU only.
Even my phone supports 10-bit x265.

I guess we know the reason then: people think x264 10-bit is supported everywhere and they don't realize they're wrong.
It's especially stupid since anime encoders were always pushing the boundaries without caring about widespread support (e.g. "use a PC for playback"), yet now so many are happy to sit on a non-standard, non-hardware-accelerated older standard despite something better not only existing but seeing mainstream adoption by the media industry.

>10bit x264
>non-standard
the profile is well defined, retard

>GPU hardware acceleration is for 8-bit x264 because that's the Blu-ray/HDTV standard; 10-bit x264 is CPU only.
That's where you're wrong, kiddo. This isn't 2010.

Elitism.

>GPU hardware acceleration is for 8-bit x264 because that's the Blu-ray/HDTV standard; 10-bit x264 is CPU only.
>x264 10-bit is supported everywhere and they don't realize they're wrong.

My ChinkPad does 10-bit x264 hardware decoding, it just outputs at 8 bit (that's the thing you guys seem to misunderstand). Meanwhile it doesn't do x265 10-bit (or 8-bit) in hardware at all. So I'm pretty happy all my anime trash is x264.
You guys are very wrong about 8/10-bit x264 here.

the fansubbing/encoding scene is dead
the stragglers are just using what works, no incentive to experiment with the newer codecs

>My ChinkPad does 10-bit x264 hardware decoding, it just outputs at 8 bit
What does this even mean?

The output is 8 bit even on a 10-bit screen with a 10-bit encoded file.
What's so hard to understand? This is an 18+ site.

Lmao, modern GPUs support even 12-bit in x265 but they're stuck at 8 bit in x264.
See by yourself.
developer.nvidia.com/nvidia-video-codec-sdk
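
If you don't believe the support matrix, just try to force hardware decode yourself (rough sketch; the sample file names are made up, and -hwaccel cuda needs an ffmpeg built with CUDA support):

# 10/12-bit HEVC: decodes on NVDEC on recent GPUs
ffmpeg -hwaccel cuda -i hevc_10bit_sample.mkv -f null -

# 10-bit H.264: expect a warning and a fallback to software decoding
ffmpeg -hwaccel cuda -i h264_10bit_sample.mkv -f null -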

I could watch 4K HDR and use less CPU than meme-encodes take. Let that sink in.

10 bit h264 was conceived as a profile purely for professional applications.
Also
>using hardware decoding

Ok sure. Your chinkpad still doesn't have hardware to decode Hi10P h.264 video.

No phones support H.264 10-bit hardware decoding.

Even ARM only introduced that feature this year:

anandtech.com/show/12835/arm-announces-maliv76-video-processor-planning-for-the-8k-video-future

>Meanwhile on the features front, the latest block adds support for 10-bit H.264 encoding and decoding, the one major codec/format that wasn’t already present on the V61.

And this won't be available in any phone until 2019 or 2020 earliest

>buying a matrox to watch chinese drawing cartoons

Top autism.

I'm pretty convinced that you're actually retarded and have no actual idea what you're talking about.

it's good enough. h265 was a mistake. av1 when?

maybe because hevc is not open and they're waiting for av1 to stabilize? ffmpeg already includes an experimental encoder.

Encoders can't make a move while the h.265 licensing is an absolute mess.
>three patent pools each with different licensing agreements
>one of them is basically a pool of NPEs
>a fuckton of individual corporations that offer their own way of licensing and a fuckton of other NPEs that are just there to patent-troll

This. A lot of them got pushed out by economic forces, and they aren't being replaced because future generations are ignorant. They don't have the patience to learn another language. Some of them gave it up to become SJWs. Even anime isn't safe from globalism.

Isn't AVC also not open? It's not like royalties are a concern for people who are illegally distributing copyrighted material, anyway.

a profile is not a standard, retard. 10-bit never officially made it into h264; that's why it got added to h265

x264 is the codec, h264 is the standard, you mong

Long time encoder here:

10-bit H264 is faster to encode and uses less SW decoding resources than 10-bit HEVC
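
Anyone can sanity-check that on their own source. Something like this (a sketch; file names are made up, it assumes an ffmpeg whose libx264/libx265 builds support 10-bit, and the CRF values are only roughly comparable), then compare the fps= readouts:

# 10-bit AVC encode (-an drops audio to isolate video encode speed)
ffmpeg -i source.mkv -c:v libx264 -preset slow -crf 18 -pix_fmt yuv420p10le -an avc10.mkv

# 10-bit HEVC at similar settings; typically several times slower
ffmpeg -i source.mkv -c:v libx265 -preset slow -crf 20 -pix_fmt yuv420p10le -an hevc10.mkv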

cuz h265 is a meme, there is no reason for it since we have nice free codecs already.

Actually AV1 is fucking dogshit and of use to fucking no one right now. HEVC just takes longer to encode; real-time 1080p24 at a good compression ratio alone takes a modern 8-core processor.

Most encoders apply smoothing filters to the video to further increase compression efficiency, especially on noisy movies and chinese cartoons, which further increases CPU resource requirements for real-time encoding.

h.264 is the codec and standard, and x264 is a codec library, you fucking retard. x264 isn't the only codec library either; the other notable one is Cisco's openh264. 10-bit is official, but no one ever thought of making hardware decoders for it.
itu.int/rec/T-REC-H.264-201610-S/en
You're more than free to search for the "High 10 profile" in ITU's own recommendation as they published it.
Get your shit right next time before you make yourself look like a retard.

You don't need a supercomputer or a new gpu to decode x264 10 bit

>he doesn't know the difference between a codec and a codec library

Friendly reminder to filter tripfags.

Attached: 1510017806702.png (326x259, 13K)

retard

A lot of people still use computers and phones that don't have hardware acceleration for x265

Nobody said it does. The question was why x264 instead of x265. You're getting really confused here it seems.

Holy shit and you try to make arguments why x264 is bad while being THIS retarded.

Took off your trip to say that? Cute. Now kill yourself, you'll certainly attract more attention in hell.

>Most encoders apply smoothing filters to the video to further increase compression efficiency, especially on noisy movies and chinese cartoons, which further increases CPU resource requirements for real-time encoding.

It's not like you're streaming or delivering video on demand. Why would you use real-time encoding?

>Lmao, modern GPUs support even 12-bit in x265 but they're stuck at 8 bit in x264.
You do realize that the whole point was that 10-bit x264 will hardware decode on a GPU as 8 bit.

no it won't
also, x264 is a codec library that is aimed at software, not hardware. Just stop posting so that you don't embarrass yourself any further

I'm not the tripfaggot, in fact I had no idea there was even a tripfaggot in this thread because I have everyone set to anonymous. You're actually more annoying than the tripfag here, nobody gives a shit you got triggered by his tripcode so hard you had to blogpost about it.

Thanks for derailing the thread, retard.

Because each source is different and requires changes to the smoothing/sharpening/denoising/color-debanding AviSynth filters in MeGUI. Also there's a lot of fucking filters to choose from and some sources need like 5 of them.

avisynth.nl/index.php/External_filters#Debanding

Problem is, to see the full effect of the filters you usually have to encode a significant portion of the video, or the entire thing, to release a polished product.
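
If you only have ffmpeg instead of an AviSynth setup, the same idea looks roughly like this (a stand-in sketch, not anyone's actual MeGUI chain; the filter strengths are placeholders you'd tune per source):

# denoise + deband + mild sharpen, then encode
ffmpeg -i episode.mkv -vf "hqdn3d=3:3:6:6,gradfun=3.5:8,unsharp=5:5:0.5" \
  -c:v libx264 -preset slow -crf 18 out.mkv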

>66361870
>66361911
Why are you samefagging?

Attached: (you).png (600x600, 142K)

Great contribution to the thread jackass. You sure showed that tripfag who's boss.

Go back to /v/ or r3ddit.

# Filter any tripfag
/^!/

God, this board is such a joke.

that's a good point. maybe they expect that people who can't afford to buy anime will use older hardware. then again, i've seen plenty of hevc encodes so that might not be the case. i'm pretty sure hevc takes more time to encode, and if you're some college student pushing out subs, time might be important for you. dunno.

Attached: 1527960589476.jpg (593x625, 64K)

I don't get it. Wouldn't it be easier to give the encoder enough bitrate to avoid any artifacts rather than trying to minimize the issue with filtering, or is that impractical? Although blurring things up ever so slightly can be useful to curb random crap such as film grain and increase compressibility.

I do mostly light encoding with FFmpeg for WebMs so I might be missing something.

It's impractical internet-wise because you'd have to climb to 10 Mbps or more for 720p movies alone for that not to matter. People would much rather download a well-compressed and PP'ed 1 GB h264 movie file than a lazily encoded 10 GB one, YIFY releases being one example.
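
If you're aiming at a file size rather than a quality level, two-pass encoding is the usual route (a sketch; file names are made up): video bitrate ≈ target size in kilobits divided by duration in seconds.

# ~1 GB for a 2 h movie: 8*1024*1024 kbit / 7200 s ≈ 1165 kbps video (audio comes on top)
ffmpeg -i source.mkv -c:v libx264 -preset slow -b:v 1165k -pass 1 -an -f null /dev/null
ffmpeg -i source.mkv -c:v libx264 -preset slow -b:v 1165k -pass 2 -c:a aac -b:a 128k out.mkv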

Well, x264 is a much more mature and refined encoder than the better-known x265 implementations right now, so that might be a pretty good incentive to stay with AVC.

What confuses me is why people were so eager to embrace AVC a decade ago. If compatibility were an issue, wouldn't it make more sense to stick with ASP back then?

Hope everyone did their job and filtered the tripfag.

barneyfag is that you?

Best codec for HDTV is mpeg2 at 15-20 mbps.

I remember when my parents bought my first laptop, disk space was a problem. I'd say that encoders needed something that could utilize better compression. Also, by the looks of it, MPEG-2 was almost the same as ASP in this regard. en.wikipedia.org/wiki/MPEG-4_Part_2#Criticisms
I only remember stories about how shit DivX was from those times lol.

Attached: 1527511776467.jpg (609x540, 110K)

No it isn't, and why did you pull an arbitrary bitrate out of your ass despite the fact that different sources need more or less bitrate than that with mpeg2?

>What confuses me is why people were so eager to embrace AVC a decade ago. If compatibility were an issue, wouldn't it make more sense to stick with ASP back then?
That was also during the switch from SD to HD resolutions, so a better codec made more sense. And anime fansub groups at least released XviD encodes of their releases for years after the switch.

>not downloading your cartoons in RealVideo with variable frame rate

dude, the day i discovered youtube downloader i creamed my pants from excitement. i downloaded so much useless shit and that's when i came across anime and Jow Forums for the first time. from that day user was never the same.

More like Cinepak at 180 MBit/s, amirite

Anime /is/ globalism ._.

>H.264 or MPEG-4 Part 10, Advanced Video Coding (MPEG-4 AVC) is a block-oriented motion-compensation-based video compression standard.

en.wikipedia.org/wiki/H.264/MPEG-4_AVC

when shit talkers shit talk shit talkers baka

>h.264 is the codec and standard and x264 is a codec library you fucking retard. x264 isn't the only codec library, the other notable one being cisco's openh264. 10 bit is official but no one ever thought of making hardware decoders for it.

matrox.com/video/en/products/developer/hardware/m264/

>Matrox M264 family of PCI Express cards features hardware-based multi-channel 8- and 10-bit H.264 encoding, decoding and transcoding capabilities. They provide an instantaneous H.264 quality and density boost offering the pristine quality needed for broadcast contribution, production and distribution. With the capability for up to three channels of 4K XAVC encoding/decoding in a single slot card, the M264 family enables OEMs to provide multi-channel 4K production servers in a PC platform. Supporting the H.264-based Sony XAVC and Panasonic AVC-Ultra mezzanine codecs, it makes live 4K production as easy as today’s XDCAM HD workflows in PC platforms.

>no it won't
Source?

>not filtering any wofag based on md5
won't make it

>tumblr_np6kxjweRU1r63wjwo1_500.gif

Attached: 1524961151449.jpg (1440x810, 150K)

>anime

>x265
Enjoy your 12-hour encode time for one movie while x264 is 3 times faster :D

10 bit is a meme. Mostly because it only adds 3 extra shades between the closest colors (1024 levels instead of 256), so it doesn't improve gradients by much - instead of normal thick banding you get 1/4-thick banding. Then, in order to leverage 10-bit color you need a 10-bit monitor, which you likely do not own. Without one, the color is emulated via dithering. To top it off, you should encode with dithering to start with, so that there is no banding and no color benefit to be had. The same way you always dither audio when rendering it. Digital media 101.
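
The 3-extra-shades figure is just the level count (quick shell check):

# 2^10 = 1024 levels vs 2^8 = 256: 3 new steps between adjacent 8-bit levels
echo $(( (1 << 10) / (1 << 8) - 1 ))   # prints 3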

You make a good argument, but try refuting this: google.com

Protip: you can't

The most recent snowfag 10-bit x265 encoder is actually pretty fast, but yeah, it still takes a while. I knew that pain all too well, especially on my quad-core i7 2015 macbook """""pro""""".

You should try it, v2.8 has been a significant leap since v2.1.

builds.x265.eu
bitbucket.org/multicoreware/x265/src/2aa737a99f5148f11031e764fd1bc57bfd04fd8b/doc/reST/releasenotes.rst?at=stable&fileviewer=file-view-default

Attached: 1528569840560.jpg (720x720, 56K)

The point of 10-bit encoders is better color accuracy, which is very important since 99.99% of encodes use a low bitrate and 4:2:0 chroma subsampling. 75% of all color information must be made up, so it's imperative that the 25% you do have is as accurate as possible for more accurate color reproduction.
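
That 75% is just the 4:2:0 sample count; quick check for 1080p, where each chroma plane is 960x540:

# chroma samples kept: 2*(960*540) out of a full 2*(1920*1080), in percent
echo $(( 2 * 960 * 540 * 100 / (2 * 1920 * 1080) ))   # prints 25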

Or you could use full color thereby eliminating the need for meme snowflake formats.

There is no "full" color, it's all approximations of the real thing. Not even 16-bit color video can store all the color information of the color spectrum which only humans can perceive.

because of licensing fees

Device manufacturers need to pay like $0.50 for every device that supports it

So Intel/Amazon/Apple/Nvidia/Google/Microsoft and a few others decided that instead of paying $20 million each, they formed a big group to develop AV1, which will be completely free, as a big Fuck-Your-Fees to H.265

So that's why device manufacturers have avoided including x265 on their phones: because of fees, and because they don't want to proliferate it, so as to give AV1 a chance to compete

It's gonna be a while before AV1 becomes mature enough to take seriously. Last I heard it's 3,000 times as computationally expensive as HEVC. Hope VP9 can hold out until then, along with MPEG's future FVC successor.

By full color I mean the 4:4:4 mode, otherwise known as RGB.

Actually, let's do the math. I figure the core reason for even using subsampling is saving space. But with anime rips weighing in at 15 gigs a pop, I don't imagine that's a problem for you. Anyway, subsampled data carries three channels - luma and two color-difference channels. The latter two are subsampled to half the resolution, i.e. a quarter of the pixel count. So instead of 3 bytes per pixel it's 1 + 0.25 + 0.25 = 1.5 bytes per pixel, half the size - a good improvement. But in 10-bit color that's actually 25% more (10 is 125% of 8), so 1.875 bytes per pixel. Additionally, you increased the color data frequency and it will compress worse; the exact coefficient depends on the settings, but let's say it's 0.2, which makes it 1.25 + 0.75 + 0.75 = 2.75 bytes per pixel. At that point, going to 3 bytes per pixel of full RGB color is only a tiny size bump. And you get much superior color representation, as it is double the resolution you normally use.
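
Quick sanity check of the raw bytes-per-pixel figures with bc (the 0.2 compressibility coefficient above is just a guess and isn't modeled here):

echo "scale=3; 1 + 0.25 + 0.25"          | bc   # 8-bit 4:2:0 -> 1.5
echo "scale=3; (1 + 0.25 + 0.25) * 1.25" | bc   # 10-bit 4:2:0 -> 1.875
# 8-bit RGB / 4:4:4 is simply 3 bytes per pixel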

You need to redo that math, chief: 10-bit encoders actually REDUCE final video size at any given CRF value by 5-20%, depending on motion and patterns.

4:4:4 is sadly impractical due not only to the lack of HW decoders but also because it's twice as big as 4:2:0 video.

The only way a format that uses more bits per unit of data is going to make a smaller output file is if it uses worse compression settings. You have been memed on by 10-bit fags who believe their own memes.

You are already using files that are ten times the size of a normal rip, don't tell me that bumping it to twenty times the size is too much for you.

Here are 8-bit and 10-bit HEVC videos transcoded from a lossless 8-bit fraps file: my.mixtape.moe/ntmbwp.7z

frame screenshots: my.mixtape.moe/ewpxnj.zip

As you can see there is a huge fucking improvement in the 10-bit file vs the 8-bit one transcoded from the same lossless source. 10-bit encoders make video with significantly reduced color banding despite the source being 8-bit, and anyone who says otherwise sucks cocks on an hourly basis.

This was done with an older snowfag encoder btw.
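
If you want to reproduce it with your own lossless clip, the commands are basically this (a sketch; the file name is made up and the CRF is a placeholder, not my exact settings; assumes a libx265 build with 10-bit support):

# same source, same CRF, only the output bit depth differs
ffmpeg -i lossless.avi -c:v libx265 -crf 18 -pix_fmt yuv420p hevc_8bit.mkv
ffmpeg -i lossless.avi -c:v libx265 -crf 18 -pix_fmt yuv420p10le hevc_10bit.mkv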

Look at all of these poorfags who can't afford to store lossless rips on their machines
Look at them and laugh

Yeah, a 5% improvement in quality is surely worth wasting 1 million times the space for some chink toons you'll rewatch maybe 2 times in your entire life.

Attached: Video+Sizes+Raw+video+bitrate_.jpg (960x720, 71K)

Why did you not enable dithering for the encode? Only to justify using 10 bits, because otherwise there is no difference? Kys, retard.

Mostly because it takes ages to encode h.265. It may be efficient, but the encoder is slow as FUCK.

It's not only about not having enough space. I have super slow Internet connection and having an option to watch something that doesn't look like dogshit and doesn't take an eternity to download is very useful.

Because ...

>beside fucking with people?

Damn you got me.

the codec IS the standard, you tard. There may be different implementations of encoders and decoders out there - some proprietary, some free, and some focused on using hardware. At the end of the day, they all need to conform to the codec standard for interoperability on the decoding side. That's what makes the codec. Let me repeat: the codec is the standard that everything needs to conform to. H.264 is the codec/standard. x264 is just a codec library made to utilize and comply with that codec/standard.

Play any video encoded with Hi10P and watch the stats. Attempting to force 10-bit onto an 8-bit decoder would only give you fucked-up colors.
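
Easiest way to actually watch the stats (a sketch; the file name is made up):

# ask for hardware decode and log the decoder choice; press i/I in mpv for the stats overlay
mpv --hwdec=auto --msg-level=vd=v hi10p_sample.mkv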

Because x265 is not ready yet, it has some quirks with psy-rdo.

it's ready and quite usable
only problem is that using the codec to its fullest extent would mean patent litigations

Not ready.

licensing is a mess for h.265, which is why adoption is slow in the first place

It was a test going from raw lossless 8-bit to lossy 10-bit, you tard. How the FUCK AM I GONNA DITHER 8-BIT TO 10-BIT? The test just shows that a 10-bit encoder IS better than an 8-bit one because of the encoder's internal precision.

You really are a retard.