Why isn't this more popular?

Attached: h265-hevc.jpg (728x400, 23K)

4k isn't popular yet

Companies don't care because they don't want to put up with the patent pools.
Anime encoders don't care much about it because AVC is at least on par for high-bitrate encodes, or because they don't want to invest the additional time.
Also HEVC certainly has its benefits, but MPEG's greed slowed down adoption while motivating others to come up with a royalty-free successor as soon as possible. I'm pretty sure they won't make the same mistake with VVC.

And since we're on the topic: Does somebody have experience with the JCT-VC encoder? After seeing how much more efficient BPG is when using it instead of x265 I wanted to give it a try, but all those options give me nightmares.

Because it's a proprietary standard riddled with royalties.

Because hard drives are under $0.02/GB

because no one should need a 128 core quantum based experimental hyper loop computer to view a fucking video.

H.264 is fine, fuck Xvid was fine but people were too fucking retarded to tune their encodes correctly.

Too time consuming to encode with.

AV1 exists

A) Unless you are using a very low bitrate, HEVC has no discernible advantage over h264 in the 480p-1080p range, which is what 99% of people are watching.

B) Insane licensing with at least 2 patent pools, and also individual entities with which you need to negotiate royalties

C) AV1 will be rolling out hardware support in 2019, and it's free.

because my toaster can't run it without dropping the spaghetti

i hate downloading JAV porn and then finding out it's encoded in h.265, which my pi struggles to play smoothly. fuck you... FUCK YOU!

>fuck Xvid was fine but people were too fucking retarded to tune their encodes correctly

Attached: 1528647115596.jpg (1500x1378, 425K)

>my pi
Don't be a soiboi.

>fuck Xvid was fine but people were too fucking retarded to tune their encodes correctly.

You're forgetting that those 250M files were 4:3 240p

lmao even a 2011 quad-core cpu can play 1080p 60fps H.265 without lag on software decoding.

LOL xvid was so fucking bad
90% of h264 encodes are fucking terrible as well.

Will zen2 have av1 hardware, or is the design already too finalized to add av1?

>software decoding
why would you do that when you can hardware decode h264?

retard detected. Any APU or GPU from the last 5 years has hardware decode.

>90% of h264 encodes are fucking terrible as well.
how so

I don't care how popular it is elsewhere, I reencode all my videos in Handbrake to x265 and I get one fourth to one seventh of the original file size with absolutely no quality degradation.

Do you ever look at them?
Sloppy packaging in their container
out of order streams, all the fucking time
missing bits and timestamps randomly
don't crop the fucking black bars ALL THE FUCKING TIME, nearly always some 5px black bar or worse, double black bars that are like 50px on top & bottom
etc.
We're lucky video players work even with all these problems.

i usually look for rarbg fgt or similar

public stuff is 10x worse LOL

This. I have a q9550 HTPC, CPU usage is 1%, GPU does the rest.

Were you around when 264 became a thing? It also took years to even start being anything remotely popular.

>I get one fourth to one seventh of the original file size with absolutely no quality degradation.
LOL, get glasses you retard

HEVC has been out for five years, it's not going to happen

20/20 in my left, 20/15 in my right eye, retard

Apple, GoPro, Blurays, OTA TV, etc all use HEVC.

>Blurays
UHD Blurays.

Apple supports HEVC, but they are not a content provider in any major sense. Bluray is h264; only 4k/UHD Blurays are HEVC, which are REALLY niche since they require you to purchase a new bluray player.

In streaming, HEVC holds the small niche of 4k+, but even that will be eaten by AV1 which is designed to be more efficient than HEVC at those resolutions.

Finally, the thing that sealed HEVC's fate was that it could never succeed h264 as the de facto standard video codec on the web, due to its insane royalty scheme.

AV1 instead will do this, which means it will dominate video at least until the next generation of codecs arrives.

>absolutely no quality degradation.

Son, I have some bad news...

>Even with HEVC, using some of the most aggressive CPU-based encoding methods to squeeze the absolute most out of the video quality, AVC/x264 still continues to win vs HEVC/x265 in the video quality/detail arena at appropriate bitrates. However, HEVC at say 1000-1500 kbps vs AVC at 3000 kbps produces fairly similar images. Unfortunately HEVC doesn't really improve much with higher bitrates, which is where AVC wins considerably.
>This is also why HEVC is ideal for media streaming services and stuff like youtube, as one can theoretically cut the bandwidth by between 14-38% with near-transparent quality (perceptible video quality to most people's eyes; essentially it'll be hard to spot the differences from one to the other without really picking at it).
>The reason for this is that HEVC does a lot to avoid macro/micro blocking and instead "smooths" over any hard artifacts and blends them in the best it can. This is also HEVC's downfall compared to AVC: at higher bitrates, AVC will maintain fine details clearly, while HEVC will still employ smoothing that causes a fairly extensive loss in fine detail at even more than just the pixel level (HEVC is known to apply smoothing/blending over a fairly sizable group of pixels, and understandably so, since HEVC is designed to handle 4k and 8k resolutions, which employ a shitpile of pixels in comparison to 1080p and below).

Attached: 142281185553.jpg (1018x683, 167K)
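A quick sanity check of the bitrate figures quoted above (HEVC at ~1500 kbps vs AVC at ~3000 kbps), using a hypothetical 24-minute episode; the helper just converts bitrate and duration into stream size:

```python
def stream_size_mb(bitrate_kbps: float, duration_s: float) -> float:
    """Approximate stream size in megabytes:
    kilobits/s * seconds / 8 bits per byte / 1000 kB per MB."""
    return bitrate_kbps * duration_s / 8 / 1000

episode = 24 * 60  # hypothetical 24-minute episode, in seconds

avc = stream_size_mb(3000, episode)   # AVC at 3000 kbps
hevc = stream_size_mb(1500, episode)  # HEVC at 1500 kbps
print(f"AVC  @ 3000 kbps: {avc:.0f} MB")   # 540 MB
print(f"HEVC @ 1500 kbps: {hevc:.0f} MB")  # 270 MB
print(f"bandwidth saved:  {1 - hevc / avc:.0%}")  # 50% at these bitrates
```

Halving the bitrate halves the stream size by definition; whether the two files actually look the same at those rates is the contested part.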

I don't care about your technical block of text, I encode that shit, I compare the result side by side looking at the same frame on a giant ass monitor, if the x265 version doesn't look worse I keep it.

There's a difference between what you think looks the same and absolutely no quality degradation.
Does it matter for YOUR videos? No. Do whatever you like.
However there is a difference, and people who are used to doing high-quality encodes can tell. Again, do whatever you want with your videos, but the claim of absolutely no quality degradation is not true.

They don't really hardware accelerate video codecs on CPU cores. They use GPUs for that.

This

bump

>your technical block of text

Attached: really_.jpg (750x746, 524K)

Why are releases either 60GB or 3GB? I would like something in between.

Because you need AT MINIMUM an 8-core processor + exotic cooling that costs an arm and a leg to encode 1080p 8-bit x265 on the slow preset at a low 16/22 CRF. Fortunately AMD changed that, BUT most people still feel uncomfortable shelling out more than $100 for a CPU, so we have to wait for zen 2 to drive down the price of zen 1 further before x265 becomes popular.

bonus: pic related is so efficient it's been put in a laptop and can be perfectly cooled with the stock cooler up to ~3.6 GHz turbo on all cores.

Attached: Screenshot_2018-12-19-11-59-54(1).jpg (720x883, 151K)

>Xvid was fine
WOW

Does anybody here have experience with AV1 at high bitrates? I know it beats AVC's lossless intra compression, but how does it compare in general?

Why would you pick a proprietary standard over superior VP9? I think humanity is moving past the dark age of shitty standards enforced by a corporation that happens to have a monopoly at a given time with their shitty licenses. VP9 is good as a codec, not good because some company shoved it down my throat.

THIS. 2020, the year of fast HD video streams.

> before x265 becomes popular
You mean AV1.

>Xvid was fine
If you enjoyed VHS yeah

>cisco

>Why would you pick a proprietary standard over superior VP9?
Because there's a usable non-commercial encoder for it. I'd still love to see VP9 support on Jow Forums (or rather, for them to stop blocking WebMs with VP9 streams). But for high quality encodes? No, thanks.

VHS quality and 350 megabyte TV episodes with shitty 340p resolution. You couldn't even write an entire fucking season on a DVD. I do NOT want to see another xvid video ever again.

Right, we just have to wait for AMD to release 1024-core processors clocked at 10 GHz on all cores for around $100. No biggie.

encoding isn't cpu accelerated?

It's ASIC accelerated (very efficient) or directly uses gpu cores (inefficient but still fast). Both still export video of the same quality as cpu encoding, except the bitrate is 8-10X higher, rendering it useless for rips.

Attached: cry.jpg (800x522, 47K)

yea ik gpu encoding is fucking garbage so I've never done it.
So I guess CPU upgrades won't really make a difference other than moar cores and moar ghz memes, kinda what I was wondering. Thanks.

What does that have to do with boarding other sea vessels without authorization and taking all of their loot by force?