ARE YOU READY FOR THE INFLUX OF H.265 HEVC 10-BIT ANIME ABOUT TO GRACE MANKIND?

I for one welcome our 265 overlords

Attached: h.265encodedecode4k.png (726x349, 340K)

Other urls found in this thread:

deliciousslurper.blogspot.com/2019/03/8k.html
kodi.wiki/view/Android_hardware
mpv.io/manual/master/
twitter.com/SFWRedditGifs

Sorry, honey, in their never-ending quest to make hardware-accelerated decoding impossible, The Powers That Be are already using 12bit HEVC.

HEVC a shit. AV1 numba 1!

>12bit HEVC
Nobody on nyaa does this

Attached: 134532532.png (649x516, 535K)

Daiz...

VP9 and HEVC make my computer burn, who thought that was a good idea.

>hevc2

Why is there no VP9 encode?

None of these new encoders are ever going to be used for anime because it doesn't make sense. H264 has unbeatable encode and decode times, and for anime there is barely a difference in output size and quality since it uses such a simple colour palette and is made at such low resolutions, and that won't ever change.
The only difference is that hiroshimoot might allow AV1 webms when they're popular enough.
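If you want to put numbers on the encode-time point, it's easy to check yourself. A rough sketch, assuming ffmpeg with libx264/libx265 is installed and clip.mkv is a placeholder for any short sample:

```python
# Rough encode-speed comparison: x264 vs x265 at the same CRF.
# Assumes ffmpeg (with libx264/libx265) is installed; clip.mkv is any short sample.
import subprocess
import time

def encode(codec: str, out: str) -> float:
    """Encode clip.mkv with the given codec and return elapsed seconds."""
    start = time.perf_counter()
    subprocess.run(
        ["ffmpeg", "-y", "-i", "clip.mkv",
         "-c:v", codec, "-crf", "23", "-preset", "medium",
         "-an", out],
        check=True,
    )
    return time.perf_counter() - start

print(f"x264: {encode('libx264', 'x264.mkv'):.1f}s")
print(f"x265: {encode('libx265', 'x265.mkv'):.1f}s")
```

The same CRF number doesn't mean the same quality across the two encoders, but it gives a feel for how much slower x265 grinds.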

>The only difference is that hiroshimoot might allow AV1 webms when they're popular enough.
Dude, we still don't have VP9.

Okay grandpa.

10-bit H264 is shit though because barely any chips can decode it in hardware. HEVC 10-bit profiles, on the other hand, are supported by plenty of them.

No. The selling point of h264 is that it's effectively ubiquitous (especially the low profiles) and pretty efficient.
Trading that for up to 15% less space at the same fidelity doesn't look like a good idea to me.
Maybe when 265 becomes as widespread as 264.

>15%
You mean 40%-50% mate. You got confused there for a moment.

The fansubbing cabal still can't rip the 4K/8K TV broadcasts from Japan, so I wouldn't keep my hopes up.

I'm sceptical about the actual numbers, but my point still stands.
Space if dirt-cheap now.

>Space is
fix

Attached: 1542233847004.jpg (236x236, 14K)

av1 is shit
>why yes hello sir you have a low bitrate source?
>allow me to dnr it until it's a series of occluding blobs
>then apply a blur filter to it
Perfection.

>skeptical
You don goof

Attached: youconfused.png (909x121, 21K)

Most of those are rips of rips; were there any justice in the world, the encoders would be summarily and publicly executed for their crimes. Anyone downloading gets 5 years gulag.

PRAISE 265

Attached: 265.png (569x390, 32K)

Only the dead can know peace

Attached: not265.png (475x383, 28K)

I just want 60fps anime so my eyes stop jittering left and right during panning shots

That's more like 55%-60% less space

The correct way to fix the jitter is to have a multisync or 120Hz display.
I had a faux 120Hz TV for years and I deeply regret not replacing it many years earlier than I did.

I have no idea what you people are talking about

I-I know I just don't want to buy a new monitor.

Those Cleo encodes look like shit though. I know because I accidentally downloaded one.

Nobody cares about some video codec, fucking nerds

Panning shots look like shit even with a non-retarded refresh rate. It's just an inevitable consequence of 24fps and no motion blur.

>8k
Anime has barely any content in 4K, and the perceived difference from 1080p is not that big anyway. Anime should focus on implementing HDR or DV/HDR10+ formats instead, which would be a massive leap in quality.

user, I could make h264 smaller than h265 if I bitstarved it. Size doesn't mean shit without comparison screenshots.
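To make the point concrete, here's a sketch of how you'd bitstarve an encode; assumes ffmpeg is installed, and input.mkv / starved.mkv are placeholder names:

```python
# Bitstarving demo: force a tiny h264 file regardless of how efficient h265 is.
# Assumes ffmpeg is installed; input.mkv / starved.mkv are placeholder names.
import subprocess

subprocess.run(
    ["ffmpeg", "-y", "-i", "input.mkv",
     "-c:v", "libx264",
     "-b:v", "300k", "-maxrate", "300k", "-bufsize", "600k",  # starve the bitrate
     "-an", "starved.mkv"],
    check=True,
)
# The result is small, not good. Size alone proves nothing about the encode.
```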

anime has barely any content in 1080p. Most of anime is made in 720p, kyoani being the sole exception

Most encodes aren't even h264 despite it having been around for 10 or so years. I've no faith this is going to be the primary encoding method any time soon.

It looks OK. If you have a problem, I'd suspect your TV isn't 120Hz at all, or your player is syncing to 60Hz because you have more than one monitor and it's picking the failsafe.
Yeah, a 72Hz CRT monitor will look better than a 120Hz LCD monitor no matter what, but the difference between 60Hz and 120Hz LCD is still night and day.

Most HEVC encoders bitstarve their shit to death, so you essentially have a streaming experience, making the entire batch worthless. HEVC can be better than h264, but all its encoders are incompetent.

>bullshit
>more bullshit
>praising his favourite shit tier studio and more bullshit
These posts are pretty pathetic. Maybe 20% of anime are still rendered at 720p. The rest uses a higher resolution.

I'm pretty sure it's working as intended (72Hz, not 120Hz though). Pans just look janky even without the juddering. Better than 60Hz for sure but annoying nonetheless.

Attached: Untitled.png (752x156, 236K)

More like 90%. It's either 720p or some non-standard abomination like 864p/900p.

Wow, so it's either 720p or one of a hundred possible resolutions above 720p. Woah. Did the Ministry of Encodes tell you that every resolution lower than 1080p is 720p, retard? Besides, most bluray releases are 1080p, same with movies. So shut your whore mouth you retarded KusoAnus mongrel.

That higher res still isn't 1080p though.

480p is all you need

>Most of anime is made in 720p, kyoani being the sole exception
Read your own posts, KusoAnusNigger.

>most bluray releases are 1080p
Upscaled 1080p

stop making sense on my techmeme thread

Attached: 1508258713340.jpg (1068x720, 88K)

>Size doesn't mean shit without comparison screenshots.
Comparison screenshots don't mean shit either.
You can make a bitstarved video with decent stills, but any scene with lots of motion will turn into a complete mess (see YIFY encodes).
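Which is why, if you actually want to judge an encode, running a per-frame metric over the whole clip beats cherry-picked stills. A rough sketch, assuming ffmpeg is installed and encode.mkv / source.mkv (placeholder names) have matching resolution and frame count:

```python
# Run SSIM across every frame of an encode against its source instead of
# eyeballing a few stills; motion-heavy scenes show up as dips in the log.
# Assumes ffmpeg is installed; encode.mkv / source.mkv are placeholders.
import subprocess

subprocess.run(
    ["ffmpeg", "-i", "encode.mkv", "-i", "source.mkv",
     "-lavfi", "ssim=stats_file=ssim.log",  # per-frame scores written to ssim.log
     "-f", "null", "-"],
    check=True,
)
```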

265 is garbage tho

SASUGA AMD

Attached: 1560688674235.jpg (1021x503, 168K)

deliciousslurper.blogspot.com/2019/03/8k.html

Can you play 8K video straight from Japanese TV?

>AMD

Attached: 1553111563479.jpg (640x480, 78K)

This proves nothing. I can encode h264 at the exact same size and resolution.
What I need to see is research that covers encoding sources of different types, with side-by-side comparisons, resulting video samples, and preferably a brief analysis of the playback options and the resulting video stream.

brainlet here, i have no fucking idea what h264, 265, 10bit, 8bit, or whatever the fuck ANY of these terms mean
when i download a batch i sort by most downloaded and go with what everyone else downloaded

Do we even have 10bit on mobiles yet?

Mine can do it for H265 but not 264

redpill me on 10bit, what even does it mean

based sheep

Bigger palette of colors

Hardware or software decode? I have MX Player shitting itself on sw 10bit and mobile vlc plays it better but it is still awfully taxing.

Why has Daiz forsaken us?

You could, but with worse quality, so why even bother?

Essentially better quality at the same file size as an 8-bit encode (but you get diminishing returns with larger files),
but you won't be able to play them well on weak devices like mobiles, smart devices, etc.
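If you want to know what you're dealing with before it hits a weak device, the pixel format tells you. A quick sketch, assuming ffprobe is installed and episode.mkv is a placeholder for whatever you grabbed:

```python
# Check whether a download is 8-bit or 10-bit before trying to play it on a
# weak device. yuv420p means 8-bit; yuv420p10le means 10-bit (Hi10P / Main 10).
# Assumes ffprobe is installed; episode.mkv is a placeholder filename.
import subprocess

out = subprocess.run(
    ["ffprobe", "-v", "error", "-select_streams", "v:0",
     "-show_entries", "stream=codec_name,profile,pix_fmt",
     "-of", "default=noprint_wrappers=1", "episode.mkv"],
    capture_output=True, text=True, check=True,
)
print(out.stdout)  # e.g. codec_name=hevc / profile=Main 10 / pix_fmt=yuv420p10le
```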

Hardware

Fuck off Daiz.

me neither but it's nice to learn something new

What kind of hardware should I look for in my next device?

>worse quality
This is what I'm talking about. This poster implies that these files have the same video quality.
That's not obvious to me at all.

Kodi wiki has a list of mobile chipsets that support HEVC (and they note if it's 8-bit only):
kodi.wiki/view/Android_hardware

If you want H264 10-bit then you have to get a Tegra AFAIK.

It's been supported on fucking Atom since 2015.

Attached: 4L_Gvr3xuo3.jpg (1100x621, 221K)

I have an Oculus Go (which is Android). Can it decode it? If yes, what about energy consumption? What about my shitty cheap Android tablet?
What about my iPhone?
I bring that up because energy consumption and heat are very important in these cases.
Also
>Skylake
so no luck with my Ivy Bridge or Haswell desktops, I assume?
PlayStation 4 is out of the question, I assume, let alone 3.
So the only processor I have that supports it is the 8850H.
And even semi-modern toasters support h264. All that to save some space, which is dirt-cheap nowadays.

you fucking retard. your image is braindead zoomer tier. you have no clue what so fucking ever how many settings you can tweak to affect file size and image quality for both avc and hevc.

you clueless curry nigger go and make some more phone unboxings on youtube. let the adults talk about things you don't understand

Re-encoding 8-bit into 10-bit gives the encoder (h264/h265) more headroom for accurate calculations (it does not expand the colour spectrum the way capturing the video yourself in 10-bit would), meaning that if the bitrate is the same for 8-bit and 10-bit, the 10-bit counterpart will have better image quality.

Now, whether you can tell the difference with your own eyes, rather than relying on some artificial image metric that measures the quality loss against the source file, is another story.
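For what it's worth, producing such a 10-bit encode from an 8-bit source is just a pixel-format switch. A minimal sketch, assuming an ffmpeg build with 10-bit libx265 and placeholder filenames:

```python
# Re-encode an 8-bit source as 10-bit HEVC: the extra bit depth is requested
# simply by asking for a 10-bit pixel format; the encoder does the rest.
# Assumes ffmpeg is built with 10-bit libx265; filenames are placeholders.
import subprocess

subprocess.run(
    ["ffmpeg", "-y", "-i", "source_8bit.mkv",
     "-c:v", "libx265", "-crf", "20", "-preset", "slow",
     "-pix_fmt", "yuv420p10le",   # 10-bit 4:2:0 output from an 8-bit input
     "-c:a", "copy",
     "encode_10bit.mkv"],
    check=True,
)
```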

Support for it, however, is not nearly as widespread as for the 8-bit counterpart. Especially in hardware.

DAAAAAAAAAAAAAAAIIIIIIIIIIIIZZZZZZZZZZZZ

they're probably re-encodes

YET

I stream my animes from chrunchyroll.

kek

Great, where is HDMI 2.1 tho?
I need my HDR in 4K 4:4:4

At what refresh rate? Pretty sure you can get HDR 4K with no chroma subsampling at >60Hz via DP1.4.

Already here.
Some TVs have it; it's the GPU makers that haven't implemented it yet.

TVs don’t support DP
>just use a monitor bro
Monitors are nice and all but TVs basically give you double the screen size for the same money while still putting up decent numbers in reviews

Are games even ready for Dolby Vision and HDR10+? Not some basic HDR with static metadata.
60fps min

Attached: file.png (568x509, 35K)

They're not really intended for the same use case. TVs are bigger because they're meant for greater viewing distance. Monitors are meant for use at a desk and at that kind of viewing distance a monitor can easily cover the same area of your cone of vision as a bigger TV that is placed much farther away.

The thing is that you won't really have any problem with chroma subsampling on a TV, at least not in its intended use case. Not only will you generally be too far away to notice the loss of quality in fine detail, but more importantly, most video is already chroma subsampled anyway, so you're not actually losing anything by viewing it over a chroma-subsampled video connection.

Buy DP1.4 HDR monitors, they exist and will actually do more than 60Hz, up to ~100Hz or some shit like that, don't remember exactly.
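The ceiling is easy to sanity-check with back-of-the-envelope arithmetic, ignoring blanking and protocol overhead and taking DP1.4/HBR3's effective payload as roughly 25.92 Gbit/s after 8b/10b coding:

```python
# Back-of-the-envelope: uncompressed bandwidth for 4K 10-bit RGB/4:4:4,
# ignoring blanking and protocol overhead, vs DP1.4's effective payload.
DP14_GBPS = 25.92            # HBR3: 32.4 Gbit/s raw minus 8b/10b coding overhead
width, height, bpc = 3840, 2160, 10
bits_per_frame = width * height * bpc * 3   # three full-resolution channels (4:4:4)

for hz in (60, 98, 120):
    gbps = bits_per_frame * hz / 1e9
    ok = "fits" if gbps <= DP14_GBPS else "needs DSC or chroma subsampling"
    print(f"{hz:3d} Hz: {gbps:5.1f} Gbit/s -> {ok}")
```

So 4K 10-bit 4:4:4 at 60Hz fits with plenty of room, the limit lands just under 100Hz, and 120Hz needs DSC or chroma subsampling.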

AV1

Why do we even need hardware decoding? On PCs that are not potatoes.

What toaster have you got? My N3050 (netbook) can at least HW-decode 8-bit HEVC, and can almost keep up with 10-bit via software decoding.
Also has VP9 acceleration.

Why would you want to waste CPU time and power on a common task which can be much more efficiently handled via hardware acceleration?

Because I paid for those cores to actually use them.
Hardware decoding prioritizes speed over quality.

Navi's uArch is channeling Maxwell I see.

>Hardware decoding prioritizes speed over quality

Pretty sure hardware decoding does not change quality at all unless it's also used to apply some sort of post-processing or scaling (which shouldn't be used); it's the same bitstream, and it's not like it can be decoded to different pixels.

Since when the fuck did the nips stop broadcasting everything in 720p?

mpv.io/manual/master/
>Quality reduction with hardware decoding
>In general, it's very strongly advised to avoid hardware decoding

>what is copyback

except it hasn't and it doesn't work

>Believing the lies of MPV

KILL YOURSELF FAGGOT

Calm down, vlcdrone.

I don't know, taking screenshots with HW decoding enabled and disabled at the same timestamp literally produces files with the same checksum, so you're free to believe whatever you want I guess.
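If anyone wants to reproduce that: dump the same frame once with hardware decoding and once without (e.g. via your player's screenshot function), then compare the files. A sketch of the comparison step, assuming the two PNGs were already saved under the placeholder names hw.png and sw.png:

```python
# Compare two screenshots of the same frame, one taken with hardware decoding
# enabled and one with it disabled. Identical hashes mean bit-identical files.
# Assumes hw.png / sw.png were already captured; the names are placeholders.
import hashlib
from pathlib import Path

def sha256(path: str) -> str:
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

h_hw, h_sw = sha256("hw.png"), sha256("sw.png")
print(h_hw)
print(h_sw)
print("identical" if h_hw == h_sw else "different")
```

Identical hashes settle it; if they differ, compare the decoded pixels instead, since the PNG writer itself may not be byte-deterministic.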

But muh AV1.

Attached: 1552064189484.webm (1280x720, 2.9M)