Which encode would have better quality? (Assuming competent encoders/similar settings)
H264 encode vs H265 10bit encode
H264 remux vs H265 10bit encode
H264 vs H265
"Similar settings" is a nonsense term if you're using two different encoders and/or two different standards. Beyond encoder, settings, standard, decoder also plays a role.
The h.265 standard exposes more features, some of which require additional computational power. h.265 has greater potential for increasing the quality:filesize ratio, though you can assume most of that comes at the cost of additional computation.
Your post is retarded. Both can be 10-bit, though 10-bit AVC is a literal meme.
A re-encode is always lossy unless you specify otherwise, so a fucking remux is always going to be better.
At similar CRF settings HEVC is supposed to be better, but the gap narrows as you increase bitrate. This also means that at high resolutions HEVC is usually better, because you'll almost always be bit-starved.
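To put the remux vs. re-encode point in concrete terms, here's a minimal sketch with ffmpeg driven from Python; the filenames are made up and the CRF/preset values are just plausible placeholders, not recommendations (10-bit HEVC also assumes a libx265 build with main10 support).

```python
import subprocess

SRC = "movie_bd_remux.mkv"  # hypothetical Blu-ray source

# Remux: copy every stream into a new container, zero generation loss.
subprocess.run(
    ["ffmpeg", "-i", SRC, "-map", "0", "-c", "copy", "movie_remux.mkv"],
    check=True,
)

# Re-encode: lossy by definition unless you explicitly ask for lossless.
subprocess.run(
    ["ffmpeg", "-i", SRC,
     "-c:v", "libx265", "-preset", "slow", "-crf", "20",
     "-pix_fmt", "yuv420p10le",   # 10-bit output
     "-c:a", "copy",
     "movie_x265_10bit.mkv"],
    check=True,
)
```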
Which would be better for a movie PRODUCED in 2k?
4k h265 10 bit encode
1080 bluray remux
Something that is not lossy
2K?
As in?
2048x1536 4:3?
2048x1152 16:9?
H265 > webm VP9 > H264
All I know is that IMDB says this:
Cinematographic Process:
Digital (1080p) (source format)
Digital Intermediate (2K) (master format)
In terms of quality or efficiency?
Then it's stretched if they use 2048x1080
The standard is bastardized by TV "standards".
4K is still 4096x2304 for 16:9 and 4096x2560 for 16:10.
UHD """"""""""""""""""""""""""4K"""""""""""""""""""""""""" 3.8K is for some reason accepted even though it doesn't live up to the 4K requirements because muh TV "standards".
Thought for sure you meant 1440p; for some reason people call that 2K even though it's closer to 3K than 2K.
>not downloading xvid
lmaoing at you are lives
>downloading xvid
Are you still using WinMX, DirectConnect or Kazaa perhaps?
>not knowing what a remux is
lmaoing @ ur lyf for not downloading 35GB bd rips.
Literally takes 5 minutes of download tym
Quality by itself isn't really a thing: h264/h265 both support ~lossless (I don't remember if vp9 does). Efficiency also has a couple of axes to work on: compression vs. speed vs. cost, and on what source. There's also parallelism to take into account, i.e. effectiveness across multiple threads/cores.
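For reference on the ~lossless point, both reference encoders expose a lossless mode, and libvpx-vp9 does have a lossless flag as well. A minimal sketch of reaching them through ffmpeg; the filenames are hypothetical:

```python
import subprocess

SRC = "clip.mkv"  # hypothetical input

# x264: CRF 0 puts libx264 into its lossless mode.
subprocess.run(
    ["ffmpeg", "-i", SRC, "-c:v", "libx264", "-crf", "0", "x264_lossless.mkv"],
    check=True,
)

# x265: lossless is a dedicated switch passed via -x265-params.
subprocess.run(
    ["ffmpeg", "-i", SRC, "-c:v", "libx265", "-x265-params", "lossless=1",
     "x265_lossless.mkv"],
    check=True,
)

# VP9: libvpx-vp9 has an explicit lossless option too.
subprocess.run(
    ["ffmpeg", "-i", SRC, "-c:v", "libvpx-vp9", "-lossless", "1",
     "vp9_lossless.webm"],
    check=True,
)
```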
I meant something like
1080p H264 remux vs 4k H265 10bit encode
muxes have nothing to do with encodes or codecs.
>CRF settings
Absolutely useless comparison: CRF values aren't calibrated the same way between the two encoders, so there's no correlation on this.
I mean you could totally compare crf 0 filesizes
And get results for a situation that's generally useless for the main intended use case of both encoders.
Damn, H265 is slow. While it looked to give around a 20% reduction in file size, it came at 30% of the speed of 10-bit H264.
Well, that's going to be encoder-specific anyway. You'll likely have options for throwing moar cores at it, too.
Does it take a day to encode 6 minutes of video? h.265 is a jackrabbit compared to AV1.
No, I got 1.5 fps, compared to 4.3 for 10-bit H.264 using some unnecessary settings (12 reference frames when 7 would do, and merange 28 when 24 would do).
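Just to put those fps figures in perspective, a back-of-the-envelope calculation, assuming a 2-hour film at 24 fps (the runtime is an arbitrary example; the 1.5 and 4.3 fps numbers are the ones quoted above):

```python
# Rough wall-clock estimate for the encode speeds quoted above.
frames = 2 * 60 * 60 * 24          # 2-hour film at 24 fps = 172,800 frames
for label, fps in [("x265 10-bit", 1.5), ("x264 10-bit", 4.3)]:
    hours = frames / fps / 3600
    print(f"{label}: ~{hours:.1f} hours")  # ~32.0 vs ~11.2 hours
```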
I'm sorry, did you just say you only get 4.3 frames encoded per second with h.264 but somehow manage 1.5 frames per second with AV1? I think you might be missing a dozen decimal points.
No, that was 265 vs 264.
h265 is more efficient and smaller in size, not better looking
More efficient would mean better looking at the same file size.
Dunno much about it. Just that drives are cheap and my H.264/Xvid DVD rips look damn good on my 65" Samsung TV. But I sit like 8 ft away from it so maybe that makes a difference.
265 has worse blacks
no, it means a 10 MB h265 will look the same as a 15 MB h264; that's efficiency
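Put numbers on that: for a fixed runtime, file size maps directly to average bitrate, so "10 MB h265 looks like 15 MB h264" is just a claim of roughly a 33% bitrate saving at equal quality. A trivial sanity check (the 25-minute runtime is an arbitrary example):

```python
def avg_bitrate_kbps(size_mb: float, minutes: float) -> float:
    """Average bitrate implied by a file size and runtime (1 MB = 1000 kB)."""
    return size_mb * 8 * 1000 / (minutes * 60)

runtime_min = 25  # arbitrary example runtime
for codec, size_mb in [("h265", 10), ("h264", 15)]:
    print(f"{codec}: {avg_bitrate_kbps(size_mb, runtime_min):.0f} kbps")
# h265: 53 kbps, h264: 80 kbps -- the same ~33% saving at any runtime
```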
You can make H265 15MB.
>which encode would have better quality
the untouched Bluray rip
>H264 encode vs H265 10bit encode
>H264 remux vs H265 10bit encode
The first, in both cases.
H.265 will always be better than H.264 at the same bitrate because it's a more efficient codec; the only reason it isn't used more is that encoding is much slower and it isn't as widely supported.
That's not true at all. The x265 encoder is nowhere near as good as the mature h.264 encoders. The best h.265 encoder, Netflix's, isn't open source, so people are stuck using x265. Encoders at private trackers regularly test new versions and it's still found to be lacking, although improving. Encoding speed and lack of playback options are two of the reasons it isn't used, but they aren't the only ones, and they aren't even the main ones.
After a lot of experimentation here's what I discovered for myself:
1080p original content or lower resolution, primarily 720p: use h.264 (x264 encoder), it's incredibly solid, mature, it just fucking works and it does great.
Anything above and beyond 1080p resolution: look into using h.265 (x265 encoder), because that's what it's designed for. h.265 doesn't really offer any big space savings or quality improvements over h.264 at 1080p and lower; by design it's meant for the UHD resolutions, where the format itself benefits from the larger coding tree units (up to 64x64, versus h.264's 16x16 macroblocks). (Rough command sketch at the end of this post.)
I've spent enough time doing enough test encodes to not give a fuck anymore, h.264/x264 is just fine with me for 1080p and lower resolutions, I can even get good/great looking encodes using Intel Quick Sync with my Kaby Lake CPU nowadays.
I don't give a fuck about making perfect encodes for mobile devices, I don't have a 60" OLED display on a wall, or a projector, or a home theater setup, I have my ThinkPad and a smartphone and as long as I can watch it and hear it, that's fucking good enough.
Besides, I have the Blu-ray source for some stuff and then if I want a really high quality version I can always just snag something from a torrent or Usenet.
Worrying so much about this shit? Fuck that, it's not worth the hassle anymore.
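A minimal sketch of the 1080p-and-below vs. above-1080p split described in that post, as ffmpeg invocations driven from Python; the file names, CRF values and presets are placeholders, not tested recommendations:

```python
import subprocess

def encode(src: str, out: str, height: int) -> None:
    """Pick x264 for <=1080p sources, x265 10-bit above that (per the post above)."""
    if height <= 1080:
        video = ["-c:v", "libx264", "-preset", "slow", "-crf", "18"]
    else:
        video = ["-c:v", "libx265", "-preset", "slow", "-crf", "20",
                 "-pix_fmt", "yuv420p10le"]
    subprocess.run(["ffmpeg", "-i", src, *video, "-c:a", "copy", out], check=True)

encode("episode_1080p.mkv", "episode_1080p_x264.mkv", 1080)  # hypothetical files
encode("movie_2160p.mkv", "movie_2160p_x265.mkv", 2160)
```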
No, I just looked at it, and at default settings it needed more than 10-bit x264 to reach transparency. It was smoothing the grain. Only after adding a page full of options cribbed from VCB-Studio did it come out better.
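If anyone wants to poke at the same thing, the usual knobs people reach for when x265 smooths grain are SAO, AQ mode and psy-rd. A hedged sketch below: these are real x265 options, but the specific values are only illustrative and are not the actual VCB-Studio settings:

```python
import subprocess

# Commonly tweaked x265 options when the defaults smooth away film grain:
# disable SAO, use a stronger AQ mode, raise psy-rd. Values are examples only.
x265_params = "sao=0:aq-mode=3:psy-rd=2.0"

subprocess.run(
    ["ffmpeg", "-i", "grainy_source.mkv",        # hypothetical input
     "-c:v", "libx265", "-preset", "slow", "-crf", "18",
     "-pix_fmt", "yuv420p10le",
     "-x265-params", x265_params,
     "-c:a", "copy",
     "grainy_x265.mkv"],
    check=True,
)
```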
Visual quality is not a linear thing. Something being significantly better than something else at 1 Mbps may not still be better at 10 Mbps.
>better quality
Do you even know what you're talking about?
There's nothing better than remux.
wth, op
>tfw you'll never get a high quality 100 MB ~25 minute anime in h264
Meanwhile high quality 100 MB 25 minute anime in h265 are abundant.
>100MB