What is the most efficient compression algorithm these days?

>Asking a question this retarded
It's like asking what is the best video language: it depends.

Though if you want to compress the following things, these algorithms are pretty efficient:
- Log files: LZMA2 (lossless)
- Music: FLAC (lossless) mp3 (lossy)
- Images: PNG (lossless) jpg (lossy)
- Generic stuff (json, html doc, c/c++ source code, plain text, ...) : LZMA2/GZ/BZIP2 (lossless)
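For instance, LZMA2 round-trips losslessly straight out of Python's stdlib (a minimal sketch; the repetitive log line is made up):

```python
import lzma

# Hypothetical log-style data: repetitive text is where LZMA2 shines.
data = b"2018-05-10 12:00:00 INFO request handled ok\n" * 1000

compressed = lzma.compress(data, preset=9)  # .xz container, LZMA2 filter
restored = lzma.decompress(compressed)

assert restored == data                     # lossless: exact round-trip
print(len(data), "->", len(compressed), "bytes")
```

Lossless means that assertion always holds; lossy formats like MP3 or JPEG trade exact reconstruction for much smaller output.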

But what is the best video language?

Those archive formats are less efficient than 7z

Programming* not video

OP said compression, not archiving.

7zip is not a compression format in itself but rather another archiving format.

It's PHDA9

Sauce: mattmahoney.net/dc/text.html

"Which is the most efficient" and "which compresses the most" are different questions.
The phda9 you mentioned is the most compressive on the list you linked, but it is not the most efficient. A little further down you see durilca'kingsize, which produced a 16 MB final file while phda9 produced 15 MB, but phda9 took 85,877 seconds to compress and 86,365 to extract, while kingsize took 1,398 and 1,797.
In fact, neither of those two, nor anything on that outdated list, is the most efficient today.
For uncompressed text and binary/library files, the most efficient is RAZOR: encode.ru/threads/2829-RAZOR-strong-LZ-based-archiver
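A rough sanity check on those numbers (a quick sketch; the 10^9-byte input size is my assumption, matching enwik9, the test file on Mahoney's large text benchmark):

```python
# Figures quoted above; the 10**9-byte input is an assumption (enwik9).
INPUT_BYTES = 1_000_000_000

results = {
    # name: (compressed_bytes, compress_seconds)
    "phda9":            (15_000_000, 85_877),
    "durilca'kingsize": (16_000_000, 1_398),
}

for name, (out_bytes, secs) in results.items():
    ratio = INPUT_BYTES / out_bytes
    throughput = INPUT_BYTES / secs / 1e6  # MB/s
    print(f"{name}: {ratio:.1f}x ratio at ~{throughput:.2f} MB/s")
```

kingsize's output is only about 7% larger, but it compresses roughly 60 times faster, which is exactly the ratio-vs-efficiency distinction being made.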

Google photos compression. Seriously, upload and download a picture and then compare with the original. As long as it's not fucking RAW, you will never be able to tell the difference.

It's lossy, you idiot.

Pied piper

proprietary trash

MD5

it just werkz

The best compression algorithms are within spitting distance of each other, so it really doesn't matter anyway.

I had this idea as well, but it cannot work because of collisions.

>mp3 (lossy)
What? MP3 is good for legacy software but far from being the most efficient.
>PNG (lossless) jpg (lossy)
No, those are only very popular. Webp is more efficient in both categories. FLIF is even better for lossless. See

What is the best lossy compression for my source code?

>I had this idea as well, but it cannot work because of collisions.
It might actually work if we get a solution to the P = NP problem.
But you would need to store far more than a single hash to make it work, i.e. a reliable way to confirm that the reverse-hashed file is the one you want.
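The collision objection can be made concrete with the pigeonhole principle: a fixed-size digest has only finitely many values, so distinct inputs must eventually share one. A minimal Python sketch, truncating MD5 to a single byte so collisions show up immediately:

```python
import hashlib

def tiny_hash(data: bytes) -> int:
    """A deliberately tiny 'hash': the first byte of MD5 (256 possible values)."""
    return hashlib.md5(data).digest()[0]

seen = {}
for i in range(1000):              # 1000 distinct inputs, only 256 buckets:
    data = str(i).encode()         # a collision is guaranteed (pigeonhole)
    h = tiny_hash(data)
    if h in seen:
        print(f"collision: {seen[h]!r} and {data!r} both hash to {h}")
        break
    seen[h] = data
```

A real 128- or 256-bit digest only pushes the problem out of sight: there are still vastly more possible files than digest values, so the hash alone can never tell you which file to reconstruct.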

Set N to 1

P times one is P

wow, the CMI better give this genius guy $1,000,000 rn

>- Generic stuff (json, html doc, c/c++ source code, plain text, ...) : LZMA2/GZ/BZIP2 (lossless)
lmfao it's 2018, you stupid, xz exists

>t. just watched Silicon Valley

Faster and uses less memory than cmix
Amazing

From Wikipedia, the free encyclopedia

xz is a lossless compression program and file format which incorporates the LZMA/LZMA2 compression algorithms

en.wikipedia.org/wiki/Xz
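That container/algorithm relationship is easy to see from Python's stdlib, whose lzma module writes the same .xz format as the xz command-line tool (a small sketch):

```python
import lzma

# xz is the container format; LZMA2 is the filter doing the compression.
payload = b"xz wraps LZMA2\n" * 200
xz_bytes = lzma.compress(payload, format=lzma.FORMAT_XZ)

assert xz_bytes.startswith(b"\xfd7zXZ\x00")  # the xz magic bytes
assert lzma.decompress(xz_bytes) == payload  # lossless round-trip
```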