.rar

>.rar

Attached: e75.jpg (251x251, 4K)


>.war

Attached: 1497863995549.gif (500x370, 1.88M)

>.tar.gz

freetards must hang for this

>.par2
>Need more blocks

Attached: 1531224004496.jpg (586x751, 174K)

It's a tar compressed with gzip. It's perfectly logical.
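Rough Python sketch of that layering, with made-up file names, just to show it really is two steps stacked:

import gzip
import shutil
import tarfile

# Step 1: bundle the files into a plain, uncompressed tar archive.
with tarfile.open("bundle.tar", "w") as tar:
    tar.add("somefile.txt")  # hypothetical input file

# Step 2: gzip the whole tar stream, giving bundle.tar.gz.
with open("bundle.tar", "rb") as src, gzip.open("bundle.tar.gz", "wb") as dst:
    shutil.copyfileobj(src, dst)

# tarfile can also do both steps in one go with mode "w:gz".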

I make little T-Rex hands when I say "Raaaaaar" to people.

>torrent has many files of already compressed media in a rar
why are scenefags so incompetent?

Don't the distortions usually happen from top to bottom?

This is so retarded, especially for games, where there's like an hour of decompression time.

Why? It's a perfectly sane format, allowing you to preserve permissions even after compression.
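If anyone doubts it, a quick Python peek at what a gzipped tar actually stores per entry (archive name made up):

import tarfile

# Open a gzipped tar and list the permission bits and ownership
# recorded for every member.
with tarfile.open("release.tar.gz", "r:gz") as tar:
    for member in tar.getmembers():
        print(member.name, oct(member.mode), member.uname, member.gname)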

.dag

Attached: Hiroshimoot.png (378x288, 107K)

>.kgb

>two extensions
dumb spergs

>.dag

Attached: GARTH1.png (400x307, 169K)

I remember downloading some "Linux distros" once, and it was literally 4 nested .rars of 50 parts each.
Scene faggots are truly, truly brain dead. They do it that way because that's the way it's done. They don't think about it any further than that.

Split-up RAR files (plus the associated PAR2 recovery files, in case some parts are already beyond retention time or otherwise lost) often originate from Usenet, where certain servers may have a rather small size limit for submitted posts. It's mostly a non-issue with scene FTPs, where files are only limited by the filesystem; even so, gigabyte-sized chunks are much more common there.
Whatever webkiddie created the torrent not only didn't bother to unpack those parts (mostly so they could be the first to upload a torrent), but also wrapped them in another archive.
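The splitting itself is nothing magic; a rough Python sketch with an assumed part size and a hypothetical file name:

PART_SIZE = 50 * 1024 * 1024  # assumed 50 MB per piece, roughly like old .r00/.r01 volumes

def split_into_parts(path, part_size=PART_SIZE):
    # Cut one big file into numbered pieces small enough for a post size limit.
    with open(path, "rb") as src:
        index = 0
        while True:
            chunk = src.read(part_size)
            if not chunk:
                break
            with open(f"{path}.{index:03d}", "wb") as part:
                part.write(chunk)
            index += 1

split_into_parts("linux_distro.rar")  # hypothetical input file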

>Lt. Tasha .rar

Attached: tasha.jpg (640x480, 33K)

>.isz

Attached: 1515574453780.png (730x844, 137K)

>having to use tar to preserve metadata

Freetards actually believe this.

>literally anything except .iso

Attached: image.jpg (220x293, 20K)

bin+cue master race

that must have been long, long ago. nobody's using usenet to distribute anything these days - and usenet is the only good reason to split a .rar file into 200 pieces. since you're clearly too young to remember: this is the way it had to be done back in the early 1990s. and it worked just fine. also, it had the added advantage of making it easy to put things on floppies once you had downloaded the dozens of files needed.

you could do it other ways but using tar works fine. it also has the advantage that one tool can be used with various forms of compression like gzip, bzip2, lzma and so on. it's totally logical to have one tool handle multiple files, file permissions and ownership.

yes, I know these are concepts hard to grasp for wintoddlers who aren't used to files having permissions and user and group ownership...
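the "one tool, several compressors" point in python terms (names are placeholders):

import tarfile

# Same tool and interface, different compression backends.
for mode, suffix in [("w:gz", ".tar.gz"), ("w:bz2", ".tar.bz2"), ("w:xz", ".tar.xz")]:
    with tarfile.open("backup" + suffix, mode) as tar:
        tar.add("project/")  # permissions, ownership and directory structure ride along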

Attached: downloadfile.jpg (720x1004, 58K)

People are still paying for access to private Usenet servers that exclusively carry binary newsgroups, you know. Together with NZB descriptor files you can find on the web (which only point to the Message-IDs of the required parts), it's a bit like a centralized BitTorrent.
en.wikipedia.org/wiki/NZB
With most of the GUI clients out there, normal users may not even realize their downloads are split into a lot of pieces and unpacked afterwards.
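An NZB is just XML pointing at article Message-IDs; a rough parsing sketch, assuming the usual namespace and a made-up file name:

import xml.etree.ElementTree as ET

# Namespace commonly used by NZB files (see the linked Wikipedia article).
NS = {"nzb": "http://www.newzbin.com/DTD/2003/nzb"}

tree = ET.parse("example.nzb")  # hypothetical descriptor file
for file_elem in tree.getroot().findall("nzb:file", NS):
    print("file:", file_elem.get("subject"))
    for seg in file_elem.findall("nzb:segments/nzb:segment", NS):
        # Each segment is one Usenet article, addressed purely by its Message-ID.
        print("  part", seg.get("number"), "->", seg.text)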

Why not something modern and efficient like .xz, for example? Open source and with advanced algorithms.

too soon

You know what always pissed me off?
My fucking MacBook decided to make unarchive the default program for opening .tar.gz files. So far so good, except it was too retarded to realize the .tar.gz archive was meant to be extracted, so what it did was compress it again, and out came a .tar.gz.zip file.
Completely autistic shit program.

Attached: 601.jpg (657x527, 27K)