Preferred backup method on Windows

How does Jow Forums prefer to back up non-system files/drives on Windows?

The goal is to create an offline backup of an entire drive. The system isn't installed on this drive. The options I currently see:

>Third party proprietary cloning software such as Acronis and Macrium
I'm unsure about this route due to vendor lock-in and proprietary file formats that aren't widely supported. It could be annoying to restore the backup if access to the program is somehow lost. I'm also not sure I want to pay for something I can do for free, or how well these programs compare.

>Windows Image Backup
Creating an image produces a .VHD file, which as far as I'm aware isn't compressed. It also doesn't seem suitable for non-system backups, since the imaging isn't file-based.

>Windows Backup and Restore
This does compress the files, but it seems rather slow and not suitable for large amounts of data.

>Windows DISM or ImageX
Creates a .WIM file, which is file-based and also supports automatic verification of the data, something I'm not sure all third-party options support. Free, and it seems to be fast. The file format is proprietary, though, and I'm not sure all third-party tools can easily restore from it. Restoration may require DISM (see the sketch after this list).

>File archivers like 7-zip
This is the poor and lazy man's way of backing up. Relatively straightforward, no long installation process, no commands to learn. Can achieve a high amount of compression. The only issue I see is verification: 7-Zip can test an archive against its stored checksums (7z t; also sketched below), but you have to remember to run that on the copy yourself to confirm the backup arrived intact.
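
For reference, here's roughly what the DISM and 7-Zip routes look like when scripted, as a rough Python sketch. The drive letters and paths are made up, DISM needs an elevated prompt, and 7z is assumed to be on PATH:

```python
import subprocess

# --- DISM route: capture a data drive into a compressed, verified WIM ---
# (hypothetical letters: E:\ is the data drive, D:\backups holds the image)
subprocess.run([
    "dism", "/Capture-Image",
    "/ImageFile:D:\\backups\\data.wim",
    "/CaptureDir:E:\\",
    "/Name:DataDrive",
    "/Compress:max",   # WIM compression levels: none / fast / max
    "/Verify",         # verify files as they are written
], check=True)

# Restoring later also goes through DISM (or another WIM-aware tool):
# dism /Apply-Image /ImageFile:D:\backups\data.wim /Index:1 /ApplyDir:E:\

# --- 7-Zip route: archive, then test against the stored checksums ---
subprocess.run(["7z", "a", "-mx=9", "D:\\backups\\data.7z", "E:\\Data"],
               check=True)
subprocess.run(["7z", "t", "D:\\backups\\data.7z"], check=True)
```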

Attached: 32564946.png (850x464, 508K)

Just expose your computer to the internet and let the NSA back it up; whenever you need to restore, just post somewhere that you're planning a terrorist attack and retrieve your backups when they're presented in court.

But user, you won't have access to your backups in prison. Unless you live in Norway or something

rsync in the wsl

Pray.

I don't quite understand why you'd need to compress the stuff; you're not going to save much space unless it's a large volume of something easily compressible anyway.

>using Wangblows
I'll bet you use Android too.
Fucking retard

Acronis states a reduction to 60-70% of the original size for typical files:

kb.acronis.com/content/16791

This seems pretty efficient. 7-zip probably has comparable compression.

Install CrashPlan and go back to sleep.

Backup data files/folders (media, spreadsheets, game saves, etc) and export certain settings. Reinstall everything else when needed. Do not compress anything. Do not use containers.

>Do not compress anything.
Why not?

That's marketing bullshit. Modern programs have most of their data compressed already, especially if you exclude system files.
Compression is not worth it nowadays.
Honestly, the best way to make backups is to copy all your folders to an external drive every time you want a backup, then use a tool to remove duplicate files based on content (not timestamps).
Anything else is either buggy, risky, or expensive in terms of hardware (such as ZFS).

Not him, but because it's a waste of time, and if you lose a single bit you lose the whole file. What you might want to do instead is create redundancy data with Multipar.

So what you're saying is good ol' robocopy will do the trick? What about verifying the copy?

>What about verifying the copy?
md5sum
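
A minimal sketch of that verification pass, e.g. after a robocopy /MIR run. The source and backup paths are made up, and MD5 is used to match the suggestion (any hash works):

```python
import hashlib
from pathlib import Path

def hash_tree(root: Path) -> dict:
    """Map each file's path (relative to root) to its MD5 digest."""
    digests = {}
    for path in root.rglob("*"):
        if path.is_file():
            h = hashlib.md5()
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(1 << 20), b""):
                    h.update(chunk)
            digests[str(path.relative_to(root))] = h.hexdigest()
    return digests

src = hash_tree(Path("C:\\Data"))          # hypothetical source
dst = hash_tree(Path("E:\\Backup\\Data"))  # hypothetical copy
for rel in src.keys() ^ dst.keys():
    print("missing on one side:", rel)
for rel in src.keys() & dst.keys():
    if src[rel] != dst[rel]:
        print("mismatch:", rel)
print("OK" if src == dst else "differences found")
```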

GoodSync.

I like to see which files have changed, what's about to get deleted, etc.

There's a trial version so you can see how it works.

>Not him, but because it's a waste of time, and if you lose a single bit you lose the whole file.
Is this necessarily true with modern cloning software?

>robocopy
I'd do it manually through the file manager.
Automatic backups tend to fuck up in ways that are hard to detect until it's too late.
>What about verifying the copy?
Multipar not only hashes the files, it also produces redundancy data that lets you fix a file if it gets damaged anywhere in the chain. Make sure to create the par files before transferring the files to the backup media, though, and verify them right after creating them, because there could have been spurious errors during generation.

>Is this necessarily true with modern cloning software?
Yes, unless you add a recovery volume.
Now, notice I didn't say exactly what I meant by "file". If you use per-file compression, you only lose the file which had the flipped bit. If you use solid archives, you lose the whole backup.
If you go the recovery-volume route I recommend WinRAR (you can get a license file for free on YouTube), or make the archive and then create a parity file with Multipar as explained above (a parity file is basically the same as a recovery volume, just separate from the actual archive). Avoid QuickPar, it's buggy as fuck. And make sure you set the number of blocks in the GUI to a high value; otherwise a single flipped bit can knock out a block bigger than your recovery data can repair, making the whole thing useless.
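
If you'd rather script it than click through the Multipar GUI, the same PAR2 format can be generated with the par2cmdline tool; a rough sketch (the archive name is made up and the redundancy/block numbers are just example values):

```python
import subprocess

archive = "backup.7z"  # hypothetical archive to protect

# 10% redundancy spread over 3000 blocks; with many small blocks, a
# flipped bit only burns one block's worth of recovery data.
subprocess.run(["par2", "create", "-r10", "-b3000", archive], check=True)

# Verify right after creation, before the files touch the backup media.
subprocess.run(["par2", "verify", archive + ".par2"], check=True)

# If a later verify on the backup media fails, attempt a repair:
# par2 repair backup.7z.par2
```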

clonezilla.org/clonezilla-live.php

I usually just buy another drive the exact same size as the drive I want to back up, because even with compression I won't fit a second backup on the drive. It makes literally no sense for a normal person.
If you're backing up 100 TB+ it will of course make a difference, but not if it's just an 8 TB drive or some shit.
Also, compression takes YEARS and strongly depends on what you want to compress. Don't even try to compress video files; it's not worth the time. It can of course make a difference, but compressing and decompressing take a huge amount of time, and the better the compression, the longer it takes.
Just copy shit. If it's your OS drive, use Clonezilla; if it's just a data drive, copy the files over. No extra software needed.

>go through files
>decide what I give a shit about and what I'm sure I could obtain again easily
>buy shitty $50 HDD
>ctrl+c
>ctrl+v

Attached: b49.gif (652x562, 626K)

My experience with these programs that try to guess which files have been moved is that they don't work very well if you change the directory structure of your data. And if you match by timestamp and size, you can never be sure they're the same file anyway (corruption could have occurred on either end). Doing it by md5 would work, but it'll be slow as fuck, so at that point you might as well copy everything over and run the duplicate finder afterwards (possibly on a separate computer). The benefit is that you get a kind of incremental backup that inherently shows what has changed (files that haven't changed are removed from the older backups), and if an error happens in transfer, you still have the older copy, which the duplicate finder won't delete.
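
A toy sketch of that copy-then-dedupe flow, with made-up folder names; in practice you'd group files by size first so only potential duplicates get hashed:

```python
import hashlib
import os
from pathlib import Path

def digest(path: Path) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def prune_duplicates(old_backup: Path, new_backup: Path, dry_run=True):
    """Delete files from the older backup whose exact content also exists
    in the newer one; what survives in the old backup is the data that
    has since changed or disappeared."""
    new_hashes = {digest(p) for p in new_backup.rglob("*") if p.is_file()}
    for p in list(old_backup.rglob("*")):
        if p.is_file() and digest(p) in new_hashes:
            print("duplicate:", p)
            if not dry_run:
                os.remove(p)

prune_duplicates(Path("E:\\backup-2019-01"), Path("E:\\backup-2019-02"))
```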

>turns out half your files have been encrypted by ransomware
>you don't notice so you go ahead and update your backup
>now the files in your backup are encrypted too
wow, nice backup you have there faggot

That is only a good idea if you have like 4 times as many drives for backup as you have for your main data storage.
Otherwise this could happen

On macOS I'd say just use Time Machine, which works great because it creates a hardlink forest on the backup target, so the data is easily accessible to anything that can read the drive.

For Windows or cross-platform, try Duplicati. duplicati.com/
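
For anyone curious what the hardlink-forest idea looks like mechanically, here's a toy sketch in the spirit of Time Machine or rsync --link-dest (not their actual implementations); it uses a timestamp+size heuristic to decide what changed, and all paths are made up:

```python
import os
import shutil
from pathlib import Path

def snapshot(source, prev, dest):
    """Write a new snapshot of `source` into `dest`. Files unchanged since
    the previous snapshot become hardlinks to the old copies, so every
    snapshot looks complete while unchanged data is stored only once."""
    for src in source.rglob("*"):
        rel = src.relative_to(source)
        out = dest / rel
        if src.is_dir():
            out.mkdir(parents=True, exist_ok=True)
            continue
        out.parent.mkdir(parents=True, exist_ok=True)
        old = prev / rel
        st = src.stat()
        if (old.is_file()
                and old.stat().st_size == st.st_size
                and old.stat().st_mtime == st.st_mtime):
            os.link(old, out)       # unchanged: share the previous copy
        else:
            shutil.copy2(src, out)  # new or changed: store a real copy

snapshot(Path("C:\\Data"), Path("E:\\snap-01"), Path("E:\\snap-02"))
```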

Does it detect file moves based on timestamp+size or hash of the contents? Or some other method at the OS level?

duplicati

Only if you're dumb enough to use Windows and to keep your BACKUP DRIVE always connected. And that's retarded.

I don't think you understood what I meant.
Even if the malware doesn't touch your backup drive, if you don't keep incremental backups far enough back in time, you could inadvertently overwrite your good backups with the encrypted data coming from your infected computer.
Same thing with corrupted files that you don't access too often so you don't notice the corruption.
That's why you should keep a couple backups going exponentially back in time. For instance, one that is 1 day old, another that is 1 week old, another that is a month old, another that is 6 months old, and so on.
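
A toy sketch of that exponential retention scheme (the bucket ages are just example values):

```python
from datetime import date, timedelta

def snapshots_to_keep(snapshot_dates, today):
    """Keep one snapshot per age bucket (1 day, 1 week, 1 month, 6 months,
    1 year, 10 years): the oldest snapshot still inside each bucket.
    Everything else is a candidate for pruning."""
    buckets = [timedelta(days=d) for d in (1, 7, 30, 180, 365, 3650)]
    kept = set()
    for limit in buckets:
        inside = [d for d in snapshot_dates if today - d <= limit]
        if inside:
            kept.add(min(inside))  # earliest date = oldest snapshot in bucket
    return kept

snaps = [date(2019, 5, 30) - timedelta(days=n) for n in (0, 2, 10, 45, 200)]
print(sorted(snapshots_to_keep(snaps, date(2019, 5, 30))))
```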

Just sync your drive with Dropbox/OneDrive/iCloud/GDrive/whatever; no need to do anything manually, it's 2019.

See the ransomware posts above. Not to mention the privacy and risk implications of trusting a faceless corporation not to terminate your account for no apparent reason (for instance, if their ML systems flag you for copyright violation or whatever), or the service itself being shut down. Granted, it's probably about as likely as your house burning down or other shit happening to your home-made backups, even if you store copies in different places.

RAID array + NAS copy (also in an array); disconnect the NAS and move it to secure off-site storage at the end of the day.

Is that Stink Bomb?

Ctrl+c in data, Ctrl+v in data backup.

> hurr durr I'm a retarded autist
There's autistic software like ownCloud/Nextcloud/etc. for that; you can sync your data without the internet and without any corporation.

You can use MEGA, they have encryption and shiet, and you can keep/restore older file versions if you enable the option.

>b-but you need to upload to the cloud!
Ask me where I found your picture.

Attached: 1525812247385.jpg (213x237, 9K)

I'm not against storage services, just don't rely on their software to do the encryption for you.

> saw "cloud" in the name but didn't google it
You don't need any picture, brainlet.

I already know they're self-hosted apps, you fucking sperg. Just because something has "cloud" in the name doesn't mean it's better than the alternatives.