What is the best way to organize data across multiple hard drives...

What is the best way to organize data across multiple hard drives? I need a system where the hard drives are unplugged and offline but I can still browse all the files across all the drives, and then just plug in the drive needed to access a specific file as and when.

Attached: pileofharddrives.gif (375x221, 50K)


Bump, that's an interesting idea.

write a program that copies names of folders/items when double clicked to a txt file

I'd also like to know if something like this exists, ideally without storing everything in some sort of container file but instead storing files normally on each disk and perhaps having a complete, browsable index on each drive or some such.

Personally I gave up on it and simply sorted files based on their type to different drives (for instance one for TV shows, one for movies, one for system backups and programs, etc.)

> write a program
So, interestingly enough, I couldn't find any software. ls does not have a tree mode. tree does not show the file type. Any ideas?

winworldpc.com/product/neverending-disk/2x

Considering that even the Windows cmd offers the option to write all files to a txt file, I'm sure there will be some terminal equivalent on Linux to achieve the same (if you happen to use it).
Just google the command, since I regularly forget it, but that's how I organize my files across 5 hard drives. One txt for each of them.

No need to write a program.
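For reference, the Windows command that anon probably means is `dir /s /b D:\ > D-index.txt`. A rough `find` equivalent for Linux, one text index per drive — mount points and the index directory here are made up, adjust to your own setup:

```shell
# One plain-text index per drive. The paths below are examples only.
index_drive() {
    # $1 = mount point of the drive, $2 = label for the index file
    mkdir -p "$HOME/drive-indexes"
    find "$1" -type f > "$HOME/drive-indexes/$2.txt"
}

# index_drive /mnt/tv-shows tv-shows
# grep -i 'that one episode' "$HOME/drive-indexes"/*.txt   # search while drives are offline
```

Run it once per drive while the drive is plugged in, then grep the txt files any time.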

The only thing that does something similar to what you're describing is MHDDFS.

So you just want a database with all the files names and attributes on your drives?

ls -R -l

There's a search program I used to use in the Vista era called Everything. I think it has manual indexing. I just remember it being fast as fuck.

Sounds like how catalogs work in C1/LR. They take your large files (RAW images) and turn them into small nondestructive previews. You can edit the previews without a connection to the drive that houses the images themselves.

find / -type f > myfiles.db

I have just the thing for you, wrote it myself. It indexes all attached hard drives that follow my archive naming scheme, lists the topmost directories (with some exclusions I defined) and pipes the list to gpg. The crawl program, which I also wrote, decrypts every list and loads it into memory, then filters for the search terms you gave it or, if desired, applies a supplied regex.
I had something like
said before, but it honestly sucks compared to a proper indexing system.

But what about dirs and symlinks.
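If you index with GNU find's `-printf`, dirs and symlinks end up in the listing too, tagged by type. A sketch — the mount point is a placeholder, and `-printf` is GNU find only:

```shell
# GNU find only: %y prints the entry type
# (f = regular file, d = directory, l = symlink).
index_with_types() {
    find "$1" -printf '%y %p\n'
}

# index_with_types /mnt/archive > archive-index.txt
# grep '^l ' archive-index.txt   # show only the symlinks
```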

Cathy

>Cathy

That's good 16-bit software, but it doesn't work on 64-bit Windows sadly

RAID

>What is the best way to implement my stupid idea?

name every drive
create links
look up the link if you don't remember which drive to connect

here you go stupid

Attached: dfg.jpg (500x376, 29K)
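One way to do the link idea: mirror each drive as a tree of symlinks. The links go dead when the drive is unplugged, but they stay browsable, and `readlink` tells you which drive to plug back in. A sketch with invented paths:

```shell
# Mirror a drive's files as symlinks under ~/drive-map/<label>.
# /mnt/movies below is an example mount point.
map_drive() {
    src=$1 label=$2
    find "$src" -type f | while IFS= read -r f; do
        link="$HOME/drive-map/$label/${f#"$src"/}"
        mkdir -p "$(dirname "$link")"
        ln -sf "$f" "$link"
    done
}

# map_drive /mnt/movies movies
# readlink "$HOME/drive-map/movies/somefilm.mkv"   # shows the path on the unplugged drive
```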

Write a script or program to traverse your file system(s) and store the hierarchy in a database. You'd have to make the script/program able to check if it is physically interfaced/mounted with the devices, or just has the cache of the file system structure. Doesn't sound all that complicated, desu.
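A minimal sketch of that post in shell: cache each drive's listing, refresh only when the drive is physically mounted, and grep the caches while the drives are offline. The paths and the `mountpoint` check are assumptions about your setup:

```shell
# Per-drive file-list cache. Refresh only runs when the drive is
# actually mounted; otherwise the stale cache keeps serving searches.
CACHE_DIR="$HOME/.drive-cache"

refresh_if_mounted() {
    mount=$1 label=$2
    mkdir -p "$CACHE_DIR"
    if mountpoint -q "$mount"; then
        find "$mount" > "$CACHE_DIR/$label.txt"
    fi
}

search_cached() {
    # Search every cached listing, mounted or not.
    grep -i -- "$1" "$CACHE_DIR"/*.txt
}

# refresh_if_mounted /mnt/backups backups
# search_cached 'tax_return'
```

Swap the text files for a real database if you want attributes too; the mount check is the important part.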

I've been using it on Win 7 64-bit for years. Are you on Win 10?