Why doesn't windows or any standard linux file manager have a field that displays the size of a folder's contents...

why doesn't windows or any standard linux file manager have a field that displays the size of a folder's contents? i know it stresses the drive to index, but surely there is an easy way to do this, no? writing the sizes of my folders in renames helps me prioritize what i need to clean up / archive / zip / delete for disk space first

Attached: Capture.png (599x253, 21K)

Other urls found in this thread:

i.imgur.com/XNqsYJm.png
youtu.be/dFgneL5X3ag

maybe you should consider *what* it is you are storing, and whether it has value to you, other than the raw size of the files

'system32' is 5KB larger than 'pictures of bill murray', better delete system32. /g/, why is my computer broken

if my downloads folder takes up half the drive, then im going to go and do garbage collection on that fucking folder.
i never delete files i actually need. i data hoard entire youtube channels, and usually keep several shitty copies just in case a video gets taken down before i properly archive it, or i download music as mp3s before deciding something is worth dedicating FLAC space to.
i sometimes forget to delete the inferior copies of these files because im lazy, and sometimes i keep them around if i cannot find a good master in flac. quite a few albums i have actually sounded better in the shitty opus rips i did, not because of the bitrate but because of superior mastering.
also, files that are taking up space but are important go on my archive drive.
im not a dumb dumb user, im just smart with my time managing files. i can potentially clear more out of a 40GB folder than i can out of a 2GB folder, because it is likely i am going to find more garbage data in a bigger folder. fuck off.

Windows 10 Settings->Storage shows data sorted by size.

>prioritize what i need to clean up / archive / zip / delete for disk space first
I feel you bro. Have you ever used WizTree? It's great

Attached: 1552755595747.png (800x473, 29K)

This has been a question I've had for quite some time, too. Surely if the MFT is all written in contiguous space on a disk and it can be read sequentially into RAM and then sorted, it shouldn't be that hard to calculate this information. If anyone with filesystem development experience can shed light on this that'd be gr8/8.
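For what it's worth, the single-pass idea isn't hard to sketch. Below is a rough Python version with os.walk standing in for a sequential MFT read (actually parsing the MFT needs raw volume access, which this doesn't attempt): each file's size gets credited to every ancestor directory, so every folder total falls out of one traversal.

import os
from collections import defaultdict

def folder_sizes(root):
    """Return {directory: total bytes of everything beneath it} in one pass."""
    root = os.path.abspath(root)
    totals = defaultdict(int)
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            try:
                size = os.path.getsize(os.path.join(dirpath, name))
            except OSError:
                continue  # unreadable file, broken symlink, etc.
            # credit this file's size to its directory and every ancestor up to root
            d = dirpath
            while True:
                totals[d] += size
                if d == root:
                    break
                d = os.path.dirname(d)
    return totals

if __name__ == "__main__":
    # print the 20 biggest folders under the current directory
    for path, size in sorted(folder_sizes(".").items(), key=lambda kv: -kv[1])[:20]:
        print(f"{size / 2**20:10.1f} MiB  {path}")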

scanner2 for windows, look it up

I'm pretty sure I saw some on Linux. But what I actually use is just ncdu.

You probably want Windirstat or something on your Windows.

>siivagunner

It sounds simple but it would be too slow. There's also no easy way to store all these sizes and recalculate each time a single file changes. Just right click->properties

Can you explain the logic that got you to that conclusion? How are filesystem journals/tables/databases structured such that this would be 'too slow'?

>There's also no easy way to store all these sizes and recalculate each time a single file changes.
Actually this is pretty simple with Linux' inotify.
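Roughly like the sketch below. It's not a real daemon; it assumes the third-party watchdog package (pip install watchdog), which uses inotify on Linux, and it only accounts for files it actually sees change after startup.

import os
import time
from collections import defaultdict
from watchdog.observers import Observer
from watchdog.events import FileSystemEventHandler

sizes = {}                      # file path -> last size we accounted for
dir_totals = defaultdict(int)   # directory -> running total of its direct files

class SizeWatcher(FileSystemEventHandler):
    def on_created(self, event):
        if not event.is_directory:
            self._account(event.src_path)

    def on_modified(self, event):
        if not event.is_directory:
            self._account(event.src_path)

    def on_deleted(self, event):
        if not event.is_directory:
            old = sizes.pop(event.src_path, 0)
            dir_totals[os.path.dirname(event.src_path)] -= old

    def _account(self, path):
        try:
            new = os.path.getsize(path)
        except OSError:
            return
        old = sizes.get(path, 0)
        sizes[path] = new
        dir_totals[os.path.dirname(path)] += new - old

observer = Observer()
observer.schedule(SizeWatcher(), ".", recursive=True)  # watch the current tree
observer.start()
try:
    while True:
        time.sleep(5)
        for d, total in sorted(dir_totals.items()):
            print(f"{total:12d} bytes  {d}")
except KeyboardInterrupt:
    observer.stop()
observer.join()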

based

windirstat

i have it on my pcmanfm file manager
i.imgur.com/XNqsYJm.png

I just right click the folder and hit properties; if it's not a huge folder it'll show me instantly, and it's not like I do this very often.
Rather just eat the 2 clicks than have my computer shit itself every time I open my D drive.

>folder contains 1000's of files
yeah fuck that, bye bye disk life

The only way of consistently doing this would be to recursively look at everything inside the folder and sum it up, and that is a gargantuan task that bottlenecks the performance of the file manager.
Imagine opening your file manager to look at some file on a USB 2.0 drive and having to wait for every file inside it to have its size summed up, just to show the total size of each folder, even if you don't care about that information at that moment.
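For reference, that naive per-folder recount is literally just a walk-and-sum, something like the sketch below (the USB mount path is a made-up example); the pain is that it touches every file under the folder every single time you want the number.

import os

def folder_size(path):
    """Sum the size of every file under `path` by walking the whole subtree."""
    total = 0
    for dirpath, _dirs, files in os.walk(path):
        for name in files:
            try:
                total += os.path.getsize(os.path.join(dirpath, name))
            except OSError:
                pass  # skip files we can't stat
    return total

print(folder_size("/mnt/usb"))  # hypothetical mount point for that USB 2.0 drive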

r8 my workflow

Attached: vifm.webm (640x720, 510K)

>copy file to folder
>windows updates an attribute of the folder it's copying to, adding the file's size to the total size attribute

>delete file from folder
>windows updates an attribute of the folder removing the size of the file being deleted from the total size

Not hard to do, and it avoids disk I/O every time a folder is accessed.

Not like the shitty default file attributes windows uses, like tags, resolution, and date, which all require reading the header of each file every time a folder is accessed. So fucking inefficient.
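The bookkeeping that post describes really is just a counter per directory plus a loop up the tree; a minimal in-memory sketch is below (paths are made up, and where the filesystem would actually persist this in directory metadata is hand-waved away).

import os

totals = {}  # directory path -> total bytes of everything beneath it

def propagate(directory, delta, root):
    """Apply a size delta to `directory` and every ancestor up to `root`."""
    d = directory
    while True:
        totals[d] = totals.get(d, 0) + delta
        if d == root:
            break
        d = os.path.dirname(d)

def on_file_copied(dest_path, size, root):
    propagate(os.path.dirname(dest_path), +size, root)

def on_file_deleted(path, size, root):
    propagate(os.path.dirname(path), -size, root)

# Example: copying a 700 MB file three levels deep touches three counters,
# not the whole tree.
on_file_copied("/data/videos/archive/raw.mkv", 700 * 2**20, root="/data")
print(totals)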

Just change the columns to show size in the view. Ez

See above. And also learn how computers work.

Except you have to update the size for all of the parent folders as well. And also every time a file is modified you have to update the sizes of all folders and parent folders. And also if multiple files are being used at the same time you need a mutex to make sure only one file can be saved at a time so they don't conflict with each other updating the sizes of the parent directories. Oh wait, all files share the same top level directory, so now you can only save one file at a time.

Good job user, you're hired as an OS engineer.

Attached: mount-stupid.jpg (739x493, 39K)

you can just store a total size in the filesystem, it's a really easy operation if you do it every time you add or delete a file, just a single sum.

Now every time you write to the disk, every parent folder in the file system tree needs to recursively find every file inside it and sum its contents to update the index, and that would slow the system to a crawl; even a simple Ctrl+S in a text editor would trigger it.

>Except you have to update the size for all of the parent folders as well. And also every time a file is modified you have to update the sizes of all folders and parent folders.
are you implying that's a difficult task to perform? because it isn't

>6.2 GB re:zero
imagine not having the 1080p blu-ray rip

Attached: 1550638587050.jpg (225x225, 9K)

Now imagine a multi-tasking system, where there are hundreds of processes reading and writing to disk simultaneously, all asking the operating system to recount every parent folder's size.
Or, for example, a large file being downloaded in fragments, hitting the disk multiple times a second.
All this extra work just to show a number in the file manager.

Okay dipshits, if you have an efficient algorithm to recursively calculate folder sizes, fucking apply to Microsoft instead of bitching on Jow Forums

because it's fucking computationally expensive
windows does it in its tooltips though; mouse over a folder once and wait a bit before mousing over it again

There's no need for a mutex there bud

I'll have it done in a week

you are updating maybe 5-10 folders on average when you finish using a file stream
that's trivial

Yeah, you just increased the system I/O/disk seek load by 500% to 1000% in the average case, that's trivial.

>windirstat ;^)

imagine being so poor you can't afford an hdd lmao

Imagine having taste so bad you think Re:Zero is even worth archiving.

Attached: e6566a69bc0820f02750eba1f68610a476b2f55601dc1dca1d5aa2789026a3a4.png (825x707, 648K)

just invent faster hard disks then so it doesn't matter

>filthyfranktv
>idubbbz
>sii/v/agunner
If you know the words, you need to shoot yourself.

Literally.

There are duplicate-file checker programs.
>I'm smart with my time managing files
What you're doing isn't worth your time.
Download pic related if you want to sort by size.

Attached: windirstat-screenshot.png (1600x900, 1.26M)

>scrolling down to size instead of using hotkey
>saving anime on his "disk3"
>having goblin slayer
>having toradora take up 9GB instead of like 800mb

>writing the sizes of my folders in renames
Good god, how autistic can you get.

Just change the physical constants so it's possible to make faster hard drives, easy.

wait what?
almost all system folders in Windows and Linux have 1000s of files inside.
What are you saying?

Or you send a delta upwards. The path character limit in Windows means that you can't be more than, what, 80 directories deep in the worst case?

youtu.be/dFgneL5X3ag

This, except that program is eye cancer, so download SpaceSniffer instead.

du -sh

ncdu