There are plenty of utilities for analyzing disk usage by space (GUI ones such as Filelight, TUI ones such as dua), but I would like to view my folders based on the count of files instead. I'm making backups, and folders with lots of small files (e.g. node_modules) take a very long time to move around, so I figure I'd be better off compressing those into a single file before archiving, since it's already highly unlikely that I'll ever need to access them again. Thanks for any pointers in advance!
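
My rough plan for the compression step, assuming GNU tar is available (the archive name is just an example):

    # bundle the directory into a single compressed file before backing it up
    tar -czf node_modules.tar.gz node_modules/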

  • palordrolap@kbin.social · 8 months ago

    The find command could be your friend for getting a full depth count. Something like:

    find /path/name/here/ | wc -l

    Or just:

    find . | wc -l

    for the current directory.
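
    A small refinement, since the question was about file counts: plain find also counts directories (and the starting directory itself), so if only regular files matter, find's standard -type f test narrows it down:

    # count only regular files, ignoring directories and other entry types
    find /path/name/here/ -type f | wc -l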

    There’s also a command called locate (often installed as mlocate or plocate, but usually accessible by just that word) which maintains a database of filenames on the system that can be searched instead, provided it’s installed and has built that database.
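
    The database is normally refreshed by a daily cron job or systemd timer, but it can also be rebuilt by hand (assuming an mlocate/plocate-style setup; usually needs root):

    # build or refresh the locate filename database manually
    sudo updatedb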

    Pro: Faster than churning the disk with find every time (though the disk cache can alleviate some of find’s cost).

    Cons: Can get out of date as the filesystem changes, and it’s harder to use for relative paths like the . in the second find example above, since the database stores absolute paths.

    locate -r '^/path/name/here/' | wc -l

    … would be roughly equivalent to the first find example.
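
    And since the original question was about which folders hold the most files, here’s a rough sketch building on the find approach (standard coreutils only; note that files sitting directly in the current directory show up under their own names):

    # tally regular files per top-level subdirectory, busiest first
    find . -type f | cut -d/ -f2 | sort | uniq -c | sort -rn | head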