Dust is a rewrite of du (in Rust, obviously) that visualizes your directory tree and what percentage each file takes up. But it only prints as many files as fit in your terminal height, so you see only the largest ones. It’s been a better experience than du, which isn’t always easy to navigate when hunting for big files (or at least I’m not good at it).
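
If you want to try it, the crate is published as du-dust (many distros also package the binary as dust), so getting started is roughly:

    cargo install du-dust   # or install the dust package from your distro
    dust                    # biggest entries under the current directory
    dust /var/log           # or point it at a suspect path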

Anyway, I found a log file at .local/state/nvim/log that was 70 GB. I deleted it. Hope it doesn’t bite me. I’d been pushing around 95% disk usage for a while, so this was a huge win 👍

    • Aatube@kbin.social · 1 year ago

      don’t worry, they’ve just been using neovim for 700 years, it’ll be alright

    • Nik282000@lemmy.ca · 1 year ago

      So I found out that qbittorrent generates errors in a log whenever it tries to write to a disk that is full…

      Every time my disk was full I would clear out some old torrents; then all the pending log entries would write and the disk would be full again. The log was well over 50 GB by the time I figured out that I’m an idiot. Hooray for having dedicated machines.

      • grue@lemmy.world · 1 year ago

        That’s not entirely your fault; that’s pathological on the part of the program.

      • Free Palestine 🇵🇸@sh.itjust.works · 1 year ago

        I once did something even dumber. When I was new to Linux and the CLI, I added a recursive line to my shell config that would append itself to the shell config. So I pretty much had exponential growth of my shell config, and my shell took ~20 seconds to start up before I found the broken code snippet.

    • rutrum@lm.paradisus.day (OP) · 1 year ago

      If you have ideas please let me know. I’m preparing to hop distros so I’m very tempted to ignore the problem, blame the old distro, and hope it doesn’t happen again :)

      • I would have to look at the log file. Some plugin probably has an issue and writes massive amounts of data to the log every time you use Neovim. Monitor the growth of the log file and contact me via DM if it goes crazy again, I’m gonna try to figure out what’s going on.

    • oldfart@lemm.ee · 1 year ago

      I install ncdu on any machine I set up, because installing it later, once the disk is already full, can be tricky.

    • dan@upvote.au · 1 year ago

      Try dua. It’s like ncdu but uses multiple threads, so it’s a lot faster, especially on SSDs.

    • blarp@lemmy.ml · 1 year ago

      i use gdu. it’s written in go, which sucks, but it’s faster and looks nicer

        • blarp@lemmy.ml · 1 year ago

          even tho ken thompson was involved in its development, it’s still a google product and i hate google. and if you don’t like that, then continue to feed their monopoly on the front-end of the internet…whatever bro

          • pete_the_cat@lemmy.world · 1 year ago

            So you hate a language just because of who it’s associated with? That’s dumb. Go is an awesome language; I used it at work for 2 years.

        • ferret@sh.itjust.works · 1 year ago

          Garbage collected languages will never not be at least slightly ick

          Edit: I did not expect this to be so controversial, especially in regard to go, but I stand by my statement.

          • dan@upvote.au · 1 year ago

            Counterpoint: I’ve never used Go myself, but I love that Go apps usually seem to be statically-linked executables. I have a few utilities (such as runitor) that I ‘deploy’ to my servers by just copying them into /usr/local/bin using Ansible.
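
            For anyone curious, a fully static build plus that copy-style ‘deploy’ is roughly the following (mytool is a placeholder name; CGO_ENABLED=0 keeps cgo from pulling in a libc dependency):

                CGO_ENABLED=0 go build -o mytool .   # fully static for pure-Go code
                scp mytool server:/usr/local/bin/    # the whole 'deployment'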

            • pete_the_cat@lemmy.world · 1 year ago

              Go is awesome, yet a slight pain in the ass in some ways, but you get used to it. I was doing DevOps stuff with it for 3 years. I like it so much more than Python.

      • anagram3k@lemmy.ml · 1 year ago

        Yes, it looks very similar. The ncdu author is making a new, improved and faster version in Zig.

    • FigMcLargeHuge@sh.itjust.works · 1 year ago

      But did he even look at the log file? They don’t get that big when things are running properly, so it was probably warning him about something. Like “Warning: Whatever you do, don’t delete this file. It contains the protocol conversion you will need to interface with the alien computers to prevent their takeover.”

      • damium@programming.dev · 1 year ago

        PTSD from the days long ago when the X11 error log would fill up the disk whenever certain applications were used.

  • TheAnonymouseJoker@lemmy.ml · 1 year ago

    MINI GUIDE TO FREEING UP DISK SPACE (by a datahoarder idiot who runs on 5 gigs of free space out of 4 TB)

    You will find the most trash with a combination of four tools: Czkawka (duplicates and big files), Dupeguru (logs), VideoDuplicateFinder by 0x90d, and tune2fs.

    VDF finds duplicates by comparing multiple frames of each video, even with reversed frames; you can set the similarity percentage and the duration of videos to compare. It is the best tool of its kind, with nothing to match it, and uses ffmpeg as its backend.

    There is a certain amount of disk space reserved on each partition for root or privileged processes, but users who create a separate /home partition do not need this reserved space there. 5% is reserved by default, no matter whether your disk is 1 TB, 2 TB or 4 TB. To change this, use sudo tune2fs -m N /dev/<partition> (where N is the percentage to reserve; it can be 0 for /home, but NEVER touch root, swap or other partitions; use GParted to check which partition is which).
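
    For example (the device name is illustrative; check yours first):

        sudo tune2fs -l /dev/sdb2 | grep -i 'reserved block'   # show the current reservation
        sudo tune2fs -m 0 /dev/sdb2                            # drop it to 0% on /home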

    Regular junk cleaning on Linux can be done with BleachBit. Wipe free disk space at least once every 3-6 months.

    On Windows, use PrivaZer instead of BleachBit.

    Since all of these are GUI tools (except tune2fs, which is a single command, no command-line hackerman knowledge needed), this guide targets users with at least the tech literacy to replace cracked EXEs in pirated games on Windows.

  • bizdelnick@lemmy.ml · 1 year ago

    I usually use something like du -sh * | sort -hr | less, so you don’t need to install anything on your machine.
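
    Spelled out, for anyone new to the flags:

        du -sh * | sort -hr | less
        # du -s: one summary line per item; -h: human-readable sizes (K/M/G)
        # sort -h: orders those human-readable sizes correctly; -r: biggest first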

    • mvirts@lemmy.world · 1 year ago

      Same, but when it’s real bad, sort fails 😅 For some reason my root is always hitting 100%.

      I usually go for du -hx | sort -h and rely on my terminal scrollback.
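
      If the failure is sort running out of temp space on the already-full disk, pointing its temporary files at another filesystem may help (GNU sort; the path is just an example):

          du -hx / | sort -h -T /mnt/other   # -T: directory for sort's temp files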

      • bizdelnick@lemmy.ml · 1 year ago

        Maybe, but I need it once a year or so. It’s not a task for which I want to install a separate tool.

        • meteokr@community.adiquaints.moe · 1 year ago

          Perfect for your use case, not as much for others. People sharing tools, and all the different ways to solve this type of problem is great for everyone.

    • DigitalDilemma@lemmy.ml · 1 year ago

      Almost the same here. Well, du -shc * | sort -h

      I admin around three hundred Linux servers and this is one of my most common tasks. I use -shc as I like the total too, and I don’t bother with less, as it’s only the biggest files and dirs I’m interested in and they show up last, so there’s no need to scroll back.
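
      One caveat with the * glob: it skips dotfiles, so a bloated .cache can hide from it. In bash, something like this catches them too:

          shopt -s dotglob      # make * match hidden entries as well (bash)
          du -shc * | sort -h
          shopt -u dotglob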

      When managing a lot of servers, the storage requirement of installing extra software is never trivial. (Although our storage does do very clever compression and might recognise duplication of the same file even across many VM filesystems, I’m never quite sure that works as advertised on small files.)

      • dan@upvote.au · 1 year ago

        I admin around three hundred linux servers

        What do you use for management? Ansible? Puppet? Chef? Something else entirely?

          • dan@upvote.au · 1 year ago

            Interesting; I hadn’t heard of Uyuni before. Thanks for the info!

              • DigitalDilemma@lemmy.ml · 1 year ago

                SUSE forked Red Hat’s Spacewalk just before it turned into Foreman + Katello.

                Then they put an absolute crapload of work into it to turn it into a modern orchestrator. Part of that was adopting Salt as the agent interface, gradually getting rid of the creaking traditional EL client.

                To say “it just runs Salt” is to rather miss all the other stuff Uyuni does: full repo and patch management, remote control, config management, builds, Ansible playbook support, Salt support, and just about everything else you need to manage hundreds of machines. And it does that for Rocky, RHEL, Alma, SUSE, Ubuntu, Debian and probably a bunch more by now. It has a very rich web UI and a full API, and you can do a bunch more from the CLI as well. And if your estate gets too big to manage with one machine, there are proxy agents, as many as you want. I only run a couple of hundred VMs through it, but there are estates running thousands.

                And it’s free and FOSS.

                Honestly, it’s pretty awesome and I’m amazed it’s not more widely known.

                • cobra89@beehaw.org · 1 year ago

                  Oh, that’s pretty nifty, thanks for the comment. Sorry, I wasn’t trying to minimize the tool; I was simply referring to the orchestration/config management aspect of it when I looked it up real quick.

                  I used to be responsible for the configurations of 40,000 (yes, forty thousand) VMs for a large company, first using Puppet and later Ansible, and that was an interesting challenge. I’ve been out of the configuration management game for a few years now though, so I’m pretty out of the loop. I was familiar with Spacewalk back in the day too.

                  I’ll have to check Uyuni out, thanks for sharing!

          • pete_the_cat@lemmy.world · 1 year ago

            Yeah, I was a Linux system admin/engineer for MLB/Disney+ for 5 years. When I was an admin, one of our tasks was clearing out filled filesystems on hosts that alerted.

            • DigitalDilemma@lemmy.ml · 1 year ago

              Sounds pretty similar to what I do now, but I’ve never needed the -x. Guess that might be quicker when you’re nested somewhere with a bunch of NFS/SMB stuff mounted in.

              • pete_the_cat@lemmy.world · 1 year ago

                We’d do it from root (/) and drill down from there. It was usually /var/lib or /var/log that was filling up, but occasionally someone would upload a 4.5 GB file to their home folder, which had a quota of 5 GB.

                Using ncdu would have been the best way, but that would require it being installed on about 7 thousand machines.

  • crank@beehaw.org · 1 year ago

    Check out dua. I usually use it in interactive mode, which lets you navigate through the file system with a visual representation of total dir/file sizes.
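
    If you want to try that, interactive mode should just be:

        dua i /path/to/inspect   # 'i' = interactive; plain dua prints sizes once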

    Here is a screenshot, randomly found in the GitHub issues:

    [screenshot]

    I also recently found this GUI program called k4dirstat buried in the repos. There are a few more modern options, but this one blows them all out of the park.

    Screenshot from the GitHub repo:

    [screenshot]

    Too bad they used such an ugly configuration for the screenshot… It lets you modify the visualization to look better and display information differently. Anyway, just thought I’d share, as the project is old and little known.

    • utopiah@lemmy.ml · 1 year ago

      thanks for sharing a screenshot of ncdu, it should help others discover it

      for the visualization itself, IMHO Disk Usage Analyzer gives aesthetically pleasing results; not a fan of the UX, but it works well enough to efficiently identify large files or directories

  • Rambi@lemm.ee · 1 year ago

    A 70 GB log file?? Am I misunderstanding something, or wouldn’t that be hundreds of millions of lines?
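
    Back-of-the-envelope, assuming a typical ~100 bytes per line:

        echo $(( 70 * 1024**3 / 100 ))   # = 751619276, i.e. hundreds of millions of lines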

    • Mo5560@feddit.de · 1 year ago

      I’ve definitely had to handle 30 GB plain-text files before, so I’m inclined to believe twice as much is just as possible.

  • donio@lemmy.world · 1 year ago

    Maybe other tools support this too, but one thing I like about xdiskusage is that you can pipe regular du output into it. That means I can run du on some remote host that doesn’t have anything fancy installed, scp the output back to my desktop and analyze it there. I can also pre-process the du output before feeding it into xdiskusage.
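
    A sketch of that workflow (host and paths are illustrative):

        ssh somehost du -k /var > remote-du.txt   # plain du, nothing extra to install
        xdiskusage < remote-du.txt                # browse the dump locally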

    I also often work with textual du output directly; just sorting it by size is very often all I need.

  • mindbleach@sh.itjust.works · 1 year ago

    I miss WinDirStat for seeing where all my hard drive space went. You can spot enormous files and folders full of ISOs at a glance.

    For bit-for-bit duplicates (thanks, modern DownThemAll), use fdupes.
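
    Typical fdupes usage, for reference:

        fdupes -r ~/Downloads    # -r: recurse; prints groups of identical files
        fdupes -rd ~/Downloads   # -d: prompt for which copy to keep, delete the rest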