Army soldiers cleaning Papamoa Beach after oil from the grounded ship Rena reached shore.

Freeing up disk space on Debian

[This post is above all for my own reference, so I’ll know what to do the next time this problem occurs.]

My Thinkpad’s SSD has an impolite tendency to run out of disk space, even though I might not really be adding lots of data. It seems to be mostly cruft that Debian accumulates, perhaps out of a desire for thorough record-keeping.

Looking for ways to free up disk space, I found this StackExchange post. It has a bunch of ways to give my hard drive a good scrub.

Let’s see what they do!

Automating the process

I decided to do myself a favour, and turn this into a shell script. With a bit of help from Gemini (ahem), I ended up with a script that runs the commands below. After that, it lists the 10 biggest files in the user’s home directory, and the 10 largest files on the root filesystem.

You can put the script in /etc/cron.weekly, so it will run automatically once per week. Be careful to remove the file extension – run-parts, which executes the scripts in that directory, skips file names containing dots. Call the script cleanup or something, rather than cleanup.sh.

To make sure that it runs even if your computer isn’t turned on at the scheduled time, install the anacron package on your system (sudo apt install anacron).
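The no-dots rule comes from run-parts(8), which only executes files whose names consist of ASCII letters, digits, underscores, and hyphens. A quick sketch of that check, using the same character class the man page documents:

```shell
#!/bin/sh
# run-parts only executes files whose names match ^[A-Za-z0-9_-]+$,
# so "cleanup" will run weekly while "cleanup.sh" is silently skipped.
for name in cleanup cleanup.sh; do
  case "$name" in
    *[!A-Za-z0-9_-]*) echo "$name: skipped by run-parts" ;;
    *)                echo "$name: would run" ;;
  esac
done
# prints:
#   cleanup: would run
#   cleanup.sh: skipped by run-parts
```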

Here is the script:

#!/bin/bash
# Just to be on the safe side: This script is in the public domain.
# The level of quality and security assurance is strictly "it works on my machine". 

# Maintenance script to clean logs and package cache

# Function to get current used space on / in blocks
get_used_space() {
    df / | awk 'NR==2 {print $3}'
}

# Capture space BEFORE cleanup
PRE_CLEAN=$(get_used_space)

echo "Freeing up disk space..."

# Reduces journal logs to 300 MB
sudo journalctl --vacuum-size=300M

# Forces rotation of system logs based on configuration
sudo logrotate /etc/logrotate.conf

# Deletes downloaded package files (.deb) from the local repository
sudo apt-get clean

# Removes packages that were automatically installed and are no longer needed
sudo apt-get autoremove -y

# Capture space AFTER cleanup
POST_CLEAN=$(get_used_space)

# Calculate total freed space (in KB, then convert to MB/GB)
FREED_KB=$((PRE_CLEAN - POST_CLEAN))
FREED_HUMAN=$(echo "$FREED_KB" | numfmt --from-unit=1024 --to=iec)

echo ""
echo ">>> Total disk space freed: $FREED_HUMAN"
echo "--------------------------------------------------"


echo ""
echo "--- 10 Largest Files in Current User's Home ($HOME) ---"
# Searches only within the current user's home directory
# -type f: find files only; -printf: output size in bytes and path
# sort -rn: numeric reverse sort; numfmt: make sizes human-readable (M, G)
find "$HOME" -type f -printf "%s %p\n" 2>/dev/null | sort -rn | head -n 10 | numfmt --to=iec --field=1

echo ""
echo "--- 10 Largest Files on Root Filesystem (Excluding /home & other mounts) ---"
# -xdev prevents find from searching other mounted filesystems (like /home if it's on its own partition)
sudo find / -xdev -type f -printf "%s %p\n" 2>/dev/null | sort -rn | head -n 10 | numfmt --to=iec --field=1

echo ""
echo "Maintenance complete."
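The freed-space arithmetic can be sanity-checked in isolation: df reports used space in 1 KiB blocks, so the script subtracts the two block counts and hands the difference to numfmt with --from-unit=1024. A sketch with made-up before/after numbers (5 GiB and 4 GiB used):

```shell
#!/bin/sh
# Hypothetical block counts, standing in for the two df snapshots:
PRE_CLEAN=5242880   # 5 GiB used before cleanup, in 1 KiB blocks
POST_CLEAN=4194304  # 4 GiB used after

FREED_KB=$((PRE_CLEAN - POST_CLEAN))
numfmt --from-unit=1024 --to=iec "$FREED_KB"   # prints 1.0G
```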

Helpful commands

1. Start by emptying your Trash bin.

2. Clean up logs:

sudo journalctl --vacuum-size=300M # reduces the logs to 300 MB

This freed about 3GB of disk space.

More log cleaning. Did not have any visible impact:

sudo logrotate /etc/logrotate.conf # rotates system logs per the config, compressing and eventually deleting old ones

3. Delete cached packages. I did this before starting a more systematic analysis, so I can’t say how much space it freed up. But it was easily more than one GB:

sudo apt clean # deletes packages that were cached for installation

4. Remove unused packages. Freed up a cool 1.9 GB:

sudo apt autoremove # removes unused packages

That’s all the straightforward commands that the linked post provided.

Finding large files

In addition, it told me how to find large (>100 MB) files, so that I can check whether I want to delete them:

sudo find / -mount -type f -size +100M -exec du -h {} + | sort -h

But most of the results looked like they were at least somewhat important, so I didn’t actually go and delete any of them.
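A note on sorting du -h output: plain sort -n compares only the leading digits of suffixed sizes, so 2G sorts below 512M. GNU sort’s -h flag understands the K/M/G suffixes:

```shell
#!/bin/sh
# sort -n would order these 2G, 512M, 900K (digits only);
# sort -h (human-numeric) orders the suffixed sizes correctly.
printf '512M\n2G\n900K\n' | sort -h
# prints:
#   900K
#   512M
#   2G
```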

Remember to floss regularly

Maybe I should simply put all these commands into a little script, and run that as a cron job? Once a month or so? (Update: that’s exactly what the script at the top of this post now does.)