14.12.2025 21:43

Goback - easy backups

I have a bunch of different services on my server: a blog, an old WordPress site, a budget service in Go running in Docker, and several Telegram bots.

I started with bash scripts by default, but I only made it through two before giving up. Each one had tons of duplicated code, and I had to set up cron jobs for everything separately. And that's not even touching on how inconvenient bash is for large scripts.

[screenshot: part of a bash script]

But surely there are ready-made solutions for this, right?

I googled and found borgmatic / borgbackup, which seemed to promise convenient YAML configs. But it had to be installed via apt, and I still had to write a separate config for each service and set up cron jobs individually. The configs were understandable, but still non-trivial and not intuitive.

[screenshot: example borgmatic YAML]

So I sat down and wrote my own configs that would fully satisfy me:

global:
  backup_dir: "/var/www/backups"
  filename_mask: "%name%-%Y%m%d%H%M%S"
  default_compression: "gzip"
  retention:
    daily: 2
    weekly: 2
    monthly: 2
    yearly: 2

backups:
  - name: "my-backup"
    subdirectory: "project"
    source_dir: "/var/www/project"

You can also use include_dir to load configs from a directory. This way you can write a separate file for each backup.

Goback

With the configs already written, I fired up Cursor to implement a utility that could execute everything laid out in them.

That's how goback was born: a small Go utility that solves the problems described above.

Single YAML config: you can configure everything in one file or automatically load configs from a subfolder. Need a new backup? Just add a YAML file specifying the directory or dump command.

Flexible backup configuration:

  • Backup specific directories with exclusions (glob patterns)
  • Execute commands (for example, mysqldump or pg_dump)
  • Different compression types: gzip, zip, tar, tar.gz, or no compression

Hooks for automation:

  • Pre-hooks: commands executed before backup (for example, stopping a service)
  • Post-hooks: commands after backup (for example, syncing with cloud)
  • Hooks can be global or separate for each backup
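
Hook execution reduces to running each command through the shell in order. Roughly like this (my sketch, with a fail-fast policy so a broken pre-hook aborts the backup it guards):

```go
package main

import (
	"fmt"
	"os"
	"os/exec"
)

// runHooks executes each hook via the shell, in order, stopping at
// the first failure so a failed pre-hook can abort the backup.
func runHooks(hooks []string) error {
	for _, h := range hooks {
		cmd := exec.Command("sh", "-c", h)
		cmd.Stdout = os.Stdout
		cmd.Stderr = os.Stderr
		if err := cmd.Run(); err != nil {
			return fmt.Errorf("hook %q failed: %w", h, err)
		}
	}
	return nil
}

func main() {
	if err := runHooks([]string{"echo pre-hook ok"}); err != nil {
		fmt.Println(err)
	}
}
```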

Smart retention policy:

  • Automatic determination of anchor points (daily, weekly, monthly, yearly)
  • Configure the number of backups for each type
  • Automatic cleanup of old backups

One 2.5 MB binary with no external dependencies. Run with one command: ./goback.

Example Configuration

global:
  backup_dir: "/var/www/backups"
  filename_mask: "%name%-%Y%m%d%H%M%S"
  default_compression: "gzip"
  retention:
    daily: 2
    weekly: 2
    monthly: 2
    yearly: 2
  post_hooks:
    - "rclone sync /var/www/backups remote:backups"

backups:
  # Directory backup with exclusions
  - name: "example-website"
    subdirectory: "example"
    source_dir: "/var/www/example"
    exclude_patterns:
      - "var/cache/*"
      - "vendor/*"
      - "*.log"
      - "node_modules/*"
    compression: "zip"

  # Database backup via command
  - name: "database-dump"
    subdirectory: "databases"
    command: "mysqldump -u user -ppassword database_name"
    output_file: "database.sql"
    compression: "gzip"
    retention:
      daily: 7
      weekly: 4
      monthly: 6
      yearly: 2

Automatic Config Loading

One of the most convenient features is automatic config loading from a directory. You can specify include_dir in the global settings, and the utility will automatically pick up all .yaml and .yml files from that folder:

global:
  include_dir: "/var/www/my/backup/backups"

Now for each new service, you just need to create one file in this directory:

# /var/www/my/backup/backups/blog.yaml
name: "blog"
subdirectory: "blog"
source_dir: "/var/www/blog"
exclude_patterns:
  - "var/cache/*"
  - "vendor/*"
compression: "zip"

And that's it. The new backup starts being created by cron along with the rest. The post-hook sync uploads everything to the cloud.
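
The cron side collapses to a single entry for all backups. Assuming the binary and config live in /var/www/my/backup (paths here are my guess from the include_dir above), something like:

```shell
# run every backup defined in the config, daily at 03:00
0 3 * * * cd /var/www/my/backup && ./goback >> goback.log 2>&1
```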

I also added a few command-line options. Here are the main ones; more details are in the repository.

Usage

Run all backups:

./goback

Run a specific backup:

./goback -b blog

Run multiple backups:

./goback -b blog -b database -b telegram-bot

Use a custom config:

./goback -config /path/to/config.yaml

Result

In a word: beautiful. A couple of hours in Cursor, and the utility I needed is ready.

What I ended up with:

  • All backups in one place with unified logic
  • Easy addition of new backups through YAML
  • Automatic rotation and cleanup of old backups
  • Flexible configuration for each service
  • Post-hook cloud sync that runs automatically

And it all works out of the box, without unnecessary dependencies and complex setup. Download the binary, put the config next to it, and you're ready to go.

[screenshot: backup logs]

The project is available on GitHub: https://github.com/positron48/goback

There you can find complete documentation, configuration examples, and build instructions.

P.S. I can already see a future where, instead of app marketplaces, everything is created on the fly.

Tags: Ubuntu AI Cursor Go
