dbdump

Fast MySQL dump tool that excludes noisy data while preserving complete database structure.

Why dbdump?

When dumping production databases for development, you often don't need millions of audit log entries, session data, or cache records. These tables can make dumps take hours and consume gigabytes of space.

dbdump solves this by:

  • Excluding data from noisy tables (audits, sessions, cache, etc.)
  • Always preserving table structure (no broken foreign keys)
  • Reducing dump time from hours to minutes
  • Making development database refreshes practical

Installation

Using Go

If you have Go 1.24+ installed:

go install github.com/helgesverre/dbdump/cmd/dbdump@latest

Pre-built Binaries

Download the matching versioned asset from the releases page. Asset names include the tag, for example dbdump-v1.2.0-darwin-arm64.tar.gz.

macOS (Apple Silicon)

curl -LO https://github.com/helgesverre/dbdump/releases/download/vX.Y.Z/dbdump-vX.Y.Z-darwin-arm64.tar.gz
tar -xzf dbdump-vX.Y.Z-darwin-arm64.tar.gz
chmod +x dbdump-vX.Y.Z-darwin-arm64
sudo mv dbdump-vX.Y.Z-darwin-arm64 /usr/local/bin/dbdump

macOS (Intel)

curl -LO https://github.com/helgesverre/dbdump/releases/download/vX.Y.Z/dbdump-vX.Y.Z-darwin-amd64.tar.gz
tar -xzf dbdump-vX.Y.Z-darwin-amd64.tar.gz
chmod +x dbdump-vX.Y.Z-darwin-amd64
sudo mv dbdump-vX.Y.Z-darwin-amd64 /usr/local/bin/dbdump

Linux (AMD64)

curl -LO https://github.com/helgesverre/dbdump/releases/download/vX.Y.Z/dbdump-vX.Y.Z-linux-amd64.tar.gz
tar -xzf dbdump-vX.Y.Z-linux-amd64.tar.gz
chmod +x dbdump-vX.Y.Z-linux-amd64
sudo mv dbdump-vX.Y.Z-linux-amd64 /usr/local/bin/dbdump

Linux (ARM64)

curl -LO https://github.com/helgesverre/dbdump/releases/download/vX.Y.Z/dbdump-vX.Y.Z-linux-arm64.tar.gz
tar -xzf dbdump-vX.Y.Z-linux-arm64.tar.gz
chmod +x dbdump-vX.Y.Z-linux-arm64
sudo mv dbdump-vX.Y.Z-linux-arm64 /usr/local/bin/dbdump

Windows (AMD64)

Download the versioned Windows asset from the releases page, extract it, and add the executable to your PATH.

Verify Installation

dbdump --help

Requirements

  • MySQL client tools - mysqldump must be in your PATH (comes with MySQL client)
    • macOS: brew install mysql-client
    • Ubuntu/Debian: sudo apt-get install mysql-client
    • CentOS/RHEL: sudo yum install mysql
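
Since dbdump shells out to mysqldump, a quick PATH check before the first run can save a confusing error later. A minimal sketch:

```shell
# Sanity check: dbdump shells out to mysqldump, so it must be on PATH.
if command -v mysqldump >/dev/null 2>&1; then
  status=found
else
  status=missing
fi
echo "mysqldump: $status"
```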

From Source (Developers)

git clone https://github.com/helgesverre/dbdump.git
cd dbdump
just install

Requires Go 1.24+ and just.

Quick Start

Interactive Mode (Default)

# Recommended: Use environment variable for password
export DBDUMP_MYSQL_PWD=yourpassword
dbdump dump -H localhost -u root -d mydb

# Or provide password as flag (less secure)
dbdump dump -H localhost -u root -p password -d mydb

This will:

  1. Connect to your database
  2. Show all tables with sizes
  3. Pre-select noisy tables based on patterns
  4. Let you customize the selection
  5. Dump structure for all tables, data for selected tables

Auto Mode (Non-Interactive)

dbdump dump -H localhost -u root -d mydb --auto

Uses smart defaults without interaction.

With Config File

dbdump dump -H localhost -u root -d mydb --config ./project.yaml

Usage

Basic Commands

# Set password securely via environment variable
export DBDUMP_MYSQL_PWD=yourpassword

# Dump database (interactive)
dbdump dump -H localhost -u root -d mydb

# List tables with sizes
dbdump list -H localhost -u root -d mydb

# Dry run (see what would be excluded)
dbdump dump -H localhost -u root -d mydb --dry-run

# Dump with custom output file
dbdump dump -H localhost -u root -d mydb -o backup.sql

# Dump with streaming compression
dbdump dump -H localhost -u root -d mydb --auto --compress gzip

# Dump through an SSH bastion
dbdump dump -H 127.0.0.1 -P 3306 -u root -d mydb \
  --ssh-host bastion.example.com \
  --ssh-user deploy

Connection Options

-H, --host        Database host (default: 127.0.0.1)
-P, --port        Database port (default: 3306)
-u, --user        Database user
-p, --password    Database password (or use DBDUMP_MYSQL_PWD/MYSQL_PWD env)
-d, --database    Database name

Dump Options

-o, --output           Output file (default: {database}_{timestamp}.sql)
-c, --config           Config file path
    --exclude          Exclude specific table data (repeatable)
    --exclude-pattern  Exclude tables matching pattern (repeatable)
    --auto             Use smart defaults without interaction
    --dry-run          Show what would be dumped without dumping
    --compress         Compression format: auto, none, gzip, zstd
    --ssh-host         SSH bastion host for tunneling to the database
    --ssh-port         SSH bastion port (default: 22)
    --ssh-user         SSH username (defaults to database user)
    --ssh-key          SSH private key path
    --ssh-local-port   Local port for the SSH tunnel (default: auto)

Compression

dbdump can stream output as plain SQL, gzip, or zstd. --compress auto infers the format from the output filename, while explicit formats override filename inference.

If you omit --output, compressed dumps use matching default names such as mydb_20260331_120000.sql.gz or mydb_20260331_120000.sql.zst.
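
The default names above suggest a simple extension-based heuristic. A sketch of how auto inference might behave (the exact mapping is an assumption; check dbdump --help for the authoritative rules):

```shell
# Sketch of extension-based format inference (an assumption about how
# --compress auto maps filenames to formats; verify against dbdump --help).
infer_format() {
  case "$1" in
    *.gz)  echo gzip ;;
    *.zst) echo zstd ;;
    *)     echo none ;;
  esac
}

fmt_gz=$(infer_format mydb_20260331_120000.sql.gz)
fmt_zst=$(infer_format mydb_20260331_120000.sql.zst)
fmt_plain=$(infer_format mydb_20260331_120000.sql)
echo "$fmt_gz $fmt_zst $fmt_plain"   # gzip zstd none
```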

SSH Tunneling

You can still create a manual tunnel yourself, but dbdump can also manage the tunnel for you:

dbdump dump -H 127.0.0.1 -P 3306 -u root -d mydb \
  --ssh-host bastion.example.com \
  --ssh-user deploy \
  --ssh-key ~/.ssh/id_ed25519 \
  --auto

When SSH tunneling is enabled, -H/--host and -P/--port still describe the database endpoint as seen from the SSH server.
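
If you prefer the manual route, the equivalent tunnel looks roughly like this (hostnames match the illustrative ones above; the local port 3307 is an arbitrary choice):

```shell
# Manual equivalent of the --ssh-* flags (hostnames and local port 3307
# are illustrative). Run the tunnel in one terminal, dbdump in another.
LOCAL_PORT=3307
TUNNEL="ssh -N -L $LOCAL_PORT:127.0.0.1:3306 deploy@bastion.example.com"
echo "$TUNNEL"
# terminal 1: $TUNNEL
# terminal 2: dbdump dump -H 127.0.0.1 -P $LOCAL_PORT -u root -d mydb
```

Here 127.0.0.1:3306 is the database endpoint as seen from the bastion, mirroring how -H/--host and -P/--port are interpreted when dbdump manages the tunnel itself.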

Examples

# Use environment variable for password
export DBDUMP_MYSQL_PWD=secret
dbdump dump -H prod.example.com -u readonly -d myapp_prod

# Exclude specific tables
dbdump dump -H localhost -u root -d mydb \
  --exclude audits \
  --exclude activity_logs \
  --exclude-pattern "temp_*"

# With project config
dbdump dump -H localhost -u root -d mydb --config ./myproject.yaml

# Auto mode with custom output
dbdump dump -H localhost -u root -d mydb --auto -o daily-backup.sql

# Auto mode with gzip output
dbdump dump -H localhost -u root -d mydb --auto --compress gzip

# Remote dump through SSH
dbdump dump -H 127.0.0.1 -P 3306 -u root -d mydb \
  --ssh-host bastion.example.com \
  --ssh-user deploy \
  --auto

Configuration

dbdump supports multiple configuration layers that merge together:

  1. Built-in defaults (always applied)
  2. Global user config (~/.dbdump.yaml) - optional, applies to all dumps
  3. Project config (via --config flag) - optional, project-specific
  4. CLI flags (highest priority)

For a comprehensive guide, see USER-GUIDE.md.

Global User Config

Create ~/.dbdump.yaml for settings that apply to all your dumps:

name: "My Global Config"

exclude:
  exact:
    - activity_logs
    - user_sessions
  patterns:
    - "temp_*"
    - "*_backup"

Project Config File

Create a project.yaml file in your project:

name: "My Project"

exclude:
  exact:
    - audits
    - activity_logs
    - custom_noisy_table
  patterns:
    - "temp_*"
    - "*_cache"
    - "old_*"

Use it with:

dbdump dump -H localhost -u root -d mydb --config ./project.yaml

Default Exclusions

dbdump includes smart defaults for common Laravel tables:

Exact matches:

  • activity_log
  • audits
  • sessions
  • cache
  • cache_locks
  • failed_jobs
  • telescope_entries
  • telescope_entries_tags
  • telescope_monitoring
  • pulse_entries
  • pulse_aggregates

Patterns:

  • telescope_*
  • pulse_*
  • *_cache

These defaults are always applied. Later layers can add more excludes, but they do not remove the built-ins.
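
The patterns behave like shell globs. A small sketch of the matching behavior, using plain shell case globs as a stand-in for dbdump's internal matcher (which may differ in edge cases; table names are examples):

```shell
# Glob-style matching sketch for the default patterns (shell case globs
# stand in for dbdump's internal matcher; table names are illustrative).
matches_default_pattern() {
  case "$1" in
    telescope_*|pulse_*|*_cache) return 0 ;;
    *) return 1 ;;
  esac
}

result=""
for table in telescope_entries pulse_values query_cache users; do
  if matches_default_pattern "$table"; then
    result="$result $table:excluded"
  else
    result="$result $table:kept"
  fi
done
result="${result# }"
echo "$result"
```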

How It Works

dbdump uses a two-phase approach:

  1. Phase 1: Structure Dump

    • Dumps complete schema for ALL tables
    • Ensures foreign keys and relationships are preserved
    • Uses mysqldump --no-data
  2. Phase 2: Data Dump

    • Dumps data for all tables EXCEPT excluded ones
    • Uses mysqldump --no-create-info --ignore-table=...

Result: A complete database dump with empty noisy tables.
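
The two phases correspond roughly to the following mysqldump invocations (a sketch: connection flags are omitted and the excluded table names are illustrative):

```shell
# Rough equivalent of the two phases as plain mysqldump commands
# (connection flags omitted; excluded table names are illustrative).
DB=mydb
EXCLUDED="audits sessions telescope_entries"

# Build one --ignore-table=db.table flag per excluded table.
IGNORE_FLAGS=""
for t in $EXCLUDED; do
  IGNORE_FLAGS="$IGNORE_FLAGS --ignore-table=$DB.$t"
done
echo "ignore flags:$IGNORE_FLAGS"

# Phase 1: schema for ALL tables (preserves foreign keys).
# mysqldump --no-data "$DB" > dump.sql
# Phase 2: data for everything except the excluded tables.
# mysqldump --no-create-info $IGNORE_FLAGS "$DB" >> dump.sql
```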

Real-World Example

Before (standard mysqldump):

  • Database: 15GB total
  • Audits table: 12GB (10M rows)
  • Actual data needed: 3GB
  • Dump time: 3-4 hours
  • Transfer time: 2+ hours

After (using dbdump):

  • Excludes: audits, telescope_entries, sessions
  • Output: 3.2GB (structure for all, data for non-noisy)
  • Dump time: 15-20 minutes
  • Transfer time: 30 minutes

Time saved: 4-5 hours per database refresh

Note: Performance improvements vary with database structure, server resources, and the size of the excluded tables; gains are largest when a few noisy tables dominate the database, as in the example above. Compared with hand-rolled mysqldump commands using the same exclusions, dbdump itself is typically 5-20% faster.

Documentation

User Documentation

  • USER-GUIDE.md - Comprehensive user guide with detailed configuration, examples, and troubleshooting
  • SECURITY.md - Security best practices and credential handling
  • CHANGELOG.md - Version history

Development

Building

# Build for current platform
just build

# Build for all platforms
just build-all

# Run tests
just test

# Format code
just fmt

Testing

Integration Tests

# Run the full integration test suite (starts Docker Compose locally)
./test/integration-test.sh

# Cleanup
docker compose down -v

See test/README.md for detailed testing documentation.

Manual Testing

# Test security (password not in process list)
export DBDUMP_MYSQL_PWD=testpass123
./bin/dbdump dump -H 127.0.0.1 -P 3308 -u root -d testdb --auto
ps aux | grep dbdump  # Should NOT show password

# Verify file permissions (should be 0600)
ls -la testdb_*.sql

# Test data integrity (triggers and events are included; routines are intentionally omitted)
grep -i "CREATE TRIGGER" testdb_*.sql
grep -i "CREATE EVENT" testdb_*.sql

Project Structure

dump-tool/
├── cmd/dbdump/          # CLI entry point
├── internal/
│   ├── config/          # Configuration management
│   ├── database/        # Database operations
│   ├── patterns/        # Pattern matching
│   └── ui/              # Interactive UI
├── configs/             # Example configurations
└── justfile             # Build commands

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

License

MIT License - see LICENSE file for details

Author

Helge Sverre
