Backup Issues

Solutions to common backup and restore problems in Supascale.

Backup Creation Issues

"Insufficient disk space"

Error:

{
  "success": false,
  "error": "Backup failed: Insufficient disk space"
}

Solutions:

  1. Check available space:

    df -h
    df -h ~/.supascale_backups/
    
  2. Free up space:

    # Remove old backups
    ls -la ~/.supascale_backups/
    rm -rf ~/.supascale_backups/old-backup.tar.gz
    
    # Clean Docker
    docker system prune -a
    
  3. Change backup destination:

    • Use cloud storage instead of local
    • Mount additional storage
  4. Use incremental backups instead of full backups
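
Taken together, the checks above can run as a pre-backup guard. A minimal sketch, assuming the default `~/.supascale_backups` location; the function name and the 100 MB example threshold are illustrative, not part of Supascale:

```shell
#!/bin/sh
# Pre-backup disk-space guard (sketch; tune the threshold per deployment).
check_backup_space() {
    dir="$1"; min_kb="$2"
    mkdir -p "$dir"
    # df -Pk reports available space in 1 KB blocks; column 4 is "Available".
    free_kb=$(df -Pk "$dir" | awk 'NR==2 {print $4}')
    if [ "$free_kb" -lt "$min_kb" ]; then
        echo "Insufficient disk space: ${free_kb} KB free, need ${min_kb} KB" >&2
        return 1
    fi
    echo "OK: ${free_kb} KB free in $dir"
}

# Example: require at least 100 MB free before starting a backup.
check_backup_space "$HOME/.supascale_backups" $((100 * 1024))
```

Running this from cron shortly before the scheduled backup turns "backup failed at 3 a.m." into an actionable alert.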

Backup Stuck at "Creating"

Problem: Backup stays in in_progress state indefinitely.

Diagnostic:

# Check if backup process is running
ps aux | grep pg_dump
ps aux | grep tar

# Check disk I/O
iostat -x 1

Solutions:

  1. Check database accessibility:

    docker compose exec db pg_isready
    
  2. Cancel the stuck backup via the API, or restart Supascale

  3. Check for locked files:

    lsof | grep supascale_backups
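
A process that shows up in `ps` but never finishes is easiest to spot by elapsed time. A minimal sketch, assuming GNU `ps` (procps, standard on Linux); the function name and the 2-hour cutoff are illustrative:

```shell
#!/bin/sh
# List backup-related processes that have run longer than a threshold
# (sketch; the 7200 s default is an assumption, not a Supascale limit).
find_stuck_backups() {
    max_secs="${1:-7200}"
    # etimes = elapsed time in whole seconds (GNU procps).
    ps -eo pid,etimes,comm --no-headers | while read -r pid secs comm; do
        case "$comm" in
            pg_dump|tar)
                if [ "$secs" -gt "$max_secs" ]; then
                    echo "possibly stuck: pid=$pid ($comm, running ${secs}s)"
                fi
                ;;
        esac
    done
}

find_stuck_backups 7200
```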
    

"Database connection failed"

Error:

{
  "success": false,
  "error": "Backup failed: Database connection error"
}

Solutions:

  1. Verify project is running:

    cd ~/supabase/projects/your-project
    docker compose ps
    
  2. Check database container:

    docker compose logs db
    
  3. Test database connection:

    docker compose exec db psql -U postgres -c "SELECT 1"
    

"Permission denied"

Error:

{
  "success": false,
  "error": "Permission denied: Cannot write to backup directory"
}

Solutions:

  1. Check directory permissions:

    ls -la ~/.supascale_backups/
    
  2. Fix permissions:

    chmod 755 ~/.supascale_backups/
    chown -R $(whoami) ~/.supascale_backups/
    
  3. Check SELinux/AppArmor (if applicable):

    getenforce
    # If enforcing, check audit logs
    ausearch -m AVC -ts recent
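
Steps 1 and 2 can be rolled into one idempotent check. A minimal sketch; the function name is illustrative, and `chown` may still need `sudo` if the directory is root-owned:

```shell
#!/bin/sh
# Ensure the backup directory exists, is owned by the current user,
# and is actually writable (sketch).
ensure_backup_dir() {
    dir="$1"
    mkdir -p "$dir"
    chown -R "$(id -un)" "$dir" 2>/dev/null || true  # may need sudo
    chmod 755 "$dir"
    # Verify writability directly; this is what the backup process needs.
    if ! touch "$dir/.write_test" 2>/dev/null; then
        echo "Still cannot write to $dir" >&2
        return 1
    fi
    rm -f "$dir/.write_test"
    echo "Writable: $dir"
}

ensure_backup_dir "$HOME/.supascale_backups"
```

If the write test fails even after this, look to SELinux/AppArmor as in step 3.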
    

Cloud Storage Backup Issues

S3 Upload Failed

Error:

{
  "success": false,
  "error": "Failed to upload to S3: Access Denied"
}

Solutions:

  1. Test credentials:

    aws s3 ls s3://your-bucket/
    
  2. Check IAM permissions. The credentials used for backups need:

    • s3:PutObject
    • s3:GetObject
    • s3:DeleteObject
    • s3:ListBucket
  3. Verify bucket exists and is in correct region

  4. Check bucket policy allows access from your IP
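
For reference, those permissions correspond to an IAM policy along these lines; `your-bucket` is a placeholder. Note that the object actions apply to `bucket/*` while `s3:ListBucket` applies to the bucket ARN itself:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:GetObject", "s3:DeleteObject"],
      "Resource": "arn:aws:s3:::your-bucket/*"
    },
    {
      "Effect": "Allow",
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::your-bucket"
    }
  ]
}
```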

GCS Upload Failed

Error:

{
  "success": false,
  "error": "Failed to upload to GCS: Forbidden"
}

Solutions:

  1. Verify service account permissions:

    • roles/storage.objectAdmin on bucket
  2. Check credentials JSON is valid and not expired

  3. Test with gsutil:

    gsutil ls gs://your-bucket/
    

Azure Blob Upload Failed

Error:

{
  "success": false,
  "error": "Failed to upload to Azure: AuthorizationFailure"
}

Solutions:

  1. Verify account name and key

  2. Check container exists:

    az storage container list --account-name youraccountname
    
  3. Check network rules if using private endpoints

Connection Timeout

Error:

{
  "success": false,
  "error": "Connection timeout to cloud storage"
}

Solutions:

  1. Check network connectivity:

    # For S3
    curl -I https://s3.amazonaws.com
    
    # For GCS
    curl -I https://storage.googleapis.com
    
  2. Check firewall rules for outbound HTTPS

  3. Use regional endpoint instead of global

  4. Increase timeout settings

Restore Issues

"Backup not found"

Error:

{
  "success": false,
  "error": "Backup file not found"
}

Solutions:

  1. Check backup still exists:

    ls -la ~/.supascale_backups/
    
  2. For cloud backups, verify the file still exists in the bucket

  3. The backup may have been deleted automatically; check the backup retention policy

Restore Stuck or Failing

Problem: Restore process hangs or fails midway.

Solutions:

  1. Check project status:

    • A full restore requires the project to be stopped
    • Some partial restores can run while the project is online
  2. Check available disk space:

    df -h
    
  3. Manually restore:

    # Extract backup
    tar -xzf backup.tar.gz
    
    # Restore database
    docker compose exec -T db psql -U postgres < database.sql
    

"Database version mismatch"

Error:

{
  "success": false,
  "error": "Backup was created with PostgreSQL 15, current version is 14"
}

Solutions:

  1. Upgrade project PostgreSQL to match backup version

  2. Use --compatible flag if available

  3. Manually edit the dump file to remove version-specific features (not recommended)

Partial Restore Failures

Problem: Some components restore while others fail.

Solutions:

  1. Check warnings in response:

    {
      "warnings": [
        "Some functions could not be restored"
      ]
    }
    
  2. Restore components individually:

    • Try database-only restore first
    • Then storage
    • Then functions
  3. Check dependencies:

    • Extensions must be enabled
    • Required schemas must exist

Scheduled Backup Issues

Scheduled Backups Not Running

Problem: Cron backups not executing.

Diagnostic:

  1. Check task is enabled in Scheduled Tasks
  2. Check last run time and status
  3. Check system logs for errors

Solutions:

  1. Verify task configuration:

    • Correct cron expression
    • Correct timezone
    • Task enabled
  2. Confirm Supascale is running at the scheduled time

  3. Run task manually to test:

    POST /api/v1/tasks/:id/run
    

Missing Backups

Problem: Fewer backups than expected.

Common causes:

  • Backup retention policy deleting old backups
  • Scheduled task disabled
  • Failures not reported

Solutions:

  1. Review backup history in Supascale
  2. Check task execution history
  3. Adjust retention policy if needed
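
If retention turns out to be the culprit, or backups simply pile up, old archives can be pruned explicitly rather than silently. A minimal sketch, assuming local `.tar.gz` backups; the function name and the 30-day window are illustrative:

```shell
#!/bin/sh
# Prune local backups older than a retention window (sketch; the 30-day
# default and the .tar.gz naming convention are assumptions).
prune_backups() {
    dir="$1"; days="${2:-30}"
    [ -d "$dir" ] || return 0
    # -mtime +N matches files last modified more than N*24 hours ago.
    find "$dir" -name '*.tar.gz' -type f -mtime "+$days" -print -delete
}

prune_backups "$HOME/.supascale_backups" 30
```

Dry-run first by dropping `-delete` and reviewing what `-print` lists.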

Data Recovery

Recovering from Corrupted Backup

# Try to extract
tar -xzf backup.tar.gz 2>&1 | head

# If the tar structure is damaged but the gzip stream is intact,
# decompress separately and skip damaged/zeroed blocks
gzip -dk backup.tar.gz   # -k keeps the original archive
tar -xf backup.tar --ignore-zeros
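
Better still, verify integrity before you ever need a restore. A minimal sketch that checks an archive without extracting it; the function name is illustrative:

```shell
#!/bin/sh
# Verify a .tar.gz backup without extracting it (sketch).
verify_backup() {
    f="$1"
    # gzip -t checks the compressed stream; tar -tzf walks the archive
    # structure, so together they catch most forms of corruption.
    if gzip -t "$f" 2>/dev/null && tar -tzf "$f" >/dev/null 2>&1; then
        echo "OK: $f"
    else
        echo "CORRUPT: $f" >&2
        return 1
    fi
}
```

Running this right after each backup completes catches corruption while the source data still exists.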

Point-in-Time Recovery

For critical data, consider:

  1. Enable PostgreSQL WAL archiving
  2. Set up continuous backup to cloud storage
  3. Configure shorter backup intervals
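
Step 1 maps to a few settings in `postgresql.conf`. A minimal sketch; the archive destination is a placeholder you would point at durable (ideally off-host) storage:

```ini
# postgresql.conf (sketch; the archive path is a placeholder)
wal_level = replica
archive_mode = on
archive_command = 'cp %p /mnt/wal_archive/%f'   # %p = WAL file path, %f = file name
```

Changing `archive_mode` requires a PostgreSQL restart. Tools such as WAL-G or pgBackRest automate this pipeline, including the continuous upload to cloud storage in step 2.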

Performance Issues

Backups Taking Too Long

Optimization:

  1. Use incremental backups when possible

  2. Exclude unnecessary data:

    • Large binary files
    • Temporary tables
    • Cached data
  3. Run during low-traffic periods

  4. Use faster storage for backup destination

  5. Compress efficiently:

    • Use gzip for balance of speed/size
    • Use lz4 for faster compression
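
The trade-off is easy to measure on a sample of your own data. A minimal sketch comparing gzip levels; `lz4`, where installed, extends the fast end of the same spectrum:

```shell
#!/bin/sh
# Measure the size trade-off of gzip levels on a sample file (sketch;
# substitute a representative slice of your own backup data).
compare_compression() {
    f="$1"
    for level in 1 6 9; do
        gzip -c "-$level" "$f" > "$f.gz$level"
        echo "gzip -$level: $(wc -c < "$f.gz$level") bytes"
    done
}
```

Prefix each `gzip` with `time` to capture the speed side of the trade-off as well.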

Impact on Production

Minimize backup impact:

  1. Use snapshot-based backups if available

  2. Replicate to read replica and backup from there

  3. Schedule during maintenance windows

Getting Help

When reporting backup issues, include:

  1. Backup configuration:

    • Type (full, database, storage)
    • Destination (local, S3, GCS, etc.)
  2. Error message from Supascale

  3. Backup logs from system logs

  4. Storage status:

    df -h
    
  5. Cloud storage status (if applicable):

    # Test connectivity
    aws s3 ls s3://your-bucket/