Backups API

Create, list, restore, and import backups via the REST API.

Endpoints Overview

| Method | Endpoint | Description |
|---|---|---|
| GET | /backups | List backups |
| POST | /backups | Create backup |
| GET | /backups/:id | Get backup details |
| DELETE | /backups/:id | Delete backup |
| GET | /backups/:id/download | Download backup |
| POST | /backups/:id/restore | Restore backup |
| POST | /projects/:id/import/database | Import database dump |
| POST | /projects/:id/import/storage | Import storage files |
| POST | /projects/:id/import/functions | Import Edge Functions |
| POST | /projects/:id/import/lift-shift | Automated migration from Supabase Cloud |

List Backups

GET /api/v1/backups

Permission: backups:read

Query Parameters:

| Parameter | Description | Example |
|---|---|---|
| projectId | Filter by project | my-project |
| status | Filter by status | completed, failed |
| destination | Filter by destination | local, s3 |
| limit | Results per page | 50 |
| offset | Pagination offset | 0 |
| stats | Include statistics | true |

Example:

curl "https://supascale.example.com/api/v1/backups?projectId=my-project&limit=10" \
  -H "X-API-Key: your-key"

Response:

{
  "backups": [
    {
      "id": "backup-123",
      "projectId": "my-project",
      "projectName": "My Project",
      "type": "full",
      "status": "completed",
      "destination": "s3",
      "path": "s3://bucket/my-project/2026-01-19-full.tar.gz",
      "size": 52428800,
      "encrypted": false,
      "createdAt": "2026-01-19T02:00:00Z",
      "completedAt": "2026-01-19T02:02:00Z"
    }
  ],
  "pagination": {
    "total": 45,
    "limit": 10,
    "offset": 0
  }
}

With stats=true:

{
  "backups": [...],
  "stats": {
    "totalBackups": 45,
    "totalSize": 2362232012,
    "byDestination": {
      "local": { "count": 30, "size": 1073741824 },
      "s3": { "count": 15, "size": 1288490188 }
    }
  }
}
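The limit/offset pair in the pagination object can drive a simple paging loop. A minimal sketch (`nextOffset` is an illustrative helper, not part of the API):

```javascript
// Sketch: compute the next offset from a `pagination` object, or
// return null once the last page has been fetched.
function nextOffset({ total, limit, offset }) {
  const next = offset + limit;
  return next < total ? next : null;
}

// With the example response above (total: 45, limit: 10), offsets
// advance 0 → 10 → 20 → 30 → 40, then nextOffset returns null.
```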

Create Backup

POST /api/v1/backups

Permission: backups:write

Request:

{
  "projectId": "my-project",
  "type": "full",
  "destination": "s3",
  "encrypt": false
}

Backup Types:

| Type | Description |
|---|---|
| full | Complete project backup |
| database | PostgreSQL only |
| storage | File storage only |
| functions | Edge Functions only |
| config | Configuration only |

Destinations:

| Destination | Description |
|---|---|
| local | Local filesystem |
| s3 | AWS S3 |
| gcs | Google Cloud Storage |
| azure | Azure Blob Storage |
| r2 | Cloudflare R2 |
| minio | MinIO |
| backblaze | Backblaze B2 |
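The allowed type and destination values can be validated client-side before sending the request. A hypothetical sketch (`buildBackupRequest` is an illustrative helper, not an official client):

```javascript
// Sketch: validate a create-backup body against the documented
// type and destination values before POSTing it.
const BACKUP_TYPES = ['full', 'database', 'storage', 'functions', 'config'];
const DESTINATIONS = ['local', 's3', 'gcs', 'azure', 'r2', 'minio', 'backblaze'];

function buildBackupRequest({ projectId, type = 'full', destination = 'local', encrypt = false }) {
  if (!projectId) throw new Error('projectId is required');
  if (!BACKUP_TYPES.includes(type)) throw new Error(`unknown backup type: ${type}`);
  if (!DESTINATIONS.includes(destination)) throw new Error(`unknown destination: ${destination}`);
  return { projectId, type, destination, encrypt };
}
```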

Response:

{
  "success": true,
  "backup": {
    "id": "backup-456",
    "projectId": "my-project",
    "type": "full",
    "status": "completed",
    "destination": "s3",
    "path": "s3://bucket/my-project/2026-01-19-full.tar.gz",
    "size": 52428800,
    "duration": 120,
    "createdAt": "2026-01-19T12:00:00Z"
  }
}

Get Backup Details

GET /api/v1/backups/:id

Permission: backups:read

Response:

{
  "id": "backup-123",
  "projectId": "my-project",
  "projectName": "My Project",
  "type": "full",
  "status": "completed",
  "destination": "s3",
  "path": "s3://bucket/my-project/2026-01-19-full.tar.gz",
  "size": 52428800,
  "encrypted": false,
  "createdAt": "2026-01-19T02:00:00Z",
  "completedAt": "2026-01-19T02:02:00Z",
  "restoreHistory": [
    {
      "restoredAt": "2026-01-19T10:00:00Z",
      "duration": 180
    }
  ]
}

Delete Backup

DELETE /api/v1/backups/:id

Permission: backups:write

Response:

{
  "success": true,
  "message": "Backup deleted"
}

Download Backup

GET /api/v1/backups/:id/download

Permission: backups:read

Returns the backup file as a binary download.

Headers:

Content-Type: application/gzip
Content-Disposition: attachment; filename="my-project-2026-01-19-full.tar.gz"

Example:

curl https://supascale.example.com/api/v1/backups/backup-123/download \
  -H "X-API-Key: your-key" \
  -o backup.tar.gz

Restore Backup

POST /api/v1/backups/:id/restore

Permission: backups:write

Request:

{
  "restoreTypes": ["full"],
  "forceRestore": false,
  "targetProjectId": "other-project"
}

Parameters:

| Parameter | Type | Required | Description |
|---|---|---|---|
| restoreTypes | array | No | Components to restore (defaults to all) |
| forceRestore | boolean | No | Override safety checks (default: false) |
| targetProjectId | string | No | Restore to a different project (cross-project restore) |

Restore Types:

| Type | Description |
|---|---|
| full | Restore everything |
| database | Database only |
| storage | File storage only |
| functions | Edge Functions only |
| config | Configuration only |

Cross-Project Restore:

To restore a backup to a different project, specify targetProjectId:

curl -X POST https://supascale.example.com/api/v1/backups/backup-123/restore \
  -H "X-API-Key: your-key" \
  -H "Content-Type: application/json" \
  -d '{
    "restoreTypes": ["database", "storage"],
    "targetProjectId": "new-project"
  }'

Note: The target project must be running for database restore.
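Building the request body in code, `targetProjectId` should only be attached for cross-project restores. A hypothetical sketch (`buildRestoreRequest` is illustrative, not an official client):

```javascript
// Sketch: build a restore request body; targetProjectId is only
// included when restoring into a different project.
function buildRestoreRequest({ restoreTypes = ['full'], forceRestore = false, targetProjectId } = {}) {
  const body = { restoreTypes, forceRestore };
  if (targetProjectId) body.targetProjectId = targetProjectId;
  return body;
}
```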

Response:

{
  "success": true,
  "restore": {
    "backupId": "backup-123",
    "projectId": "my-project",
    "duration": 180,
    "restoredAt": "2026-01-19T12:00:00Z",
    "warnings": []
  }
}

With warnings:

{
  "success": true,
  "restore": {
    "backupId": "backup-123",
    "duration": 180,
    "warnings": [
      "Some functions could not be restored due to missing dependencies"
    ]
  }
}

Error Responses

Backup Not Found

{
  "success": false,
  "error": "Backup not found"
}

Status: 404

Project Not Found

{
  "success": false,
  "error": "Project not found"
}

Status: 400

Backup Failed

{
  "success": false,
  "error": "Backup failed: Insufficient disk space"
}

Status: 500

Restore Failed

{
  "success": false,
  "error": "Restore failed: Database connection error"
}

Status: 500

Backup Status Values

| Status | Description |
|---|---|
| pending | Backup queued |
| in_progress | Backup running |
| completed | Backup successful |
| failed | Backup failed |
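When polling GET /api/v1/backups/:id, only completed and failed are terminal; pending and in_progress mean the backup is still in flight. A small sketch of that check (illustrative helper names):

```javascript
// Sketch: decide whether a backup status is terminal, i.e. whether a
// polling loop should stop.
const TERMINAL_STATUSES = new Set(['completed', 'failed']);
const isTerminal = (status) => TERMINAL_STATUSES.has(status);
```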

Import Endpoints

Use these endpoints to import data from external sources (like Supabase Cloud) into your Supascale projects.

Import Database

POST /api/v1/projects/:id/import/database

Permission: backups:write

Import a PostgreSQL dump file into a project's database.

Content-Type: multipart/form-data

Parameters:

| Parameter | Type | Required | Description |
|---|---|---|---|
| file | file | Yes | Database dump file (.sql, .dump, or .gz) |

Supported Formats:

  • .sql - Plain SQL dump
  • .dump - PostgreSQL custom format (pg_dump -Fc)
  • .gz - Gzipped dump file
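The format can be inferred from the filename extension before uploading. A hypothetical sketch (`dumpFormat` is illustrative; the server does its own detection):

```javascript
// Sketch: map a dump filename to the documented format. The .gz check
// runs first so a gzipped dump like backup.sql.gz is treated as gzip.
function dumpFormat(filename) {
  if (filename.endsWith('.gz')) return 'gzip';
  if (filename.endsWith('.dump')) return 'custom';
  if (filename.endsWith('.sql')) return 'plain';
  return null;
}
```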

Example:

curl -X POST https://supascale.example.com/api/v1/projects/my-project/import/database \
  -H "X-API-Key: your-key" \
  -F "file=@backup.dump"

Response:

{
  "success": true,
  "data": {
    "duration": 45000
  },
  "message": "Database imported successfully"
}

Note: The project must be running for database import.

Import Storage

POST /api/v1/projects/:id/import/storage

Permission: backups:write

Import storage files into a project.

Content-Type: multipart/form-data

Parameters:

| Parameter | Type | Required | Description |
|---|---|---|---|
| file | file | Yes | Storage archive (.zip or .tar.gz) |

Supported Formats:

  • .zip - ZIP archive
  • .tar.gz / .tgz - Gzipped tarball

Example:

curl -X POST https://supascale.example.com/api/v1/projects/my-project/import/storage \
  -H "X-API-Key: your-key" \
  -F "file=@storage.zip"

Response:

{
  "success": true,
  "data": {
    "duration": 12000
  },
  "message": "Storage files imported successfully"
}

Import Edge Functions

POST /api/v1/projects/:id/import/functions

Permission: backups:write

Import Edge Functions source code into a project.

Content-Type: multipart/form-data

Parameters:

| Parameter | Type | Required | Description |
|---|---|---|---|
| file | file | Yes | Functions archive (.zip or .tar.gz) |

Archive Structure: The archive should contain function directories at the root level:

functions.tar.gz
├── hello-world/
│   └── index.ts
├── send-email/
│   └── index.ts
└── process-webhook/
    └── index.ts

Example:

# Create archive from supabase functions directory
tar -czf functions.tar.gz -C supabase functions

# Import
curl -X POST https://supascale.example.com/api/v1/projects/my-project/import/functions \
  -H "X-API-Key: your-key" \
  -F "file=@functions.tar.gz"

Response:

{
  "success": true,
  "data": {
    "duration": 3000
  },
  "message": "Edge Functions imported successfully"
}

Lift & Shift Migration (Automated)

POST /api/v1/projects/:id/import/lift-shift

Permission: backups:write

Automatically migrate a Supabase Cloud project to your self-hosted Supascale instance. This endpoint connects directly to your Supabase Cloud project and transfers the selected components.

This is the recommended approach for migrating from Supabase Cloud. It handles database export/import, storage file transfers, and auth provider detection automatically.

Content-Type: application/json

Request Parameters:

| Parameter | Type | Required | Description |
|---|---|---|---|
| sourceUrl | string | Yes | Your Supabase project URL (e.g., https://xxxxx.supabase.co) |
| serviceRoleKey | string | Yes | Service role key from the Supabase dashboard |
| databasePassword | string | Yes | Database password for your Supabase project |
| components | array | Yes | Components to migrate: database, storage, auth |
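Since the migration touches a live source project, it is worth validating the body client-side before sending it. A hypothetical sketch (`validateLiftShift` is illustrative, not an official client):

```javascript
// Sketch: check a lift-shift request body against the required fields
// and the documented component names.
const VALID_COMPONENTS = ['database', 'storage', 'auth'];

function validateLiftShift({ sourceUrl, serviceRoleKey, databasePassword, components }) {
  for (const [name, value] of Object.entries({ sourceUrl, serviceRoleKey, databasePassword })) {
    if (!value) throw new Error(`${name} is required`);
  }
  if (!Array.isArray(components) || components.length === 0) {
    throw new Error('components must be a non-empty array');
  }
  for (const c of components) {
    if (!VALID_COMPONENTS.includes(c)) throw new Error(`unknown component: ${c}`);
  }
  return true;
}
```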

Request:

{
  "sourceUrl": "https://hkajjfcjmpigqaziopzw.supabase.co",
  "serviceRoleKey": "eyJhbGciOi...",
  "databasePassword": "your-db-password",
  "components": ["database", "storage"]
}

Example:

curl -X POST https://supascale.example.com/api/v1/projects/my-project/import/lift-shift \
  -H "X-API-Key: your-key" \
  -H "Content-Type: application/json" \
  -d '{
    "sourceUrl": "https://hkajjfcjmpigqaziopzw.supabase.co",
    "serviceRoleKey": "eyJhbGciOi...",
    "databasePassword": "your-db-password",
    "components": ["database", "storage"]
  }'

Response (SSE Stream):

This endpoint returns a Server-Sent Events (SSE) stream for real-time progress updates.

Event Types:

| Event | Description |
|---|---|
| start | Migration started |
| step | Step progress update |
| warning | Non-fatal warning |
| complete | Migration completed successfully |
| error | Migration failed |

Start Event:

{
  "message": "Starting Lift & Shift migration...",
  "components": ["database", "storage"],
  "targetProject": {
    "id": "my-project",
    "name": "My Project"
  }
}

Step Event:

{
  "step": "database",
  "status": "running",
  "message": "Exporting database from source...",
  "current": 50,
  "total": 100
}

Complete Event:

{
  "success": true,
  "duration": 45000,
  "warnings": [],
  "summary": {
    "database": {
      "tables": 24,
      "size": 52428800
    },
    "storage": {
      "buckets": 3,
      "files": 150,
      "size": 104857600
    }
  }
}

Error Event:

{
  "success": false,
  "error": "Failed to connect to source database",
  "duration": 5000
}

Component Options:

| Component | Description | Requirements |
|---|---|---|
| database | PostgreSQL data, schema, extensions | Project must be running |
| storage | All buckets and files | None |
| auth | Auth provider detection | Manual re-configuration required |

Auth Provider Note: OAuth provider secrets cannot be exported from Supabase Cloud. When migrating auth configuration, you will need to re-enter client secrets and update redirect URIs in each provider's dashboard.

Handling SSE in Code:

const response = await fetch(`/api/v1/projects/${projectId}/import/lift-shift`, {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'X-API-Key': apiKey
  },
  body: JSON.stringify({
    sourceUrl: 'https://xxxxx.supabase.co',
    serviceRoleKey: 'eyJhbGciOi...',
    databasePassword: 'password',
    components: ['database', 'storage']
  })
});

const reader = response.body.getReader();
const decoder = new TextDecoder();

let buffer = '';
let eventType = 'message';

while (true) {
  const { done, value } = await reader.read();
  if (done) break;

  // Buffer chunks: a single SSE event can be split across reads.
  buffer += decoder.decode(value, { stream: true });

  const lines = buffer.split('\n');
  buffer = lines.pop(); // keep any partial line for the next chunk

  for (const line of lines) {
    if (line.startsWith('event: ')) {
      eventType = line.slice(7).trim();
    } else if (line.startsWith('data: ')) {
      const data = JSON.parse(line.slice(6));
      // Handle `eventType` (start, step, warning, complete, error) with `data`
    }
  }
}

Finding Your Credentials:

  1. Project URL: Supabase Dashboard → Settings → API → Project URL
  2. Service Role Key: Supabase Dashboard → Settings → API → service_role key (under "Project API keys")
  3. Database Password: The password you set when creating the project, or reset it in Settings → Database

Migration from Supabase Cloud

There are two ways to migrate from Supabase Cloud to Supascale:

Option 1: Automated Migration (Recommended)

Use the Lift & Shift endpoint for a fully automated migration:

curl -X POST https://supascale.example.com/api/v1/projects/my-project/import/lift-shift \
  -H "X-API-Key: your-key" \
  -H "Content-Type: application/json" \
  -d '{
    "sourceUrl": "https://xxxxx.supabase.co",
    "serviceRoleKey": "your-service-role-key",
    "databasePassword": "your-db-password",
    "components": ["database", "storage"]
  }'

This connects directly to your Supabase Cloud project and migrates the database and storage automatically. See the Lift & Shift Migration section above for details.

Or use the Import / Migrate page in the Supascale web interface, which provides a guided experience with real-time progress tracking.

Option 2: Manual Migration

If you prefer to manually export and import your data:

  1. Export your database:

    pg_dump -h db.xxxxx.supabase.co -U postgres -Fc -f backup.dump
    
  2. Download your storage files from the Supabase dashboard and create a zip archive.

  3. Archive your Edge Functions:

    tar -czf functions.tar.gz -C supabase functions
    
  4. Create a new project in Supascale and start it.

  5. Import each component:

    # Import database
    curl -X POST https://your-supascale/api/v1/projects/my-project/import/database \
      -H "X-API-Key: your-key" \
      -F "file=@backup.dump"
    
    # Import storage
    curl -X POST https://your-supascale/api/v1/projects/my-project/import/storage \
      -H "X-API-Key: your-key" \
      -F "file=@storage.zip"
    
    # Import functions
    curl -X POST https://your-supascale/api/v1/projects/my-project/import/functions \
      -H "X-API-Key: your-key" \
      -F "file=@functions.tar.gz"