Backups API
API endpoints for backup management.
Create, list, and restore backups via the REST API.
Endpoints Overview
| Method | Endpoint | Description |
|---|---|---|
| GET | /backups | List backups |
| POST | /backups | Create backup |
| GET | /backups/:id | Get backup details |
| DELETE | /backups/:id | Delete backup |
| GET | /backups/:id/download | Download backup |
| POST | /backups/:id/restore | Restore backup |
| POST | /projects/:id/import/database | Import database dump |
| POST | /projects/:id/import/storage | Import storage files |
| POST | /projects/:id/import/functions | Import Edge Functions |
| POST | /projects/:id/import/lift-shift | Automated migration from Supabase Cloud |
List Backups
GET /api/v1/backups
Permission: backups:read
Query Parameters:
| Parameter | Description | Example |
|---|---|---|
| projectId | Filter by project | my-project |
| status | Filter by status | completed, failed |
| destination | Filter by destination | local, s3 |
| limit | Results per page | 50 |
| offset | Pagination offset | 0 |
| stats | Include statistics | true |
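Clients often assemble these filters programmatically. A minimal Python sketch (the helper name is ours, not part of any official client) that builds the request URL from the parameters above, dropping any left unset:

```python
from urllib.parse import urlencode

def build_backups_url(base: str, **params) -> str:
    """Assemble a GET /api/v1/backups URL, omitting unset parameters."""
    query = {k: v for k, v in params.items() if v is not None}
    url = f"{base}/api/v1/backups"
    return f"{url}?{urlencode(query)}" if query else url

# Filter to the ten most recent completed backups of one project
print(build_backups_url(
    "https://supascale.example.com",
    projectId="my-project",
    status="completed",
    limit=10,
))
```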
Example:
curl "https://supascale.example.com/api/v1/backups?projectId=my-project&limit=10" \
  -H "X-API-Key: your-key"
Response:
{
"backups": [
{
"id": "backup-123",
"projectId": "my-project",
"projectName": "My Project",
"type": "full",
"status": "completed",
"destination": "s3",
"path": "s3://bucket/my-project/2026-01-19-full.tar.gz",
"size": 52428800,
"encrypted": false,
"createdAt": "2026-01-19T02:00:00Z",
"completedAt": "2026-01-19T02:02:00Z"
}
],
"pagination": {
"total": 45,
"limit": 10,
"offset": 0
}
}
With stats=true:
{
"backups": [...],
"stats": {
"totalBackups": 45,
"totalSize": 2362232012,
"byDestination": {
"local": { "count": 30, "size": 1073741824 },
"s3": { "count": 15, "size": 1288490188 }
}
}
}
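The `size` and `totalSize` fields are raw byte counts. A small Python helper (ours, purely illustrative) that renders them and the per-destination stats in human-readable form:

```python
def human_size(n: float) -> str:
    """Render a byte count like the `size` fields in the responses above."""
    for unit in ("B", "KB", "MB", "GB", "TB"):
        if n < 1024 or unit == "TB":
            return f"{n:.1f} {unit}"
        n /= 1024

# The byDestination object from the stats=true response
by_destination = {
    "local": {"count": 30, "size": 1073741824},
    "s3": {"count": 15, "size": 1288490188},
}
for dest, d in by_destination.items():
    print(f"{dest}: {d['count']} backups, {human_size(d['size'])}")
```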
Create Backup
POST /api/v1/backups
Permission: backups:write
Request:
{
"projectId": "my-project",
"type": "full",
"destination": "s3",
"encrypt": false
}
Backup Types:
| Type | Description |
|---|---|
| full | Complete project backup |
| database | PostgreSQL only |
| storage | File storage only |
| functions | Edge Functions only |
| config | Configuration only |
Destinations:
| Destination | Description |
|---|---|
| local | Local filesystem |
| s3 | AWS S3 |
| gcs | Google Cloud Storage |
| azure | Azure Blob Storage |
| r2 | Cloudflare R2 |
| minio | MinIO |
| backblaze | Backblaze B2 |
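Putting the create-backup request together with Python's standard library looks roughly like this; the helper name is ours, and only the fields documented above are sent:

```python
import json
from urllib import request

def create_backup_request(base: str, api_key: str, project_id: str,
                          backup_type: str = "full", destination: str = "local",
                          encrypt: bool = False) -> request.Request:
    """Build (but do not send) a POST /api/v1/backups request."""
    payload = json.dumps({
        "projectId": project_id,
        "type": backup_type,
        "destination": destination,
        "encrypt": encrypt,
    }).encode()
    return request.Request(
        f"{base}/api/v1/backups",
        data=payload,
        headers={"X-API-Key": api_key, "Content-Type": "application/json"},
        method="POST",
    )

req = create_backup_request("https://supascale.example.com", "your-key",
                            "my-project", destination="s3")
# request.urlopen(req) would perform the call; omitted here.
```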
Response:
{
"success": true,
"backup": {
"id": "backup-456",
"projectId": "my-project",
"type": "full",
"status": "completed",
"destination": "s3",
"path": "s3://bucket/my-project/2026-01-19-full.tar.gz",
"size": 52428800,
"duration": 120,
"createdAt": "2026-01-19T12:00:00Z"
}
}
Get Backup Details
GET /api/v1/backups/:id
Permission: backups:read
Response:
{
"id": "backup-123",
"projectId": "my-project",
"projectName": "My Project",
"type": "full",
"status": "completed",
"destination": "s3",
"path": "s3://bucket/my-project/2026-01-19-full.tar.gz",
"size": 52428800,
"encrypted": false,
"createdAt": "2026-01-19T02:00:00Z",
"completedAt": "2026-01-19T02:02:00Z",
"restoreHistory": [
{
"restoredAt": "2026-01-19T10:00:00Z",
"duration": 180
}
]
}
Delete Backup
DELETE /api/v1/backups/:id
Permission: backups:write
Response:
{
"success": true,
"message": "Backup deleted"
}
Download Backup
GET /api/v1/backups/:id/download
Permission: backups:read
Returns the backup file as a binary download.
Headers:
Content-Type: application/gzip
Content-Disposition: attachment; filename="my-project-2026-01-19-full.tar.gz"
Example:
curl https://supascale.example.com/api/v1/backups/backup-123/download \
  -H "X-API-Key: your-key" \
  -o backup.tar.gz
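When scripting downloads, the filename suggested by the Content-Disposition header is worth honoring. A Python sketch (the helper name is ours) that extracts it, with a fallback for when the header is absent:

```python
import re

def filename_from_disposition(header: str, fallback: str = "backup.tar.gz") -> str:
    """Pull the suggested filename out of a Content-Disposition header value."""
    match = re.search(r'filename="([^"]+)"', header)
    return match.group(1) if match else fallback

header = 'attachment; filename="my-project-2026-01-19-full.tar.gz"'
print(filename_from_disposition(header))
```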
Restore Backup
POST /api/v1/backups/:id/restore
Permission: backups:write
Request:
{
"restoreTypes": ["full"],
"forceRestore": false,
"targetProjectId": "other-project"
}
Parameters:
| Parameter | Type | Required | Description |
|---|---|---|---|
| restoreTypes | array | No | Components to restore (defaults to all) |
| forceRestore | boolean | No | Override safety checks (default: false) |
| targetProjectId | string | No | Restore to a different project (cross-project restore) |
Restore Types:
| Type | Description |
|---|---|
| full | Restore everything |
| database | Database only |
| storage | File storage only |
| functions | Edge Functions only |
| config | Configuration only |
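Since every field in the restore body is optional, a client can simply omit the ones it does not need and let the API apply its documented defaults. A Python sketch (the helper name is ours):

```python
import json

def restore_body(restore_types=None, force=False, target_project=None) -> str:
    """Build the JSON body for POST /api/v1/backups/:id/restore.

    restoreTypes and targetProjectId are omitted when not given, so the
    API falls back to its defaults (restore all components, same project).
    """
    body = {"forceRestore": force}
    if restore_types is not None:
        body["restoreTypes"] = restore_types
    if target_project is not None:
        body["targetProjectId"] = target_project
    return json.dumps(body)

print(restore_body(restore_types=["database", "storage"],
                   target_project="new-project"))
```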
Cross-Project Restore:
To restore a backup to a different project, specify targetProjectId:
curl -X POST https://supascale.example.com/api/v1/backups/backup-123/restore \
-H "X-API-Key: your-key" \
-H "Content-Type: application/json" \
-d '{
"restoreTypes": ["database", "storage"],
"targetProjectId": "new-project"
}'
Note: The target project must be running for database restore.
Response:
{
"success": true,
"restore": {
"backupId": "backup-123",
"projectId": "my-project",
"duration": 180,
"restoredAt": "2026-01-19T12:00:00Z",
"warnings": []
}
}
With warnings:
{
"success": true,
"restore": {
"backupId": "backup-123",
"duration": 180,
"warnings": [
"Some functions could not be restored due to missing dependencies"
]
}
}
Error Responses
Backup Not Found
{
"success": false,
"error": "Backup not found"
}
Status: 404
Project Not Found
{
"success": false,
"error": "Project not found"
}
Status: 400
Backup Failed
{
"success": false,
"error": "Backup failed: Insufficient disk space"
}
Status: 500
Restore Failed
{
"success": false,
"error": "Restore failed: Database connection error"
}
Status: 500
Backup Status Values
| Status | Description |
|---|---|
| pending | Backup queued |
| in_progress | Backup running |
| completed | Backup successful |
| failed | Backup failed |
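A backup observed via GET /backups/:id may still be pending or in_progress, so a client can poll until it reaches a terminal status. A minimal sketch with the HTTP call abstracted behind a fetch_status callable (the helper and the stub below are ours, not part of the API):

```python
import time

TERMINAL_STATUSES = {"completed", "failed"}

def wait_for_backup(fetch_status, interval: float = 2.0, max_polls: int = 30) -> str:
    """Poll a status source until the backup reaches a terminal status."""
    for _ in range(max_polls):
        status = fetch_status()
        if status in TERMINAL_STATUSES:
            return status
        time.sleep(interval)
    raise TimeoutError("backup did not reach a terminal status in time")

# Stub standing in for repeated GET /api/v1/backups/:id calls
states = iter(["pending", "in_progress", "completed"])
print(wait_for_backup(lambda: next(states), interval=0))
```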
Import Endpoints
Use these endpoints to import data from external sources (like Supabase Cloud) into your Supascale projects.
Import Database
POST /api/v1/projects/:id/import/database
Permission: backups:write
Import a PostgreSQL dump file into a project's database.
Content-Type: multipart/form-data
Parameters:
| Parameter | Type | Required | Description |
|---|---|---|---|
| file | file | Yes | Database dump file (.sql, .dump, or .gz) |
Supported Formats:
- `.sql` - Plain SQL dump
- `.dump` - PostgreSQL custom format (pg_dump -Fc)
- `.gz` - Gzipped dump file
Example:
curl -X POST https://supascale.example.com/api/v1/projects/my-project/import/database \
  -H "X-API-Key: your-key" \
  -F "file=@backup.dump"
Response:
{
"success": true,
"data": {
"duration": 45000
},
"message": "Database imported successfully"
}
Note: The project must be running for database import.
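curl's -F flag builds the multipart body for you. If you are scripting without an HTTP library that does, the body can be constructed by hand; a Python sketch using only the standard library (the helper name is ours):

```python
import uuid

def multipart_file_body(field: str, filename: str, content: bytes,
                        content_type: str = "application/octet-stream"):
    """Build a single-file multipart/form-data body.

    Returns (body, content_type_header) ready to send."""
    boundary = uuid.uuid4().hex
    head = (
        f"--{boundary}\r\n"
        f'Content-Disposition: form-data; name="{field}"; filename="{filename}"\r\n'
        f"Content-Type: {content_type}\r\n\r\n"
    ).encode()
    tail = f"\r\n--{boundary}--\r\n".encode()
    return head + content + tail, f"multipart/form-data; boundary={boundary}"

body, ctype = multipart_file_body("file", "backup.dump", b"PGDMP...")
# POST `body` to /api/v1/projects/:id/import/database with the
# Content-Type request header set to `ctype`.
```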
Import Storage
POST /api/v1/projects/:id/import/storage
Permission: backups:write
Import storage files into a project.
Content-Type: multipart/form-data
Parameters:
| Parameter | Type | Required | Description |
|---|---|---|---|
| file | file | Yes | Storage archive (.zip or .tar.gz) |
Supported Formats:
- `.zip` - ZIP archive
- `.tar.gz` / `.tgz` - Gzipped tarball
Example:
curl -X POST https://supascale.example.com/api/v1/projects/my-project/import/storage \
  -H "X-API-Key: your-key" \
  -F "file=@storage.zip"
Response:
{
"success": true,
"data": {
"duration": 12000
},
"message": "Storage files imported successfully"
}
Import Edge Functions
POST /api/v1/projects/:id/import/functions
Permission: backups:write
Import Edge Functions source code into a project.
Content-Type: multipart/form-data
Parameters:
| Parameter | Type | Required | Description |
|---|---|---|---|
| file | file | Yes | Functions archive (.zip or .tar.gz) |
Archive Structure: The archive should contain function directories at the root level:
functions.tar.gz
├── hello-world/
│ └── index.ts
├── send-email/
│ └── index.ts
└── process-webhook/
└── index.ts
Example:
# Create archive from the supabase functions directory
tar -czf functions.tar.gz -C supabase functions

# Import
curl -X POST https://supascale.example.com/api/v1/projects/my-project/import/functions \
  -H "X-API-Key: your-key" \
  -F "file=@functions.tar.gz"
Response:
{
"success": true,
"data": {
"duration": 3000
},
"message": "Edge Functions imported successfully"
}
Lift & Shift Migration (Automated)
POST /api/v1/projects/:id/import/lift-shift
Permission: backups:write
Automatically migrate a Supabase Cloud project to your self-hosted Supascale instance. This endpoint connects directly to your Supabase Cloud project and transfers the selected components.
This is the recommended approach for migrating from Supabase Cloud. It handles database export/import, storage file transfers, and auth provider detection automatically.
Content-Type: application/json
Request Parameters:
| Parameter | Type | Required | Description |
|---|---|---|---|
| sourceUrl | string | Yes | Your Supabase project URL (e.g., https://xxxxx.supabase.co) |
| serviceRoleKey | string | Yes | Service role key from the Supabase dashboard |
| databasePassword | string | Yes | Database password for your Supabase project |
| components | array | Yes | Components to migrate: database, storage, auth |
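Validating the request body on the client before opening the SSE stream can save a round trip. A Python sketch (the helper and the URL pattern check are ours, not enforced by the API):

```python
import re

REQUIRED = ("sourceUrl", "serviceRoleKey", "databasePassword", "components")
ALLOWED_COMPONENTS = {"database", "storage", "auth"}

def lift_shift_problems(req: dict) -> list:
    """Return a list of problems with a lift-shift request body (empty means OK)."""
    problems = [f"missing required parameter: {key}"
                for key in REQUIRED if not req.get(key)]
    url = req.get("sourceUrl", "")
    if url and not re.match(r"^https://[a-z0-9]+\.supabase\.co/?$", url):
        problems.append("sourceUrl should look like https://xxxxx.supabase.co")
    problems += [f"unknown component: {c}" for c in req.get("components", [])
                 if c not in ALLOWED_COMPONENTS]
    return problems

req = {
    "sourceUrl": "https://hkajjfcjmpigqaziopzw.supabase.co",
    "serviceRoleKey": "eyJhbGciOi...",
    "databasePassword": "your-db-password",
    "components": ["database", "storage"],
}
print(lift_shift_problems(req))
```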
Request:
{
"sourceUrl": "https://hkajjfcjmpigqaziopzw.supabase.co",
"serviceRoleKey": "eyJhbGciOi...",
"databasePassword": "your-db-password",
"components": ["database", "storage"]
}
Example:
curl -X POST https://supascale.example.com/api/v1/projects/my-project/import/lift-shift \
-H "X-API-Key: your-key" \
-H "Content-Type: application/json" \
-d '{
"sourceUrl": "https://hkajjfcjmpigqaziopzw.supabase.co",
"serviceRoleKey": "eyJhbGciOi...",
"databasePassword": "your-db-password",
"components": ["database", "storage"]
}'
Response (SSE Stream):
This endpoint returns a Server-Sent Events (SSE) stream for real-time progress updates.
Event Types:
| Event | Description |
|---|---|
| start | Migration started |
| step | Step progress update |
| warning | Non-fatal warning |
| complete | Migration completed successfully |
| error | Migration failed |
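On the consuming side, the raw stream can be split into (event, data) pairs. A minimal Python parser for the event format above (the helper name is ours; it assumes each event is terminated by a blank line, per the SSE format):

```python
import json

def parse_sse(text: str):
    """Parse a raw SSE stream into (event_name, data) pairs."""
    events = []
    name, data_lines = "message", []
    for line in text.splitlines() + [""]:
        if line.startswith("event: "):
            name = line[7:]
        elif line.startswith("data: "):
            data_lines.append(line[6:])
        elif line == "" and data_lines:
            # Blank line ends the event; data lines join with newlines
            events.append((name, json.loads("\n".join(data_lines))))
            name, data_lines = "message", []
    return events

stream = (
    'event: step\n'
    'data: {"step": "database", "status": "running"}\n\n'
    'event: complete\n'
    'data: {"success": true}\n\n'
)
for name, data in parse_sse(stream):
    print(name, data)
```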
Start Event:
{
"message": "Starting Lift & Shift migration...",
"components": ["database", "storage"],
"targetProject": {
"id": "my-project",
"name": "My Project"
}
}
Step Event:
{
"step": "database",
"status": "running",
"message": "Exporting database from source...",
"current": 50,
"total": 100
}
Complete Event:
{
"success": true,
"duration": 45000,
"warnings": [],
"summary": {
"database": {
"tables": 24,
"size": 52428800
},
"storage": {
"buckets": 3,
"files": 150,
"size": 104857600
}
}
}
Error Event:
{
"success": false,
"error": "Failed to connect to source database",
"duration": 5000
}
Component Options:
| Component | Description | Requirements |
|---|---|---|
| database | PostgreSQL data, schema, extensions | Project must be running |
| storage | All buckets and files | None |
| auth | Auth provider detection | Manual re-configuration required |
Auth Provider Note: OAuth provider secrets cannot be exported from Supabase Cloud. When migrating auth configuration, you will need to re-enter client secrets and update redirect URIs in each provider's dashboard.
Handling SSE in Code:
const response = await fetch(`/api/v1/projects/${projectId}/import/lift-shift`, {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'X-API-Key': apiKey
  },
  body: JSON.stringify({
    sourceUrl: 'https://xxxxx.supabase.co',
    serviceRoleKey: 'eyJhbGciOi...',
    databasePassword: 'password',
    components: ['database', 'storage']
  })
});

const reader = response.body.getReader();
const decoder = new TextDecoder();
let buffer = '';

while (true) {
  const { done, value } = await reader.read();
  if (done) break;

  // Accumulate chunks: an SSE line can be split across reads,
  // so decode in streaming mode and keep any trailing partial line
  buffer += decoder.decode(value, { stream: true });
  const lines = buffer.split('\n');
  buffer = lines.pop();

  for (const line of lines) {
    if (line.startsWith('event: ')) {
      const event = line.slice(7);
      // Handle event type
    } else if (line.startsWith('data: ')) {
      const data = JSON.parse(line.slice(6));
      // Handle event data
    }
  }
}
Finding Your Credentials:
- Project URL: Supabase Dashboard → Settings → API → Project URL
- Service Role Key: Supabase Dashboard → Settings → API → `service_role` key (under "Project API keys")
- Database Password: The password you set when creating the project, or reset it in Settings → Database
Migration from Supabase Cloud
There are two ways to migrate from Supabase Cloud to Supascale:
Option 1: Automated Migration (Recommended)
Use the Lift & Shift endpoint for a fully automated migration:
curl -X POST https://supascale.example.com/api/v1/projects/my-project/import/lift-shift \
-H "X-API-Key: your-key" \
-H "Content-Type: application/json" \
-d '{
"sourceUrl": "https://xxxxx.supabase.co",
"serviceRoleKey": "your-service-role-key",
"databasePassword": "your-db-password",
"components": ["database", "storage"]
}'
This connects directly to your Supabase Cloud project and migrates the database and storage automatically. See the Lift & Shift Migration section above for details.
Or use the Import / Migrate page in the Supascale web interface, which provides a guided experience with real-time progress tracking.
Option 2: Manual Migration
If you prefer to manually export and import your data:
1. Export your database:

pg_dump -h db.xxxxx.supabase.co -U postgres -d postgres -Fc -f backup.dump

2. Download your storage files from the Supabase dashboard and create a zip archive.

3. Archive your Edge Functions:

tar -czf functions.tar.gz -C supabase functions

4. Create a new project in Supascale and start it.

5. Import each component:

# Import database
curl -X POST https://your-supascale/api/v1/projects/my-project/import/database \
  -H "X-API-Key: your-key" \
  -F "file=@backup.dump"

# Import storage
curl -X POST https://your-supascale/api/v1/projects/my-project/import/storage \
  -H "X-API-Key: your-key" \
  -F "file=@storage.zip"

# Import functions
curl -X POST https://your-supascale/api/v1/projects/my-project/import/functions \
  -H "X-API-Key: your-key" \
  -F "file=@functions.tar.gz"