One of the most common questions from developers running self-hosted Supabase is: "How do I set up a proper development workflow?" Unlike Supabase Cloud users who can rely on the integrated branching features, self-hosted deployments require a more hands-on approach to managing local development, schema migrations, and deployments to production.
This guide walks you through setting up a professional local development workflow that mirrors what you would get with a managed service, while maintaining the control and cost benefits of self-hosting.
Why Local Development Matters for Self-Hosted Supabase
When you self-host Supabase, you gain full control over your data and infrastructure. But that control comes with responsibility. Making schema changes directly on your production database is a recipe for disaster. You need a proper workflow that lets you:
- Develop and test changes locally before touching production
- Version control your database schema alongside your application code
- Collaborate with team members without stepping on each other's changes
- Roll back problematic migrations when things go wrong
The good news is that the Supabase CLI works seamlessly with self-hosted instances. You just need to know how to configure it properly.
Prerequisites
Before diving in, make sure you have:
- A running self-hosted Supabase instance (see our deployment guide if you need help setting this up)
- Docker installed on your development machine
- The Supabase CLI installed (npm install -g supabase, or via Homebrew)
- Your production database connection string
Setting Up Your Local Development Environment
Step 1: Initialize Your Project
Start by creating a new directory for your project (or navigating to an existing one) and initialize Supabase:
mkdir my-project && cd my-project
supabase init
This creates a supabase directory with a config.toml file. This configuration controls your local development stack.
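The generated file looks roughly like this (an excerpt; the port numbers match the standard local stack defaults, but check your own config.toml, as the available keys vary by CLI version):

```toml
# supabase/config.toml (excerpt) -- your generated file contains many more sections
[api]
port = 54321

[db]
port = 54322

[studio]
port = 54323
```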
Step 2: Start the Local Stack
Run the local Supabase stack:
supabase start
This spins up a complete Supabase environment locally: PostgreSQL, Auth, Storage, Realtime, Edge Functions, and Studio. The first run takes a few minutes as it pulls the required Docker images.
Once started, you will see output showing your local URLs and credentials:
Started supabase local development setup.
API URL: http://localhost:54321
GraphQL URL: http://localhost:54321/graphql/v1
DB URL: postgresql://postgres:postgres@localhost:54322/postgres
Studio URL: http://localhost:54323
Inbucket URL: http://localhost:54324
JWT secret: super-secret-jwt-token-with-at-least-32-characters
anon key: eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9...
service_role key: eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9...
Bookmark the Studio URL (http://localhost:54323) for visual database management during development.
Step 3: Configure Connection to Your Self-Hosted Instance
For self-hosted Supabase, you will not use supabase link --project-ref like you would with Supabase Cloud. Instead, you will use the --db-url flag when running commands against your remote database.
Store your production database URL as an environment variable:
export SUPABASE_DB_URL="postgresql://postgres:your-password@your-server:5432/postgres"
For team environments, add this to your shell profile, or use a tool like direnv to manage environment variables per project.
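With direnv, a project-local .envrc keeps the connection string out of your global shell profile. A minimal sketch (the host and password are placeholders):

```shell
# .envrc -- direnv loads this automatically when you cd into the project
# (run `direnv allow` once to approve it)
export SUPABASE_DB_URL="postgresql://postgres:your-password@your-server:5432/postgres"
```

Remember to add .envrc to .gitignore if it contains real credentials.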
The Migration Workflow
Now comes the important part: managing schema changes through migrations.
Creating Migrations
You have two approaches for creating migrations:
Approach 1: Dashboard First (Visual)
Make your schema changes in the local Studio dashboard, then capture them as a migration:
supabase db diff -f add_users_table
This generates a migration file in supabase/migrations/ containing the SQL to recreate your changes.
Approach 2: SQL First (Code)
Create a blank migration and write the SQL directly:
supabase migration new add_users_table
Then edit the generated file in supabase/migrations/[timestamp]_add_users_table.sql:
CREATE TABLE public.users (
  id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
  email TEXT UNIQUE NOT NULL,
  display_name TEXT,
  created_at TIMESTAMPTZ DEFAULT NOW()
);

ALTER TABLE public.users ENABLE ROW LEVEL SECURITY;

CREATE POLICY "Users can read own data"
  ON public.users FOR SELECT
  USING (auth.uid() = id);
Testing Migrations Locally
After creating a migration, test it by resetting your local database:
supabase db reset
This drops your local database, reapplies all migrations from scratch, and runs your seed file if you have one. If there are errors, you will catch them here rather than in production.
Pushing to Your Self-Hosted Instance
Once your migrations work locally, push them to your self-hosted Supabase:
supabase db push --db-url "$SUPABASE_DB_URL"
The CLI tracks which migrations have been applied using the supabase_migrations.schema_migrations table. Only new migrations are applied.
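You can inspect that tracking table directly with psql against your remote database to see what the CLI believes has been applied:

```sql
-- List applied migration versions, newest last
SELECT version
FROM supabase_migrations.schema_migrations
ORDER BY version;
```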
Database Seeding for Reproducible Environments
Seed files populate your database with initial data. Create a seed file at supabase/seed.sql:
-- Only include data, not schema (schema is handled by migrations)
INSERT INTO public.users (id, email, display_name)
VALUES
('d0fc4c64-a3d6-4e9c-8b5e-1234567890ab', '[email protected]', 'Test User');
Seeds run automatically on supabase start (first run) and supabase db reset.
For different environments, you can use conditional logic:
DO $$
BEGIN
-- Only seed in development
IF current_setting('app.environment', true) = 'development' THEN
INSERT INTO public.users (email, display_name)
VALUES ('[email protected]', 'Dev User');
END IF;
END $$;
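Note that app.environment is not defined by default; current_setting('app.environment', true) returns NULL until you set it. One way to define it is a database-level default, run once per environment:

```sql
-- New connections to this database will see the setting
ALTER DATABASE postgres SET app.environment = 'development';
```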
Multi-Environment Workflow
Most teams work with multiple environments: local, staging, and production. Here is how to manage them.
Environment Variables
Create a .env file (git-ignored) for your connection strings:
LOCAL_DB_URL="postgresql://postgres:postgres@localhost:54322/postgres"
STAGING_DB_URL="postgresql://postgres:your-password@staging-host:5432/postgres"
PROD_DB_URL="postgresql://postgres:your-password@prod-host:5432/postgres"
Deployment Commands
# Test locally
supabase db reset

# Deploy to staging
supabase db push --db-url "$STAGING_DB_URL"

# Deploy to production (after staging verification)
supabase db push --db-url "$PROD_DB_URL"
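If you deploy by hand often, a small wrapper can keep the environment-to-URL mapping in one place and guard against accidental production pushes. A sketch (the deploy function and the STAGING_DB_URL/PROD_DB_URL variable names follow this guide's conventions, not a standard tool):

```shell
#!/usr/bin/env bash
# Hypothetical deploy helper: picks the connection string for a named
# environment and refuses to push to prod without an explicit --yes flag.
deploy() {
  local target="$1" confirm="${2:-}" db_url
  case "$target" in
    staging) db_url="${STAGING_DB_URL:-}" ;;
    prod)
      if [ "$confirm" != "--yes" ]; then
        echo "Refusing to deploy to prod without --yes" >&2
        return 1
      fi
      db_url="${PROD_DB_URL:-}"
      ;;
    *)
      echo "Usage: deploy staging|prod [--yes]" >&2
      return 1
      ;;
  esac
  supabase db push --db-url "$db_url"
}
```

With this in place, `deploy prod` alone prints the refusal and exits nonzero; only `deploy prod --yes` actually pushes.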
CI/CD Integration
For automated deployments, add migration steps to your CI/CD pipeline. Here is a GitHub Actions example:
name: Deploy Migrations
on:
push:
branches: [main]
paths:
- 'supabase/migrations/**'
jobs:
deploy:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Setup Supabase CLI
uses: supabase/setup-cli@v1
- name: Push to Staging
run: supabase db push --db-url "${{ secrets.STAGING_DB_URL }}"
- name: Run Tests
run: npm test
- name: Push to Production
run: supabase db push --db-url "${{ secrets.PROD_DB_URL }}"
For more advanced CI/CD setups, see our CI/CD pipelines guide.
Handling Schema Drift
Schema drift happens when someone makes changes directly to the remote database without going through migrations. This is dangerous but sometimes unavoidable.
Detecting Drift
Compare your migration state against the remote database:
supabase db diff --db-url "$SUPABASE_DB_URL"
If output appears, your remote database has diverged from your migrations.
Recovering from Drift
Option 1: Capture the drift as a new migration
supabase db diff --db-url "$SUPABASE_DB_URL" -f sync_remote_changes
Review the generated migration carefully, then commit it.
Option 2: Reset from clean state (staging/dev only)
If you are working with a staging environment, you might choose to wipe it and reapply all migrations:
# WARNING: This destroys all data
psql "$STAGING_DB_URL" -c "DROP SCHEMA public CASCADE; CREATE SCHEMA public;"
supabase db push --db-url "$STAGING_DB_URL"
Never do this on production without a backup.
Best Practices
Version Control Everything
Your supabase/migrations/ directory should be committed to git. This gives you:
- Full history of schema changes
- Ability to roll back by reverting commits
- Code review for database changes
- Consistent environments across your team
Never Modify Existing Migrations
Once a migration is applied to any environment, treat it as immutable. If you need to fix something, create a new migration:
# Wrong: editing 20240224_add_users.sql after it's applied
# Right: creating a new migration
supabase migration new fix_users_table
Back Up Before Deploying
Always back up your database before applying migrations to production. With Supascale, you can create on-demand backups with a single click or API call.
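If your instance is not managed through such a tool, a timestamped pg_dump before each production push is the minimum. A sketch (SUPABASE_DB_URL is the variable from earlier in this guide; the pg_dump call is guarded so the snippet is a no-op when the URL is unset):

```shell
# Timestamped, custom-format backup taken right before a migration run.
STAMP=$(date +%Y%m%d_%H%M%S)
BACKUP_FILE="pre_migration_${STAMP}.dump"

# Custom format (--format=custom) lets pg_restore replay objects selectively.
if [ -n "${SUPABASE_DB_URL:-}" ]; then
  pg_dump "$SUPABASE_DB_URL" --format=custom --file="$BACKUP_FILE"
fi

# To roll back after a bad migration:
#   pg_restore --clean --dbname "$SUPABASE_DB_URL" "$BACKUP_FILE"
```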
Test with Production-Like Data
Your local seed file should include realistic test data. Consider anonymizing a subset of production data for staging environments.
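A minimal anonymization pass might look like the following, run against a staging copy only (the column names match the users table from earlier in this guide; adapt them to your schema):

```sql
-- Replace personally identifiable fields with placeholders derived from the row id
UPDATE public.users
SET email        = 'user-' || left(id::text, 8) || '@example.com',
    display_name = 'User ' || left(id::text, 8);
```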
How Supascale Simplifies This Workflow
While the CLI gives you full control over migrations, managing backups and deployments across multiple self-hosted instances can still be tedious. Supascale streamlines the operational side:
- Automated Backups: Schedule S3-compatible backups so you always have a restore point before migrations
- One-Click Restore: If a migration causes issues, restore to a previous state without command-line gymnastics
- Multi-Project Management: Manage staging and production instances from a single dashboard
- Environment Variables UI: Configure your environment variables without editing Docker Compose files
The CLI handles your development workflow. Supascale handles the operational burden so you can focus on building.
Troubleshooting Common Issues
"Migration has already been applied"
This means the migration exists in your supabase_migrations.schema_migrations table. If you need to rerun it:
-- Remove the migration record (careful!)
DELETE FROM supabase_migrations.schema_migrations
WHERE version = '20240224123456';
Then run supabase db push again.
"Connection refused" when pushing
Verify your database URL and check that:
- Port 5432 (or your configured port) is accessible
- Your IP is allowed in any firewall rules
- The password does not contain special characters that need escaping
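For the special-character case, percent-encode the password before placing it in the URL. One way, using Python's standard library (the sample password here is made up):

```shell
# '@', ':' and '/' in a raw password will break URL parsing unless encoded
PASSWORD='p@ss:w/rd'
ENCODED=$(python3 -c 'import sys, urllib.parse; print(urllib.parse.quote(sys.argv[1], safe=""))' "$PASSWORD")
echo "$ENCODED"   # p%40ss%3Aw%2Frd
export SUPABASE_DB_URL="postgresql://postgres:${ENCODED}@your-server:5432/postgres"
```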
Docker errors on supabase start
Reset your local Docker state:
supabase stop --no-backup
docker system prune -f
supabase start
Conclusion
A proper local development workflow transforms self-hosted Supabase from a manual, error-prone setup into a professional development environment. By using the Supabase CLI with your self-hosted instance, you get version-controlled migrations, reproducible environments, and confidence that what works locally will work in production.
The key is discipline: make changes locally first, test thoroughly, and use migrations for all schema changes. Combined with proper backup procedures, you will have a development workflow that rivals any managed service.
Ready to simplify the operational side of self-hosted Supabase? Check out Supascale's features or see our pricing for unlimited projects at a one-time cost.
