Firebase to Supabase

Difficulty: Medium · Time: ~3 hours · Category: Backend

Why Switch?

                Firebase               Supabase
Database        NoSQL (Firestore)      PostgreSQL
Auth            Firebase Auth          Supabase Auth
Storage         Firebase Storage       Supabase Storage
Pricing         Pay-per-read/write     Predictable tiers
Open Source     No                     Yes

What You'll Need

- A Firebase project you can access with admin (service account) credentials
- A free Supabase account
- Node.js installed locally for the export and import scripts
- The Google Cloud CLI (gsutil) if you use Firebase Storage

Step-by-Step Migration

1. Audit your Firebase project

Before touching any code, make a full inventory of what you're using in Firebase. Open the Firebase console and document:

- Firestore: List every collection and its document structure. Note any nested sub-collections - these will need special handling.
- Authentication: Which providers are enabled? (Email/password, Google, GitHub, etc.)
- Storage: How many buckets? Roughly how much data?
- Cloud Functions: Any server-side logic that needs to be migrated to Supabase Edge Functions?

This inventory becomes your migration checklist. Don't skip it.
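To speed up the Firestore part of the audit, a small Admin SDK script can list every top-level collection and sketch the field types of a sample document. This is a sketch, not official tooling; it assumes a serviceAccount.json key file in the working directory:

```javascript
// audit-firestore.js: list top-level collections and sample field types (step 1)
function summarizeDoc(data) {
  // Map each field name to its JS type, e.g. { email: 'string', age: 'number' }.
  return Object.fromEntries(
    Object.entries(data).map(([key, value]) => [key, value === null ? 'null' : typeof value])
  );
}

async function audit() {
  const admin = require('firebase-admin'); // required lazily so summarizeDoc stays standalone
  admin.initializeApp({ credential: admin.credential.cert('./serviceAccount.json') });
  const db = admin.firestore();
  for (const col of await db.listCollections()) {
    const snap = await col.limit(1).get();
    const shape = snap.empty ? {} : summarizeDoc(snap.docs[0].data());
    console.log(`${col.id}:`, JSON.stringify(shape));
  }
}
// Call audit() to print one line per collection.
```

Note that this only samples one document per collection, so fields that appear on some documents but not others can be missed; spot-check a few documents by hand as well.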

2. Create a Supabase project

Head to supabase.com and create a free account. Click "New Project" and choose a region close to your users. Pick a strong database password and save it somewhere secure.

Once created, note these two values from your project settings - you'll need them everywhere:

SUPABASE_URL=https://xxxx.supabase.co
SUPABASE_ANON_KEY=eyJhbG...

3. Design your Postgres schema

This is the biggest conceptual shift. Firestore stores nested, denormalized JSON documents. Postgres uses normalized relational tables.

For example, if your Firestore has a users collection with a posts sub-collection, that becomes two tables:

CREATE TABLE users (
  id UUID DEFAULT gen_random_uuid() PRIMARY KEY,
  email TEXT UNIQUE NOT NULL,
  name TEXT,
  created_at TIMESTAMPTZ DEFAULT now()
);

CREATE TABLE posts (
  id UUID DEFAULT gen_random_uuid() PRIMARY KEY,
  user_id UUID REFERENCES users(id) ON DELETE CASCADE,
  title TEXT NOT NULL,
  content TEXT,
  created_at TIMESTAMPTZ DEFAULT now()
);

Run these SQL statements in the Supabase SQL Editor (Dashboard → SQL Editor). Design all your tables before importing data.

4. Export and migrate data

Export your Firestore data using the Firebase Admin SDK. Create a small Node.js script:

// export-firestore.js
const admin = require('firebase-admin');
const fs = require('fs');

admin.initializeApp({ credential: admin.credential.cert('./serviceAccount.json') });
const db = admin.firestore();

async function exportCollection(name) {
  const snapshot = await db.collection(name).get();
  const data = snapshot.docs.map(doc => ({ id: doc.id, ...doc.data() }));
  fs.writeFileSync(`${name}.json`, JSON.stringify(data, null, 2));
  console.log(`Exported ${data.length} docs from ${name}`);
}

exportCollection('users');
exportCollection('posts');

Then write a matching import script using the Supabase JS client to insert the data into your new tables. Transform any Firestore-specific types (timestamps, references) into Postgres-compatible formats.
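A minimal sketch of that import script. One assumption to flag: Firestore document IDs are short random strings, not UUIDs, so they cannot go straight into the UUID primary key above. This sketch assumes you add a `firebase_id TEXT` column to each table to preserve the original ID (so posts.user_id can be remapped afterwards), and it uses the service role key rather than the anon key so inserts are not blocked by RLS:

```javascript
// import-to-supabase.js: push the exported JSON into Postgres
function toIso(ts) {
  // Firestore Timestamps serialize as { _seconds, _nanoseconds } in the export.
  if (ts && typeof ts._seconds === 'number') {
    return new Date(ts._seconds * 1000 + Math.round(ts._nanoseconds / 1e6)).toISOString();
  }
  return ts ?? null; // already a string, or absent
}

function transformUser(doc) {
  // Keep the original Firestore ID in firebase_id so child rows can be remapped.
  return {
    firebase_id: doc.id,
    email: doc.email,
    name: doc.name ?? null,
    created_at: toIso(doc.created_at),
  };
}

async function importTable(supabase, table, rows) {
  // Insert in batches of 500 to keep each request payload small.
  for (let i = 0; i < rows.length; i += 500) {
    const { error } = await supabase.from(table).insert(rows.slice(i, i + 500));
    if (error) throw error;
  }
}
// Usage: importTable(createClient(SUPABASE_URL, SERVICE_ROLE_KEY), 'users',
//                    JSON.parse(fs.readFileSync('users.json')).map(transformUser));
```

After importing users, query back the generated UUIDs alongside firebase_id and use that mapping to fill user_id when transforming posts.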

5. Switch authentication

Replace the Firebase Auth SDK with Supabase Auth. Both support email/password, Google OAuth, and GitHub OAuth out of the box.

Install the Supabase client:

npm install @supabase/supabase-js

Then swap your auth calls:

// Before (Firebase)
import { getAuth, signInWithEmailAndPassword } from 'firebase/auth';
const auth = getAuth();
await signInWithEmailAndPassword(auth, email, password);

// After (Supabase)
import { createClient } from '@supabase/supabase-js';
const supabase = createClient(SUPABASE_URL, SUPABASE_ANON_KEY);
await supabase.auth.signInWithPassword({ email, password });

For OAuth providers, enable them in Supabase Dashboard → Authentication → Providers, then add your OAuth client ID and secret.
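Once a provider is enabled, starting the flow is a single call in supabase-js v2. A sketch, with the `redirectTo` URL standing in for your own callback route:

```javascript
// oauth-signin.js: start a GitHub OAuth flow with supabase-js v2
async function signInWithGitHub(supabase, redirectTo) {
  // supabase is a client from createClient(SUPABASE_URL, SUPABASE_ANON_KEY).
  const { data, error } = await supabase.auth.signInWithOAuth({
    provider: 'github',
    options: { redirectTo }, // where Supabase sends the browser after the provider callback
  });
  if (error) throw error;
  return data; // includes the provider URL the browser is redirected to
}
```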

6. Migrate file storage

Download all files from Firebase Storage using gsutil (Google Cloud CLI):

gsutil -m cp -r gs://your-project.appspot.com ./firebase-files

Create a storage bucket in Supabase (Dashboard → Storage → New Bucket), then upload files using the Supabase client or dashboard. Update all file URL references in your application code and database records to point to the new Supabase Storage URLs.

7. Update your application code

Replace all Firebase SDK imports with Supabase equivalents. The biggest change is query syntax:

// Before (Firestore)
const snapshot = await db.collection('posts')
  .where('user_id', '==', userId)
  .orderBy('created_at', 'desc')
  .limit(10)
  .get();
const posts = snapshot.docs.map(doc => doc.data());

// After (Supabase)
const { data: posts } = await supabase
  .from('posts')
  .select('*')
  .eq('user_id', userId)
  .order('created_at', { ascending: false })
  .limit(10);

Supabase's query builder is chainable and returns plain objects - no more .data() calls on document snapshots.

8. Test everything

Run your application locally and test every flow end-to-end:

- Auth: Sign up, sign in, sign out, password reset, OAuth flows.
- Database: Create, read, update, delete operations on every table.
- Storage: File uploads, downloads, and deletions.
- Real-time: If you use Firestore real-time listeners, verify Supabase realtime subscriptions work correctly.

Don't forget to set up Row Level Security (RLS) policies before going to production. Unlike Firebase Security Rules, Supabase RLS is disabled by default - your data is open until you configure it.
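As a starting point, here is a hedged example of RLS policies for the posts table from step 3. It assumes users.id mirrors the Supabase Auth user id (what auth.uid() returns); adjust if your schema differs:

```sql
-- Enable RLS, allow anyone to read posts, but restrict writes to the owner.
ALTER TABLE posts ENABLE ROW LEVEL SECURITY;

CREATE POLICY "Posts are readable by everyone"
  ON posts FOR SELECT USING (true);

CREATE POLICY "Users can insert their own posts"
  ON posts FOR INSERT WITH CHECK (auth.uid() = user_id);

CREATE POLICY "Users can update their own posts"
  ON posts FOR UPDATE USING (auth.uid() = user_id);

CREATE POLICY "Users can delete their own posts"
  ON posts FOR DELETE USING (auth.uid() = user_id);
```

Once RLS is enabled, a table with no matching policy rejects every request through the API, so test each policy from the client before launch.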

Common Gotchas

Real-time API differences

Firestore's real-time listeners (onSnapshot) map to Supabase's realtime subscriptions, but the API is completely different. You'll need to rewrite listener logic, not just rename imports.
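For example, a Firestore onSnapshot on posts maps roughly to this supabase-js v2 subscription. Realtime must also be enabled for the table (Dashboard → Database → Replication), and the channel name here is arbitrary:

```javascript
// realtime-posts.js: replace a Firestore onSnapshot listener with a
// Supabase realtime subscription (supabase-js v2 API)
function handleChange(payload) {
  // payload.eventType is 'INSERT' | 'UPDATE' | 'DELETE';
  // payload.new holds the changed row (empty object for DELETE).
  return { event: payload.eventType, row: payload.new };
}

function subscribeToPosts(supabase, onRow) {
  return supabase
    .channel('posts-changes')
    .on(
      'postgres_changes',
      { event: '*', schema: 'public', table: 'posts' },
      (payload) => onRow(handleChange(payload))
    )
    .subscribe();
}
// Usage: const channel = subscribeToPosts(supabase, console.log);
// Later: supabase.removeChannel(channel);
```

Unlike onSnapshot, the subscription delivers individual change events rather than a full query snapshot, so client code that rebuilt its state from each snapshot needs restructuring around per-row updates.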

Row Level Security is off by default

Supabase RLS must be explicitly configured. Unlike Firestore rules which block everything by default, Supabase tables are publicly accessible until you enable and configure RLS policies.

Sub-collections need flattening

Firestore's nested sub-collections don't have a direct Postgres equivalent. Each sub-collection becomes a separate table with a foreign key reference to the parent. Plan your schema carefully.
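As a sketch of that flattening, assuming your export nests each user's posts as an array on the user object:

```javascript
// flatten-subcollections.js: turn a nested Firestore export (users, each with
// a posts array) into two flat row arrays joined by a foreign key
function flatten(users) {
  const userRows = [];
  const postRows = [];
  for (const user of users) {
    const { posts = [], ...fields } = user; // strip the sub-collection off the parent
    userRows.push(fields);
    for (const post of posts) {
      postRows.push({ ...post, user_id: user.id }); // foreign key back to the parent
    }
  }
  return { userRows, postRows };
}
```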

Need help migrating?

I'll handle the Firebase to Supabase switch for you. Schema design, data migration, and code updates - all done.

Work with me →
