How to import CSV to Supabase (beyond the 100MB limit)

Supabase is great. Its CSV import isn't.
The Supabase dashboard provides a quick way to import small datasets, but it has a hard 100MB limit. Files larger than that won't upload. And even files under 100MB can time out if they have hundreds of thousands of rows.
This guide shows you how to import CSVs of any size to Supabase, whether you're dealing with 100MB, 1GB, or larger.
The problem with Supabase's built-in CSV import
According to the Supabase documentation:
"Supabase dashboard provides a user-friendly way to import data. However, for very large datasets, this method may not be the most efficient choice, given the size limit is 100MB."
That's the official limit. But there are other issues that make the dashboard unreliable for production data:
- 100MB file size limit: Files over 100MB won't upload at all
- 1-minute timeout in SQL Editor: The dashboard timeout kills long-running imports
- No column mapping: Headers must match table columns exactly or the import fails
- No data validation: Type mismatches cause silent failures or infinite loading (GitHub Issue #13608)
- Dashboard breaks at scale: Users on Reddit report the dashboard becomes unstable around 250-450k records
If you're importing data for a demo or quick test, the dashboard works. If you're importing real production data from users, you need something more robust.
Three options for importing large CSVs to Supabase
Option 1: PostgreSQL COPY command
The fastest approach if you have direct database access:
psql "postgresql://postgres:[password]@[host]:5432/postgres" \
-c "\COPY your_table FROM '/path/to/data.csv' WITH CSV HEADER"This bypasses the dashboard entirely and uses PostgreSQL's native import. It's fast, but requires CLI access and technical knowledge. Not something you can hand off to end users.
Option 2: Batch inserts via Supabase API
For programmatic imports, split your data into chunks:
```js
// chunkArray is a helper you write yourself; it splits rows into fixed-size batches
const chunkedRows = chunkArray(rows, 500);

for (const chunk of chunkedRows) {
  const { error } = await supabase
    .from('your_table')
    .insert(chunk);

  // Stop at the first failed batch
  if (error) throw error;
}
```

This approach was shared in a Supabase discussion where a developer imported 23,000 rows in 36 seconds using 500-row batches.
The downside: you have to write and maintain the chunking logic, handle errors, and deal with column mapping yourself.
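In practice, "handle errors" usually means retrying failed batches rather than aborting a long import on the first transient failure. Here's a minimal sketch of that piece, assuming the supabase-js client; the insertWithRetry helper and its defaults are illustrative, not part of any library:

```typescript
import type { SupabaseClient } from '@supabase/supabase-js';

// Retry each batch a few times so one transient network error
// doesn't abort a long-running import.
async function insertWithRetry(
  supabase: SupabaseClient,
  table: string,
  chunk: Record<string, unknown>[],
  maxAttempts = 3
): Promise<void> {
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    const { error } = await supabase.from(table).insert(chunk);
    if (!error) return;
    if (attempt === maxAttempts) {
      throw new Error(`Batch failed after ${maxAttempts} attempts: ${error.message}`);
    }
    // Simple linear backoff before the next attempt
    await new Promise((resolve) => setTimeout(resolve, attempt * 1000));
  }
}
```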
Option 3: ImportCSV with Supabase destination
ImportCSV handles chunked uploads, column mapping, and validation automatically. Connect your Supabase project once, then give end users an importer they can use directly.
| Feature | Supabase Dashboard | ImportCSV |
|---|---|---|
| File size limit | 100MB | No limit (chunked) |
| Column mapping | No (exact match required) | Yes (AI-assisted) |
| Data validation | No (fails silently) | Yes (real-time feedback) |
| End-user facing | No (admin only) | Yes (embeddable widget) |
| Timeout handling | 1-minute limit | Chunked, resumable |
How to connect ImportCSV to Supabase
Step 1: Get your Supabase credentials
From your Supabase dashboard, go to Settings > API and copy:
- Project URL: The URL of your Supabase project
- Anon/public key: For client-side imports (or service role key for server-side)
Step 2: Add Supabase as a destination in ImportCSV
In your ImportCSV dashboard, go to Destinations > Add Destination > Supabase. Paste your project URL and API key.
Step 3: Create an importer with your table schema
Define the columns that map to your Supabase table. ImportCSV will auto-detect column types and suggest mappings when users upload a file.
Step 4: Embed the importer in your app
```jsx
import { CSVImporter } from '@importcsv/react';

function DataImporter() {
  return (
    <CSVImporter
      importerKey="YOUR_IMPORTER_KEY"
      onComplete={(data) => {
        // Data is already in Supabase
        console.log(`Imported ${data.rowCount} rows`);
      }}
      onError={(error) => {
        console.error('Import failed:', error);
      }}
    />
  );
}
```

The importer handles:
- Chunked uploads for files of any size
- AI-powered column mapping (95% accuracy)
- Inline error fixing without re-uploading
- Progress tracking for long imports
When to use each approach
Use the Supabase dashboard when you're importing small files (under 100MB) as an admin and you control the CSV format.
Use PostgreSQL COPY when you need maximum speed, have database access, and the import is a one-time migration.
Use batch inserts when you're building a custom import flow and want full control over the process.
Use ImportCSV when you need to give end users a way to import their own data, you're dealing with files over 100MB, or you want validation and column mapping without writing custom code.
Troubleshooting Supabase CSV imports
Import stuck on "loading"
This usually means a data type mismatch. For example, a string value in a column defined as float4. The Supabase dashboard doesn't show validation errors - it just spins forever.
Fix: Check that your CSV data types match your table schema. ImportCSV validates data types before sending to Supabase and shows inline errors.
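If you're inserting via the API yourself, a simple pre-check catches these rows before anything reaches Supabase. A minimal sketch, assuming you've already parsed the CSV into an array of objects (parsedRows) and that a 'price' column maps to a float4 column; both names are just examples:

```typescript
// Flag rows whose values won't parse into the target column's numeric type
// before sending anything to Supabase.
type RowError = { row: number; column: string; value: string };

function validateNumericColumn(
  rows: Record<string, string>[],
  column: string
): RowError[] {
  const errors: RowError[] = [];
  rows.forEach((row, i) => {
    const value = row[column];
    if (value !== '' && Number.isNaN(Number(value))) {
      errors.push({ row: i + 1, column, value });
    }
  });
  return errors;
}

const badRows = validateNumericColumn(parsedRows, 'price');
if (badRows.length > 0) {
  console.error('Type mismatches:', badRows);
}
```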
File too large to upload
The 100MB limit is enforced before upload starts.
Fix: Split your file into chunks under 100MB, or use ImportCSV which handles chunking automatically.
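If you want to stay in the dashboard, a small script can do the splitting while repeating the header row in every part. A rough sketch using Node's built-in modules; the 500,000-row chunk size is arbitrary, so tune it until each part lands under 100MB:

```typescript
import { createReadStream, createWriteStream } from 'node:fs';
import { createInterface } from 'node:readline';

// Split a large CSV into numbered part files, repeating the header row
// in every part so each file can be imported on its own.
// Note: this naive line splitter assumes no quoted field contains a newline.
async function splitCsv(path: string, rowsPerFile = 500_000): Promise<void> {
  const lines = createInterface({
    input: createReadStream(path),
    crlfDelay: Infinity,
  });

  let header: string | null = null;
  let out: ReturnType<typeof createWriteStream> | null = null;
  let part = 0;
  let count = 0;

  for await (const line of lines) {
    if (header === null) {
      header = line; // first line is the header
      continue;
    }
    if (count % rowsPerFile === 0) {
      out?.end();
      part += 1;
      out = createWriteStream(path.replace(/\.csv$/, `.part${part}.csv`));
      out.write(header + '\n');
    }
    out!.write(line + '\n');
    count += 1;
  }
  out?.end();
}

splitCsv('data.csv').catch(console.error);
```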
Timeout after 1 minute
The SQL Editor has a 1-minute timeout for queries, which affects large imports.
Fix: Use batch inserts with smaller chunk sizes (500-1000 rows), or use ImportCSV which splits uploads automatically.
Column names don't match
Supabase requires exact header matches. If your CSV has First Name and your table has first_name, the import fails.
Fix: Rename headers in your CSV, or use ImportCSV's column mapping to map headers to columns without modifying the source file.
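If you'd rather fix this in code than edit the file, remapping headers before insert is straightforward. A minimal sketch, where the header names and the parsedRows array are placeholders for your own data:

```typescript
// Translate CSV header names into the table's column names before inserting.
const headerToColumn: Record<string, string> = {
  'First Name': 'first_name',
  'Last Name': 'last_name',
  'Email Address': 'email',
};

function remapRow(row: Record<string, string>): Record<string, string> {
  const mapped: Record<string, string> = {};
  for (const [header, value] of Object.entries(row)) {
    // Fall back to the original header if no mapping is defined
    mapped[headerToColumn[header] ?? header] = value;
  }
  return mapped;
}

const remappedRows = parsedRows.map(remapRow);
```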
Get started
ImportCSV connects to your Supabase project in two clicks. Define your schema once, then give users an importer that handles files of any size.
Start free - no credit card required.
Wrap-up
CSV imports shouldn't slow you down. ImportCSV aims to fit into your workflow, whether you're building data import flows, handling customer uploads, or processing large datasets.
If that sounds like the kind of tooling you want to use, try ImportCSV.