Import CSV to Supabase: Complete React Tutorial

Supabase provides a PostgreSQL database with a REST API, making it a popular choice for React applications. When your users need to import data from CSV files, you have several options: the Supabase dashboard, direct PostgreSQL access, or programmatic import through the JavaScript client.
This tutorial covers the programmatic approach: building a React component that parses CSV files and inserts the data into Supabase. You will learn batch processing patterns that work within API limits, proper error handling, and TypeScript integration.
Prerequisites
- Supabase account and project
- Node.js 20+ (Supabase JS SDK dropped Node 18 support in v2.79.0)
- React 18+ application (Vite recommended)
- Basic understanding of React hooks
Architecture overview
The CSV import flow involves three stages:
- Parse: Read the CSV file and convert it to JavaScript objects using a library like Papa Parse
- Validate: Check that the data matches your database schema before attempting insertion
- Insert: Send the data to Supabase in batches to avoid rate limits
[CSV File] → [Papa Parse] → [Validation] → [Batch Insert] → [Supabase]
Each stage can fail independently, so the implementation needs error handling at every step.
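Condensed into code, the flow looks like this. The sketch below assumes the parseCSVFile, validateContacts, and insertContactsBatch helpers built in Steps 5 through 7; importContactsCSV itself is just an illustrative name:
import { parseCSVFile } from './lib/csv-parser'
import { validateContacts } from './lib/validators'
import { insertContactsBatch } from './lib/batch-insert'

export async function importContactsCSV(file: File) {
  // Stage 1: parse the raw file into plain objects
  const parsed = await parseCSVFile(file)
  if (parsed.errors.length > 0) {
    throw new Error(parsed.errors.map((e) => e.message).join(', '))
  }
  // Stage 2: validate rows against the contacts schema
  const { valid, invalid } = validateContacts(parsed.data)
  if (invalid.length > 0) console.warn(`${invalid.length} rows failed validation`)
  // Stage 3: insert the valid rows in batches
  return insertContactsBatch(valid)
}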
Step 1: Set up Supabase
First, create a table in your Supabase project. For this tutorial, we will use a contacts table.
In the Supabase dashboard, navigate to the SQL Editor and run:
CREATE TABLE contacts (
id bigint GENERATED ALWAYS AS IDENTITY PRIMARY KEY,
name text NOT NULL,
email text NOT NULL,
company text,
created_at timestamptz DEFAULT now()
);
-- Enable Row Level Security (recommended for production)
ALTER TABLE contacts ENABLE ROW LEVEL SECURITY;
-- Create a policy for authenticated users
CREATE POLICY "Users can insert contacts"
ON contacts FOR INSERT
TO authenticated
WITH CHECK (true);
CREATE POLICY "Users can view contacts"
ON contacts FOR SELECT
TO authenticated
USING (true);
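For reference, a CSV that matches this schema only needs the three user-supplied columns; id and created_at are filled in by the database, and company can be left empty since the column is nullable:
name,email,company
Ada Lovelace,ada@example.com,Analytical Engines
Grace Hopper,grace@example.com,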
Next, get your Supabase credentials from the project settings:
- Go to Settings > API
- Copy the Project URL
- Copy the anon/public key
Step 2: Install dependencies
npm install @supabase/supabase-js papaparse
npm install -D @types/papaparse
The @supabase/supabase-js package (version 2.90.1 as of this writing) is an isomorphic JavaScript SDK that works in browsers, Node.js, Deno, and Bun. Papa Parse handles CSV parsing with good performance and edge-case handling.
Step 3: Configure environment variables
Create a .env file in your project root (for Vite projects):
VITE_SUPABASE_URL=https://your-project.supabase.co
VITE_SUPABASE_ANON_KEY=your-anon-key
For Create React App, use the REACT_APP_ prefix instead of VITE_ and read the values from process.env rather than import.meta.env.
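Under Create React App, the client setup in the next step would read its values like this (a sketch; only the environment access changes):
// src/lib/supabase.ts (Create React App variant)
const supabaseUrl = process.env.REACT_APP_SUPABASE_URL
const supabaseAnonKey = process.env.REACT_APP_SUPABASE_ANON_KEY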
Step 4: Initialize the Supabase client
Create a Supabase client file:
// src/lib/supabase.ts
import { createClient } from '@supabase/supabase-js'
import type { Database } from './database.types'
const supabaseUrl = import.meta.env.VITE_SUPABASE_URL
const supabaseAnonKey = import.meta.env.VITE_SUPABASE_ANON_KEY
if (!supabaseUrl || !supabaseAnonKey) {
throw new Error('Missing Supabase environment variables')
}
export const supabase = createClient<Database>(supabaseUrl, supabaseAnonKey)
To generate TypeScript types for your database schema, run:
npx supabase gen types typescript --project-id your-project-id > src/lib/database.types.ts
This creates type definitions that provide autocomplete and type checking for your database queries.
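Once the client is typed, column names and value types are checked at compile time. A quick illustration, assuming the contacts table from Step 1 and an enclosing async function:
// TypeScript knows contacts has name, email, and company columns
const { error } = await supabase
  .from('contacts')
  .insert({ name: 'Ada Lovelace', email: 'ada@example.com' })

// A typo such as { emial: '...' } or a missing required column
// now fails at compile time instead of at runtime.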
Step 5: Build the CSV parser
Create a utility function to parse CSV files:
// src/lib/csv-parser.ts
import Papa from 'papaparse'
export interface ParsedCSV<T = Record<string, unknown>> {
data: T[]
errors: Papa.ParseError[]
meta: Papa.ParseMeta
}
export function parseCSVFile<T = Record<string, unknown>>(
file: File
): Promise<ParsedCSV<T>> {
return new Promise((resolve, reject) => {
Papa.parse<T>(file, {
header: true,
skipEmptyLines: true,
transformHeader: (header) => header.trim().toLowerCase(),
complete: (results) => {
resolve({
data: results.data,
errors: results.errors,
meta: results.meta,
})
},
error: (error) => {
reject(error)
},
})
})
}
The transformHeader option normalizes column names to lowercase, which helps when CSV headers do not exactly match your database columns.
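For example, parsing a file whose header row is capitalized still yields lowercase keys (a usage sketch; in practice the File comes from an <input type="file"> element, as in Step 8):
// CSV content:
//   Name,Email,Company
//   Ada Lovelace,ada@example.com,Analytical Engines
const parsed = await parseCSVFile(file)
console.log(parsed.data[0])
// => { name: 'Ada Lovelace', email: 'ada@example.com', company: 'Analytical Engines' }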
Step 6: Create the validation layer
Before inserting data, validate that required fields exist and data types are correct:
// src/lib/validators.ts
export interface ContactRow {
name: string
email: string
company?: string
}
export interface ValidationResult {
valid: ContactRow[]
invalid: Array<{
row: number
data: Record<string, unknown>
errors: string[]
}>
}
export function validateContacts(
data: Record<string, unknown>[]
): ValidationResult {
const valid: ContactRow[] = []
const invalid: ValidationResult['invalid'] = []
data.forEach((row, index) => {
const errors: string[] = []
// Check required fields
if (!row.name || typeof row.name !== 'string' || row.name.trim() === '') {
errors.push('Name is required')
}
if (!row.email || typeof row.email !== 'string') {
errors.push('Email is required')
} else if (!isValidEmail(row.email)) {
errors.push('Invalid email format')
}
if (errors.length > 0) {
invalid.push({ row: index + 1, data: row, errors })
} else {
valid.push({
name: String(row.name).trim(),
email: String(row.email).trim().toLowerCase(),
company: row.company ? String(row.company).trim() : undefined,
})
}
})
return { valid, invalid }
}
function isValidEmail(email: string): boolean {
return /^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(email)
}
Step 7: Implement batch insertion
Supabase can handle up to 1,000 inserts per second on the free tier, but sending thousands of rows in a single request can cause timeouts. Batch your inserts to avoid these issues:
// src/lib/batch-insert.ts
import { supabase } from './supabase'
import type { ContactRow } from './validators'
const BATCH_SIZE = 100
const DELAY_MS = 100
export interface InsertProgress {
total: number
completed: number
failed: number
errors: Array<{ batch: number; error: string }>
}
export async function insertContactsBatch(
contacts: ContactRow[],
onProgress?: (progress: InsertProgress) => void
): Promise<InsertProgress> {
const progress: InsertProgress = {
total: contacts.length,
completed: 0,
failed: 0,
errors: [],
}
const batches = Math.ceil(contacts.length / BATCH_SIZE)
for (let i = 0; i < batches; i++) {
const start = i * BATCH_SIZE
const end = Math.min(start + BATCH_SIZE, contacts.length)
const batch = contacts.slice(start, end)
const { error } = await supabase.from('contacts').insert(batch)
if (error) {
progress.failed += batch.length
progress.errors.push({
batch: i + 1,
error: error.message,
})
} else {
progress.completed += batch.length
}
onProgress?.(progress)
// Small delay between batches to avoid rate limiting
if (i < batches - 1) {
      await new Promise((resolve) => setTimeout(resolve, DELAY_MS))
}
}
return progress
}
The DELAY_MS pause (100ms by default) between batches helps avoid rate limits on free tier projects. Adjust it, along with BATCH_SIZE, based on your Supabase plan and insert performance requirements.
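A fixed delay is deliberately simple. If batches still fail intermittently under load, one option is to retry a failed batch with exponential backoff before counting its rows as failed. This is a sketch rather than part of the tutorial code, and the retry count and base delay are arbitrary choices:
import { supabase } from './supabase'
import type { ContactRow } from './validators'

// Retry a single batch with exponential backoff (200ms, 400ms, 800ms...).
// Returns null on success, or the last error after maxRetries attempts.
async function insertBatchWithRetry(batch: ContactRow[], maxRetries = 3) {
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    const { error } = await supabase.from('contacts').insert(batch)
    if (!error) return null
    if (attempt === maxRetries) return error
    await new Promise((resolve) => setTimeout(resolve, 200 * 2 ** attempt))
  }
  return null // unreachable; satisfies the compiler
}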
Step 8: Build the React component
Combine all the pieces into a complete import component:
// src/components/CSVImporter.tsx
import { useState, useCallback, ChangeEvent } from 'react'
import { parseCSVFile } from '../lib/csv-parser'
import { validateContacts, ValidationResult } from '../lib/validators'
import { insertContactsBatch, InsertProgress } from '../lib/batch-insert'
type ImportStage = 'idle' | 'parsing' | 'validating' | 'preview' | 'importing' | 'complete' | 'error'
interface ImportState {
stage: ImportStage
fileName: string | null
validationResult: ValidationResult | null
insertProgress: InsertProgress | null
error: string | null
}
export function CSVImporter() {
const [state, setState] = useState<ImportState>({
stage: 'idle',
fileName: null,
validationResult: null,
insertProgress: null,
error: null,
})
const handleFileSelect = useCallback(async (e: ChangeEvent<HTMLInputElement>) => {
const file = e.target.files?.[0]
if (!file) return
// Reset state
setState({
stage: 'parsing',
fileName: file.name,
validationResult: null,
insertProgress: null,
error: null,
})
try {
// Parse CSV
const parsed = await parseCSVFile(file)
if (parsed.errors.length > 0) {
setState((prev) => ({
...prev,
stage: 'error',
error: `CSV parsing errors: ${parsed.errors.map((e) => e.message).join(', ')}`,
}))
return
}
if (parsed.data.length === 0) {
setState((prev) => ({
...prev,
stage: 'error',
error: 'The CSV file is empty',
}))
return
}
// Validate data
setState((prev) => ({ ...prev, stage: 'validating' }))
const validationResult = validateContacts(parsed.data)
setState((prev) => ({
...prev,
stage: 'preview',
validationResult,
}))
} catch (err) {
setState((prev) => ({
...prev,
stage: 'error',
error: err instanceof Error ? err.message : 'Failed to parse CSV file',
}))
}
// Reset file input
e.target.value = ''
}, [])
const handleImport = useCallback(async () => {
if (!state.validationResult || state.validationResult.valid.length === 0) {
return
}
setState((prev) => ({ ...prev, stage: 'importing' }))
try {
const progress = await insertContactsBatch(
state.validationResult.valid,
(progress) => {
setState((prev) => ({ ...prev, insertProgress: progress }))
}
)
setState((prev) => ({
...prev,
stage: 'complete',
insertProgress: progress,
}))
} catch (err) {
setState((prev) => ({
...prev,
stage: 'error',
error: err instanceof Error ? err.message : 'Import failed',
}))
}
}, [state.validationResult])
const handleReset = useCallback(() => {
setState({
stage: 'idle',
fileName: null,
validationResult: null,
insertProgress: null,
error: null,
})
}, [])
return (
<div style={{ maxWidth: '600px', margin: '0 auto', padding: '24px' }}>
<h2>Import Contacts from CSV</h2>
{state.stage === 'idle' && (
<div
style={{
border: '2px dashed #ccc',
borderRadius: '8px',
padding: '40px',
textAlign: 'center',
}}
>
<input
type="file"
accept=".csv"
onChange={handleFileSelect}
style={{ display: 'none' }}
id="csv-input"
/>
<label htmlFor="csv-input" style={{ cursor: 'pointer' }}>
<p>Click to select a CSV file</p>
<p style={{ color: '#666', fontSize: '14px' }}>
Expected columns: name, email, company (optional)
</p>
</label>
</div>
)}
{(state.stage === 'parsing' || state.stage === 'validating') && (
<div style={{ textAlign: 'center', padding: '40px' }}>
<p>
{state.stage === 'parsing' ? 'Parsing CSV...' : 'Validating data...'}
</p>
</div>
)}
{state.stage === 'preview' && state.validationResult && (
<div>
<h3>Preview</h3>
<p>
File: <strong>{state.fileName}</strong>
</p>
<p style={{ color: 'green' }}>
Valid rows: {state.validationResult.valid.length}
</p>
{state.validationResult.invalid.length > 0 && (
<div style={{ marginTop: '16px' }}>
<p style={{ color: 'red' }}>
Invalid rows: {state.validationResult.invalid.length}
</p>
<details>
<summary>View errors</summary>
<ul style={{ fontSize: '14px' }}>
{state.validationResult.invalid.slice(0, 10).map((item) => (
<li key={item.row}>
Row {item.row}: {item.errors.join(', ')}
</li>
))}
{state.validationResult.invalid.length > 10 && (
<li>
...and {state.validationResult.invalid.length - 10} more
</li>
)}
</ul>
</details>
</div>
)}
<div style={{ marginTop: '24px', display: 'flex', gap: '12px' }}>
<button onClick={handleReset}>Cancel</button>
<button
onClick={handleImport}
disabled={state.validationResult.valid.length === 0}
style={{
backgroundColor: '#007bff',
color: 'white',
border: 'none',
padding: '8px 16px',
borderRadius: '4px',
cursor: 'pointer',
}}
>
Import {state.validationResult.valid.length} contacts
</button>
</div>
</div>
)}
{state.stage === 'importing' && state.insertProgress && (
<div style={{ textAlign: 'center', padding: '40px' }}>
<p>Importing...</p>
<progress
value={state.insertProgress.completed + state.insertProgress.failed}
max={state.insertProgress.total}
style={{ width: '100%' }}
/>
<p style={{ fontSize: '14px', color: '#666' }}>
{state.insertProgress.completed} of {state.insertProgress.total} rows
</p>
</div>
)}
{state.stage === 'complete' && state.insertProgress && (
<div style={{ textAlign: 'center', padding: '40px' }}>
<h3 style={{ color: 'green' }}>Import Complete</h3>
<p>Successfully imported: {state.insertProgress.completed}</p>
{state.insertProgress.failed > 0 && (
<p style={{ color: 'red' }}>Failed: {state.insertProgress.failed}</p>
)}
<button onClick={handleReset} style={{ marginTop: '16px' }}>
Import another file
</button>
</div>
)}
{state.stage === 'error' && (
<div style={{ textAlign: 'center', padding: '40px' }}>
<p style={{ color: 'red' }}>{state.error}</p>
<button onClick={handleReset} style={{ marginTop: '16px' }}>
Try again
</button>
</div>
)}
</div>
)
}
Complete working example
Here is the full file structure for a working implementation:
src/
├── components/
│ └── CSVImporter.tsx
├── lib/
│ ├── supabase.ts
│ ├── database.types.ts (generated)
│ ├── csv-parser.ts
│ ├── validators.ts
│ └── batch-insert.ts
└── App.tsx
And the App component:
// src/App.tsx
import { CSVImporter } from './components/CSVImporter'
export default function App() {
return (
<div>
<h1>CSV to Supabase Import</h1>
<CSVImporter />
</div>
)
}
Supabase dashboard import alternative
For one-time imports or datasets under 100MB, the Supabase dashboard provides a built-in CSV importer:
- Navigate to Table Editor in your Supabase dashboard
- Click "+ New table" or select an existing table
- Click "Insert" then "Import Data from CSV"
- Upload your CSV file
Limitations of dashboard import:
- 100MB file size limit
- No way to automate imports; each one is a manual operation in the dashboard
- No custom validation or transformation
Use the programmatic approach from this tutorial when you need:
- User-facing import functionality in your application
- Custom validation rules
- Data transformation before insertion
- Progress feedback for large imports
- Automated or recurring imports
Troubleshooting
Rate limiting on bulk inserts
If you encounter rate limit errors during large imports, reduce the batch size and increase the delay:
const BATCH_SIZE = 50 // Reduced from 100
const DELAY_MS = 200 // Increased from 100
The Supabase free tier supports up to 1,000 inserts per second. For larger imports, consider:
- Using the PostgreSQL COPY command via a direct database connection (see the sketch after this list)
- Upgrading to a paid tier for increased resources
- Temporarily disabling triggers during import
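For the COPY route, the import runs entirely inside PostgreSQL over a direct connection, bypassing the REST API and its limits. A sketch using psql's client-side \copy (the connection string placeholders come from your project's database settings):
-- Connect first, e.g.:
--   psql "postgresql://postgres:[password]@db.[project-ref].supabase.co:5432/postgres"
\copy contacts(name, email, company) FROM 'contacts.csv' WITH (FORMAT csv, HEADER true)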
Data type mismatches
CSV data is always strings. If your Supabase table has non-text columns (numbers, booleans, dates), convert the data before insertion:
// In your validator, convert types explicitly
const contact = {
name: String(row.name),
email: String(row.email),
age: row.age ? parseInt(String(row.age), 10) : null,
active: row.active === 'true' || row.active === '1',
created_at: row.created_at ? new Date(row.created_at).toISOString() : undefined,
}
Statement timeouts on large datasets
For imports of 10,000+ rows, you may hit statement timeout limits. The Supabase docs recommend:
- Increasing statement timeout for your session before import
- Disabling triggers temporarily:
ALTER TABLE contacts DISABLE TRIGGER ALL;
-- Perform import
ALTER TABLE contacts ENABLE TRIGGER ALL;
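For the first recommendation, raising the timeout for your session over a direct connection looks like this; the 10-minute value is only an example, so pick one that fits your dataset:
SET statement_timeout = '10min';
-- Run the import, then restore the default
RESET statement_timeout;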
TypeScript errors with Supabase types
If TypeScript complains about table names or column types, regenerate your database types:
npx supabase gen types typescript --project-id your-project-id > src/lib/database.types.ts
Make sure your Supabase client is typed with these definitions:
import type { Database } from './database.types'
const supabase = createClient<Database>(url, key)
Next steps
- Add drag-and-drop file upload for better UX
- Implement column mapping for CSVs with different header names
- Add duplicate detection before insertion (see the upsert sketch after this list)
- Create an export feature to download data as CSV
- Explore ImportCSV docs for a production-ready solution
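For duplicate detection specifically, one approach is to switch the batch insert to an upsert so PostgreSQL skips rows whose email already exists. This requires a unique constraint on email, which the Step 1 schema does not define, so treat this as a sketch:
// Prerequisite (SQL editor): a unique constraint for onConflict to target
//   ALTER TABLE contacts ADD CONSTRAINT contacts_email_key UNIQUE (email);
// Then, in batch-insert.ts, replace .insert(batch) with:
const { error } = await supabase
  .from('contacts')
  .upsert(batch, { onConflict: 'email', ignoreDuplicates: true })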
A simpler alternative
Building CSV import from scratch requires handling many edge cases: file format detection, column mapping, validation rules, error reporting, and progress tracking. The code in this tutorial covers the basics, but production applications typically need more.
ImportCSV is an open-source React component that provides:
- Drag-and-drop file upload with format detection (CSV, XLSX, XLS)
- Visual column mapping interface
- Built-in validation with custom rules
- Row-level error reporting
- Progress tracking for large files
import { CSVImporter } from '@importcsv/react'
<CSVImporter
  onComplete={async (data) => {
// Insert validated data into Supabase
await supabase.from('contacts').insert(data.rows)
}}
columns={[
{ label: 'Name', key: 'name', required: true },
{ label: 'Email', key: 'email', required: true },
{ label: 'Company', key: 'company' },
]}
/>
If you need CSV import functionality for a production application, ImportCSV handles the complexity so you can focus on your core product.
Wrap-up
CSV imports shouldn't slow you down. Whether you're building data import flows, handling customer uploads, or processing large datasets, ImportCSV aims to fit into your workflow.
If that sounds like the kind of tooling you want to use, try ImportCSV.