Firebase CSV Import: Upload Spreadsheets to Firestore

Importing CSV data into Firebase Firestore is a common requirement when migrating data, bulk-loading records, or building admin tools. While Firebase does not offer a native CSV import feature, you can build this functionality using batched writes with the Firebase SDK.
This guide covers two approaches: client-side imports using the Firebase JavaScript SDK v9+ and server-side imports using the Firebase Admin SDK. Both methods use Firestore's batched write feature to handle large datasets efficiently.
Prerequisites
- Firebase project with Firestore enabled
- Node.js 18+
- npm or yarn package manager
- For server-side: Firebase service account credentials
Architecture overview
CSV import to Firestore involves three steps:
- Parse CSV - Convert CSV text into JavaScript objects
- Transform data - Convert string values to appropriate Firestore types (numbers, booleans, dates)
- Batch write - Split documents into batches and commit to Firestore
Firestore's batched writes are atomic. All operations in a batch either succeed together or fail together, which helps maintain data consistency during imports.
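As a minimal sketch of the mechanism (the users collection and field values here are placeholders, not part of the import flow built below, and db is the export from the firebase.ts set up in Step 1):
// batch-sketch.ts (illustrative only)
import { writeBatch, doc, collection } from 'firebase/firestore';
import { db } from './firebase';

async function writeTwoDocsAtomically() {
  // Queue two set() operations in one batch; commit() applies both
  // atomically, so either both documents are written or neither is.
  const batch = writeBatch(db);
  batch.set(doc(collection(db, 'users')), { name: 'Ada', active: true });
  batch.set(doc(collection(db, 'users')), { name: 'Grace', active: false });
  await batch.commit();
}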
Firestore batch write limits
Before implementing, understand these Firestore limits:
| Limit | Value |
|---|---|
| Maximum API request size | 10 MiB |
| Maximum document size | 1 MiB |
| Transaction time limit | 270 seconds |
| Maximum field depth | 20 levels |
| Maximum subcollection depth | 100 levels |
According to Firebase documentation:
"A batched write with hundreds of documents might require many index updates and might exceed the limit on transaction size. In this case, reduce the number of documents per batch."
The recommended approach is to limit batches to 500 documents to stay well under these limits. A 1,200-row CSV, for example, is committed as three batches of 500, 500, and 200 documents.
Step 1: Set up Firebase
Create a Firebase project and enable Firestore in the Firebase Console. For client-side imports, add your Firebase configuration:
// firebase.ts
import { initializeApp } from 'firebase/app';
import { getFirestore } from 'firebase/firestore';
const firebaseConfig = {
apiKey: 'your-api-key',
authDomain: 'your-project.firebaseapp.com',
projectId: 'your-project-id',
storageBucket: 'your-project.appspot.com',
messagingSenderId: '123456789',
appId: 'your-app-id'
};
const app = initializeApp(firebaseConfig);
export const db = getFirestore(app);
For server-side imports using the Admin SDK, download your service account JSON from the Firebase Console (Project Settings > Service Accounts).
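If you prefer not to reference the key file path directly in code (an optional setup detail, not required by the rest of this guide), you can point the GOOGLE_APPLICATION_CREDENTIALS environment variable at the downloaded JSON and use application default credentials:
// admin-init.ts (hypothetical file name)
// Assumes GOOGLE_APPLICATION_CREDENTIALS points at your service account JSON.
import * as admin from 'firebase-admin';

admin.initializeApp({
  credential: admin.credential.applicationDefault(),
});

export const adminDb = admin.firestore();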
Step 2: Install dependencies
npm install firebase papaparse
npm install -D @types/papaparse
For server-side imports with the Admin SDK:
npm install firebase-admin papaparse
npm install -D @types/papaparse
Step 3: Parse CSV with type conversion
CSV files store all values as strings. Use PapaParse with dynamicTyping to convert numbers and booleans automatically:
// csv-parser.ts
import Papa from 'papaparse';
interface ParseResult<T> {
data: T[];
errors: Papa.ParseError[];
}
export function parseCSV<T>(csvString: string): ParseResult<T> {
const result = Papa.parse<T>(csvString, {
header: true,
dynamicTyping: true,
skipEmptyLines: true,
transformHeader: (header) => header.trim(),
});
return {
data: result.data,
errors: result.errors,
};
}
The dynamicTyping: true option converts:
- Numeric strings to numbers (e.g., "42" becomes 42)
- "true"/"false" to booleans
- Date strings and other text stay as strings (see Handling date fields below)
Step 4: Implement batch writes
Here is the core function for importing CSV data to Firestore using batched writes:
// firestore-import.ts
import { writeBatch, doc, collection, Firestore } from 'firebase/firestore';
interface ImportOptions {
collectionName: string;
idField?: string;
onProgress?: (imported: number, total: number) => void;
}
export async function importToFirestore(
db: Firestore,
data: Record<string, unknown>[],
options: ImportOptions
): Promise<{ success: number; failed: number }> {
const { collectionName, idField, onProgress } = options;
const BATCH_SIZE = 500;
let successCount = 0;
let failedCount = 0;
for (let i = 0; i < data.length; i += BATCH_SIZE) {
const batch = writeBatch(db);
const chunk = data.slice(i, i + BATCH_SIZE);
chunk.forEach((item) => {
let docRef;
if (idField && item[idField]) {
// Use specified field as document ID
docRef = doc(db, collectionName, String(item[idField]));
} else {
// Auto-generate document ID
docRef = doc(collection(db, collectionName));
}
batch.set(docRef, item);
});
try {
await batch.commit();
successCount += chunk.length;
} catch (error) {
console.error(`Batch failed at index ${i}:`, error);
failedCount += chunk.length;
}
if (onProgress) {
onProgress(successCount + failedCount, data.length);
}
}
return { success: successCount, failed: failedCount };
}
Step 5: Handle file upload
Create a React component that handles file selection, parsing, and import:
// CSVImporter.tsx
import { useState, useCallback } from 'react';
import { db } from './firebase';
import { parseCSV } from './csv-parser';
import { importToFirestore } from './firestore-import';
interface ImportStatus {
stage: 'idle' | 'parsing' | 'importing' | 'complete' | 'error';
progress: number;
total: number;
message: string;
}
export function CSVImporter() {
const [status, setStatus] = useState<ImportStatus>({
stage: 'idle',
progress: 0,
total: 0,
message: '',
});
const handleFileChange = useCallback(
async (event: React.ChangeEvent<HTMLInputElement>) => {
const file = event.target.files?.[0];
if (!file) return;
setStatus({ stage: 'parsing', progress: 0, total: 0, message: 'Parsing CSV...' });
const text = await file.text();
const { data, errors } = parseCSV<Record<string, unknown>>(text);
if (errors.length > 0) {
setStatus({
stage: 'error',
progress: 0,
total: 0,
message: `Parse errors: ${errors.map((e) => e.message).join(', ')}`,
});
return;
}
setStatus({
stage: 'importing',
progress: 0,
total: data.length,
message: `Importing ${data.length} records...`,
});
try {
const result = await importToFirestore(db, data, {
collectionName: 'imports',
onProgress: (imported, total) => {
setStatus((prev) => ({
...prev,
progress: imported,
message: `Imported ${imported} of ${total} records`,
}));
},
});
setStatus({
stage: 'complete',
progress: result.success,
total: data.length,
message: `Import complete: ${result.success} succeeded, ${result.failed} failed`,
});
} catch (error) {
setStatus({
stage: 'error',
progress: 0,
total: 0,
message: error instanceof Error ? error.message : 'Import failed',
});
}
},
[]
);
return (
<div>
<input type="file" accept=".csv" onChange={handleFileChange} disabled={status.stage === 'importing'} />
{status.stage !== 'idle' && (
<div>
<p>{status.message}</p>
{status.total > 0 && (
<progress value={status.progress} max={status.total} />
)}
</div>
)}
</div>
);
}
Complete working example
Here is a full implementation combining all the pieces with proper error handling:
// App.tsx
import { useState } from 'react';
import { initializeApp } from 'firebase/app';
import { getFirestore, writeBatch, doc, collection, Firestore } from 'firebase/firestore';
import Papa from 'papaparse';
// Initialize Firebase
const firebaseConfig = {
apiKey: process.env.REACT_APP_FIREBASE_API_KEY,
authDomain: process.env.REACT_APP_FIREBASE_AUTH_DOMAIN,
projectId: process.env.REACT_APP_FIREBASE_PROJECT_ID,
storageBucket: process.env.REACT_APP_FIREBASE_STORAGE_BUCKET,
messagingSenderId: process.env.REACT_APP_FIREBASE_MESSAGING_SENDER_ID,
appId: process.env.REACT_APP_FIREBASE_APP_ID,
};
const app = initializeApp(firebaseConfig);
const db = getFirestore(app);
// Types
interface ImportResult {
success: number;
failed: number;
errors: string[];
}
// CSV Parser with type conversion
function parseCSVData(csvText: string): Record<string, unknown>[] {
const result = Papa.parse(csvText, {
header: true,
dynamicTyping: true,
skipEmptyLines: true,
transformHeader: (header) => header.trim().toLowerCase().replace(/\s+/g, '_'),
});
if (result.errors.length > 0) {
throw new Error(`CSV parse error: ${result.errors[0].message}`);
}
return result.data as Record<string, unknown>[];
}
// Batch import function
async function batchImport(
db: Firestore,
data: Record<string, unknown>[],
collectionName: string,
onProgress?: (current: number, total: number) => void
): Promise<ImportResult> {
const BATCH_SIZE = 500;
const result: ImportResult = { success: 0, failed: 0, errors: [] };
for (let i = 0; i < data.length; i += BATCH_SIZE) {
const batch = writeBatch(db);
const chunk = data.slice(i, i + BATCH_SIZE);
chunk.forEach((record) => {
const docRef = doc(collection(db, collectionName));
batch.set(docRef, {
...record,
_importedAt: new Date().toISOString(),
});
});
try {
await batch.commit();
result.success += chunk.length;
} catch (error) {
result.failed += chunk.length;
result.errors.push(
`Batch ${Math.floor(i / BATCH_SIZE) + 1} failed: ${
error instanceof Error ? error.message : 'Unknown error'
}`
);
}
onProgress?.(Math.min(i + BATCH_SIZE, data.length), data.length);
}
return result;
}
// Main component
export default function FirebaseCSVImporter() {
const [collectionName, setCollectionName] = useState('products');
const [importing, setImporting] = useState(false);
const [progress, setProgress] = useState({ current: 0, total: 0 });
const [result, setResult] = useState<ImportResult | null>(null);
const [error, setError] = useState<string | null>(null);
async function handleImport(event: React.ChangeEvent<HTMLInputElement>) {
const file = event.target.files?.[0];
if (!file) return;
setImporting(true);
setError(null);
setResult(null);
try {
const csvText = await file.text();
const data = parseCSVData(csvText);
if (data.length === 0) {
throw new Error('CSV file is empty or has no valid data rows');
}
setProgress({ current: 0, total: data.length });
const importResult = await batchImport(db, data, collectionName, (current, total) => {
setProgress({ current, total });
});
setResult(importResult);
} catch (err) {
setError(err instanceof Error ? err.message : 'Import failed');
} finally {
setImporting(false);
}
}
return (
<div style={{ padding: '2rem', maxWidth: '600px' }}>
<h1>Firebase CSV Import</h1>
<div style={{ marginBottom: '1rem' }}>
<label htmlFor="collection">Collection name:</label>
<input
id="collection"
type="text"
value={collectionName}
onChange={(e) => setCollectionName(e.target.value)}
disabled={importing}
style={{ marginLeft: '0.5rem', padding: '0.25rem' }}
/>
</div>
<input
type="file"
accept=".csv"
onChange={handleImport}
disabled={importing || !collectionName}
/>
{importing && (
<div style={{ marginTop: '1rem' }}>
<p>
Importing... {progress.current} / {progress.total}
</p>
<progress value={progress.current} max={progress.total} style={{ width: '100%' }} />
</div>
)}
{result && (
<div style={{ marginTop: '1rem', padding: '1rem', background: '#f0f0f0' }}>
<p>Import complete:</p>
<ul>
<li>Success: {result.success}</li>
<li>Failed: {result.failed}</li>
</ul>
{result.errors.length > 0 && (
<details>
<summary>Errors ({result.errors.length})</summary>
<ul>
{result.errors.map((err, i) => (
<li key={i}>{err}</li>
))}
</ul>
</details>
)}
</div>
)}
{error && (
<div style={{ marginTop: '1rem', padding: '1rem', background: '#fee', color: '#c00' }}>
{error}
</div>
)}
</div>
);
}
Server-side import with Admin SDK
For server-side imports (useful for larger files or automated pipelines), use the Firebase Admin SDK:
// server-import.ts
import * as admin from 'firebase-admin';
import * as fs from 'fs';
import Papa from 'papaparse';
// Initialize Admin SDK
admin.initializeApp({
credential: admin.credential.cert('./service-account.json'),
});
const db = admin.firestore();
async function importCSVFromFile(
filePath: string,
collectionName: string
): Promise<void> {
const csvText = fs.readFileSync(filePath, 'utf-8');
const { data, errors } = Papa.parse<Record<string, unknown>>(csvText, {
header: true,
dynamicTyping: true,
skipEmptyLines: true,
});
if (errors.length > 0) {
throw new Error(`Parse error: ${errors[0].message}`);
}
const BATCH_SIZE = 500;
let imported = 0;
for (let i = 0; i < data.length; i += BATCH_SIZE) {
const batch = db.batch();
const chunk = data.slice(i, i + BATCH_SIZE);
chunk.forEach((record: Record<string, unknown>) => {
const docRef = db.collection(collectionName).doc();
batch.set(docRef, {
...record,
_importedAt: admin.firestore.FieldValue.serverTimestamp(),
});
});
await batch.commit();
imported += chunk.length;
console.log(`Imported ${imported} of ${data.length} documents`);
}
console.log('Import complete');
}
// Usage
importCSVFromFile('./data/products.csv', 'products')
.then(() => process.exit(0))
.catch((err) => {
console.error(err);
process.exit(1);
});
Troubleshooting
PERMISSION_DENIED error
This error typically indicates a Firestore security rules issue or invalid credentials.
For client-side imports:
- Check your Firestore security rules allow write access
- Ensure the user is authenticated if rules require it
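If your rules require authentication, a quick pre-flight check on the client avoids a long series of failed batch commits (a sketch assuming Firebase Authentication is already set up in your app):
// auth-check.ts (illustrative)
import { getAuth } from 'firebase/auth';

// Fail fast with a clear message instead of letting every
// batch.commit() return PERMISSION_DENIED.
export function assertSignedIn() {
  const auth = getAuth();
  if (!auth.currentUser) {
    throw new Error('Sign in before importing CSV data.');
  }
}
Call assertSignedIn() before invoking importToFirestore.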
For server-side imports:
- Verify the service account JSON file path is correct
- Confirm the service account has appropriate Firestore access roles
RESOURCE_EXHAUSTED error
You are hitting Firestore rate limits. Add delays between batches:
// Assumes the same writeBatch, doc, and collection imports used in firestore-import.ts
async function importWithDelay(
  db: Firestore,
  data: Record<string, unknown>[],
  collectionName: string
) {
  const BATCH_SIZE = 500;
  const DELAY_MS = 1000; // 1 second between batches

  for (let i = 0; i < data.length; i += BATCH_SIZE) {
    const batch = writeBatch(db);
    const chunk = data.slice(i, i + BATCH_SIZE);
    chunk.forEach((item) => {
      batch.set(doc(collection(db, collectionName)), item);
    });
    await batch.commit();

    // Pause before the next batch, but not after the last one
    if (i + BATCH_SIZE < data.length) {
      await new Promise((resolve) => setTimeout(resolve, DELAY_MS));
    }
  }
}
INVALID_ARGUMENT error
Document data is invalid. Common causes:
- Document exceeds 1 MiB size limit
- Field names contain invalid characters
- Nested objects exceed 20 levels of depth
Add validation before import:
function validateDocument(doc: Record<string, unknown>): boolean {
  // Approximate the document size by its JSON byte length and keep some
  // headroom under the 1 MiB (1,048,576-byte) limit.
  const jsonSize = new TextEncoder().encode(JSON.stringify(doc)).length;
  return jsonSize < 1_000_000;
}
Handling date fields
CSV dates arrive as strings. Convert them to JavaScript Date objects before import; Firestore stores Date values as Timestamp fields:
function convertDates(
data: Record<string, unknown>[],
dateFields: string[]
): Record<string, unknown>[] {
return data.map((row) => {
const converted = { ...row };
dateFields.forEach((field) => {
if (converted[field] && typeof converted[field] === 'string') {
const date = new Date(converted[field] as string);
if (!isNaN(date.getTime())) {
converted[field] = date;
}
}
});
return converted;
});
}
Simplify with ImportCSV
Building a CSV import feature from scratch requires handling parsing, validation, type conversion, batch management, and error handling. ImportCSV provides a pre-built React component that handles these concerns:
- Automatic data type detection for numbers, booleans, and dates
- Built-in validation with custom rules
- Progress tracking for large files
- Drag-and-drop file upload interface
- Column mapping for matching CSV headers to your schema
This lets you focus on your Firestore integration logic rather than building CSV handling from scratch.
Next steps
- Read the Firestore documentation on batched writes
- Add data validation before import using a schema library like Zod (see the sketch below)
- Implement duplicate detection based on unique fields
- Explore ImportCSV documentation
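As a starting point for the Zod suggestion above, here is a minimal sketch that validates parsed rows before they reach the batch writer (the product schema and field names are assumptions, not part of this guide's data):
// validate-rows.ts (illustrative)
import { z } from 'zod';

// Hypothetical schema for a products CSV; adjust fields to match your data.
const productSchema = z.object({
  name: z.string().min(1),
  price: z.number().nonnegative(),
  in_stock: z.boolean(),
});

export function validateRows(rows: Record<string, unknown>[]) {
  const valid: z.infer<typeof productSchema>[] = [];
  const invalid: { row: number; issues: string }[] = [];

  rows.forEach((row, index) => {
    const result = productSchema.safeParse(row);
    if (result.success) {
      valid.push(result.data);
    } else {
      invalid.push({ row: index + 1, issues: result.error.message });
    }
  });

  return { valid, invalid };
}
Only the valid rows would then be passed to importToFirestore.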
Wrap-up
CSV imports shouldn't slow you down. ImportCSV is built to fit into your workflow, whether you're building data import flows, handling customer uploads, or processing large datasets.
If that sounds like the kind of tooling you want to use, try ImportCSV.