How to add CSV import to your React app

Your users have data in spreadsheets. They need to get it into your app. Building CSV import from scratch sounds straightforward until you hit encoding issues, malformed files, and the realization that your users' column names never match your database schema.
This tutorial covers three approaches to CSV import in React, from low-level parsing to full-featured UI components. Pick the one that matches your needs.
Prerequisites
- Node.js 18+
- React 18+ or Next.js 14+
- Basic TypeScript knowledge
What you'll build
A CSV import component that:
- Accepts file uploads via click or drag-drop
- Parses CSV data with proper encoding handling
- Validates required fields
- Maps user columns to your expected schema
Approach 1: PapaParse (full control)
PapaParse is the most popular CSV parsing library for JavaScript, with 5.4 million weekly downloads. Use this when you want complete control over the UI and parsing logic.
Installation
npm install papaparse
npm install --save-dev @types/papaparse
Basic file input component
'use client'; // Required for Next.js App Router
import { useState, type ChangeEvent } from 'react';
import Papa from 'papaparse';
interface ParsedData {
[key: string]: string;
}
export function CSVUploader() {
const [data, setData] = useState<ParsedData[]>([]);
const [error, setError] = useState<string | null>(null);
const handleFileChange = (e: ChangeEvent<HTMLInputElement>) => {
const file = e.target.files?.[0];
if (!file) return;
// Validate file type
if (!file.name.endsWith('.csv')) {
setError('Please upload a CSV file');
return;
}
Papa.parse<ParsedData>(file, {
header: true,
encoding: 'UTF-8',
complete: (results) => {
if (results.data.length === 0) {
setError('The file appears to be empty');
return;
}
setData(results.data);
setError(null);
},
error: (err) => {
setError(`Parse error: ${err.message}`);
},
});
};
return (
<div>
<input
type="file"
accept=".csv"
onChange={handleFileChange}
/>
{error && <p style={{ color: 'red' }}>{error}</p>}
{data.length > 0 && (
<p>Loaded {data.length} rows</p>
)}
</div>
);
}
The header: true option treats the first row as column names, returning an array of objects instead of arrays. The encoding: 'UTF-8' setting handles special characters like umlauts and accents.
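To make the output shape concrete, here is a simplified sketch of the transformation that header: true performs (illustrative only, not PapaParse's actual implementation):

```typescript
// Illustrative sketch: turn raw row arrays into objects keyed by the
// header row, the way header: true does conceptually.
type Row = Record<string, string>;

function rowsToObjects(rows: string[][]): Row[] {
  const [header, ...body] = rows;
  return body.map((cells) =>
    // Pair each header name with the cell in the same position
    Object.fromEntries(header.map((col, i) => [col, cells[i] ?? '']))
  );
}

const parsed = rowsToObjects([
  ['name', 'email'],
  ['Ada', 'ada@example.com'],
]);
// parsed[0] is { name: 'Ada', email: 'ada@example.com' }
```

With header: false (the default), you would get the raw arrays back and have to track column positions yourself.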
Displaying parsed data
'use client';
import { useState, type ChangeEvent } from 'react';
import Papa from 'papaparse';
interface ParsedData {
[key: string]: string;
}
export function CSVTable() {
const [data, setData] = useState<ParsedData[]>([]);
const [columns, setColumns] = useState<string[]>([]);
const handleFileChange = (e: ChangeEvent<HTMLInputElement>) => {
const file = e.target.files?.[0];
if (!file) return;
Papa.parse<ParsedData>(file, {
header: true,
complete: (results) => {
if (results.data.length === 0) return;
// Extract column names from first row
const cols = Object.keys(results.data[0]);
setColumns(cols);
setData(results.data);
},
});
};
return (
<div>
<input type="file" accept=".csv" onChange={handleFileChange} />
{data.length > 0 && (
<table>
<thead>
<tr>
{columns.map((col) => (
<th key={col}>{col}</th>
))}
</tr>
</thead>
<tbody>
{data.map((row, i) => (
<tr key={i}>
{columns.map((col) => (
<td key={col}>{row[col]}</td>
))}
</tr>
))}
</tbody>
</table>
)}
</div>
);
}
When to use PapaParse directly
- You need full control over the UI
- You're building a custom workflow that doesn't fit pre-built components
- You need server-side parsing (PapaParse works in Node.js)
- You want minimal dependencies
Approach 2: react-papaparse (React hooks)
react-papaparse wraps PapaParse with React-specific APIs like hooks and render props. It has 161K weekly downloads and includes built-in TypeScript support.
Installation
npm install react-papaparse
Hook-based uploader with drag-drop
'use client';
import { useCSVReader } from 'react-papaparse';
interface CSVResult {
data: string[][];
errors: Array<{ message: string }>;
meta: { delimiter: string };
}
export function DragDropUploader() {
const { CSVReader } = useCSVReader();
const handleUpload = (results: CSVResult) => {
console.log('Parsed data:', results.data);
console.log('Delimiter detected:', results.meta.delimiter);
if (results.errors.length > 0) {
console.error('Errors:', results.errors);
}
};
return (
<CSVReader onUploadAccepted={handleUpload}>
{({
getRootProps,
acceptedFile,
ProgressBar,
getRemoveFileProps,
}: {
getRootProps: () => Record<string, unknown>;
acceptedFile: File | null;
ProgressBar: React.ComponentType;
getRemoveFileProps: () => Record<string, unknown>;
}) => (
<div
{...getRootProps()}
style={{
border: '2px dashed #ccc',
padding: '2rem',
textAlign: 'center',
cursor: 'pointer',
}}
>
{acceptedFile ? (
<div>
<p>{acceptedFile.name}</p>
<ProgressBar />
<button {...getRemoveFileProps()}>Remove</button>
</div>
) : (
<p>Drop a CSV file here, or click to select</p>
)}
</div>
)}
</CSVReader>
);
}
Important: react-papaparse and SSR
react-papaparse only works in the browser. If you're using Next.js App Router, you must add the 'use client' directive at the top of your component file. For Pages Router, use dynamic import:
import dynamic from 'next/dynamic';
const DragDropUploader = dynamic(
() => import('./DragDropUploader').then((mod) => mod.DragDropUploader),
{ ssr: false }
);
When to use react-papaparse
- You want drag-drop functionality without building it yourself
- You prefer React hooks over imperative APIs
- You're building a client-side only application
- You want built-in progress indicators
Approach 3: react-csv-importer (full UI with column mapping)
react-csv-importer provides a complete import wizard with column mapping UI. Users can match their CSV columns to your expected fields without you writing any mapping code. It has 15K weekly downloads and is compatible with React 18+.
Installation
npm install react-csv-importer
Basic usage with field definitions
'use client';
import { Importer, ImporterField } from 'react-csv-importer';
import 'react-csv-importer/dist/index.css';
interface ContactRow {
email: string;
name: string;
phone?: string;
}
export function ContactImporter() {
const handleData = async (rows: ContactRow[]) => {
// Process rows in batches (the library handles chunking)
for (const row of rows) {
console.log('Processing:', row);
// await saveToDatabase(row);
}
};
return (
<Importer<ContactRow>
dataHandler={handleData}
onComplete={() => {
console.log('Import complete');
}}
onClose={() => {
console.log('Wizard closed');
}}
>
<ImporterField name="email" label="Email Address" />
<ImporterField name="name" label="Full Name" />
<ImporterField name="phone" label="Phone Number" optional />
</Importer>
);
}
The ImporterField components define your expected schema. The library shows users a UI to match their CSV columns (which might be named "E-mail", "email_address", or "Contact Email") to your fields.
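If you're curious how that kind of matching can work under the hood, here is a hypothetical sketch of header normalization for auto-suggesting matches. The names normalizeHeader and suggestField are illustrative; they are not part of react-csv-importer's API:

```typescript
// Strip case, punctuation, and spaces so "E-mail" and "email" compare equal
function normalizeHeader(header: string): string {
  return header.toLowerCase().replace(/[^a-z0-9]/g, '');
}

// Suggest the first field whose normalized name equals, or is contained
// in, the normalized CSV column name; null if nothing plausible matches
function suggestField(csvColumn: string, fields: string[]): string | null {
  const normalized = normalizeHeader(csvColumn);
  return (
    fields.find(
      (f) =>
        normalizeHeader(f) === normalized ||
        normalized.includes(normalizeHeader(f))
    ) ?? null
  );
}

suggestField('E-mail', ['email', 'name', 'phone']); // 'email'
suggestField('Contact Email', ['email', 'name', 'phone']); // 'email'
```

Real mapping UIs typically layer fuzzier matching on top of this, but exact-after-normalization plus substring checks already covers the common variants.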
Handling large files
react-csv-importer processes files in chunks, calling your dataHandler multiple times with batches of rows. This prevents memory issues with large files:
<Importer<ContactRow>
chunkSize={1000} // Process 1000 rows at a time
dataHandler={async (rows, { startIndex }) => {
console.log(`Processing rows ${startIndex} to ${startIndex + rows.length}`);
await batchInsert(rows);
}}
>
{/* fields */}
</Importer>
When to use react-csv-importer
- Your users have CSVs with unpredictable column names
- You need column mapping without building the UI yourself
- You want a step-by-step wizard experience
- You're importing to a database with a fixed schema
Handling large files with streaming
When parsing files over a few megabytes, loading everything into memory can crash the browser tab. PapaParse supports streaming with the step callback:
'use client';
import { useState, type ChangeEvent } from 'react';
import Papa from 'papaparse';
interface RowData {
[key: string]: string;
}
export function LargeFileUploader() {
const [progress, setProgress] = useState(0);
const [rowCount, setRowCount] = useState(0);
const handleFileChange = (e: ChangeEvent<HTMLInputElement>) => {
const file = e.target.files?.[0];
if (!file) return;
let processedRows = 0;
Papa.parse<RowData>(file, {
header: true,
step: (row, parser) => {
processedRows++;
// Update progress every 100 rows to avoid UI lag.
// row.meta.cursor is the parser's current byte offset in the file,
// so comparing it to file.size gives a real progress percentage.
if (processedRows % 100 === 0) {
const pct = Math.round((row.meta.cursor / file.size) * 100);
setProgress(Math.min(pct, 99));
setRowCount(processedRows);
}
// Process the row
console.log('Row:', row.data);
// You can pause/resume if needed
// parser.pause();
// parser.resume();
},
complete: () => {
setProgress(100);
setRowCount(processedRows);
console.log(`Finished processing ${processedRows} rows`);
},
error: (err) => {
console.error('Parse error:', err);
},
});
};
return (
<div>
<input type="file" accept=".csv" onChange={handleFileChange} />
{progress > 0 && (
<div>
<progress value={progress} max={100} />
<p>{rowCount} rows processed ({progress}%)</p>
</div>
)}
</div>
);
}
The step callback receives one row at a time, keeping memory usage constant regardless of file size.
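When each row needs to reach your API, you usually don't want one request per row. A common pattern is to buffer rows inside the step callback and flush them in batches. Here is a dependency-free sketch; createBatcher and the batch size of 2 are illustrative choices, not PapaParse APIs:

```typescript
// Buffer items and hand them to onFlush in groups of batchSize
function createBatcher<T>(batchSize: number, onFlush: (batch: T[]) => void) {
  let buffer: T[] = [];
  return {
    push(item: T) {
      buffer.push(item);
      if (buffer.length >= batchSize) this.flush();
    },
    flush() {
      // Flush whatever remains, even a partial batch
      if (buffer.length > 0) {
        onFlush(buffer);
        buffer = [];
      }
    },
  };
}

// Inside Papa.parse you would call batcher.push(row.data) in `step`
// and batcher.flush() in `complete` to send the final partial batch.
const batches: number[][] = [];
const batcher = createBatcher<number>(2, (b) => batches.push(b));
[1, 2, 3].forEach((n) => batcher.push(n));
batcher.flush();
// batches is [[1, 2], [3]]
```

This keeps memory bounded by the batch size while cutting the number of network round-trips.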
Validation and error handling
Parsed data needs validation before saving. Here's a pattern for validating required fields and formats:
interface ValidationError {
row: number;
field: string;
message: string;
}
interface ContactRow {
email: string;
name: string;
phone?: string;
}
function validateRow(row: ContactRow, index: number): ValidationError[] {
const errors: ValidationError[] = [];
// Required field check
if (!row.email || row.email.trim() === '') {
errors.push({
row: index + 1,
field: 'email',
message: 'Email is required',
});
}
// Email format validation
const emailRegex = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;
if (row.email && !emailRegex.test(row.email)) {
errors.push({
row: index + 1,
field: 'email',
message: 'Invalid email format',
});
}
// Required field check
if (!row.name || row.name.trim() === '') {
errors.push({
row: index + 1,
field: 'name',
message: 'Name is required',
});
}
return errors;
}
function validateCSV(data: ContactRow[]): ValidationError[] {
return data.flatMap((row, index) => validateRow(row, index));
}
Usage after parsing:
Papa.parse<ContactRow>(file, {
header: true,
complete: (results) => {
const errors = validateCSV(results.data);
if (errors.length > 0) {
console.log('Validation errors:', errors);
// Show errors to user before proceeding
return;
}
// All rows valid, proceed with import
saveData(results.data);
},
});
Common pitfalls
Encoding issues with special characters
European CSVs often contain characters like umlauts (ä, ö, ü) or accents (é, ñ). These can appear corrupted if the encoding is wrong.
Papa.parse(file, {
encoding: 'UTF-8', // Explicitly set encoding
// ...
});
If files still appear corrupted, the source file might be in a different encoding like ISO-8859-1. Consider offering encoding selection to users or using a library like chardet to auto-detect.
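A dependency-free fallback (a heuristic sketch, not full charset detection) is to try strict UTF-8 first and re-decode as ISO-8859-1 only when the bytes aren't valid UTF-8, using the built-in TextDecoder:

```typescript
// Decode CSV bytes: strict UTF-8 first, ISO-8859-1 (Latin-1) as fallback
function decodeCsvBytes(bytes: Uint8Array): string {
  try {
    // fatal: true makes invalid UTF-8 throw instead of emitting U+FFFD
    return new TextDecoder('utf-8', { fatal: true }).decode(bytes);
  } catch {
    return new TextDecoder('iso-8859-1').decode(bytes);
  }
}

// 0xFC is 'ü' in ISO-8859-1 but not a valid standalone UTF-8 sequence
decodeCsvBytes(new Uint8Array([0x4d, 0xfc, 0x6c, 0x6c, 0x65, 0x72]));
// → 'Müller'
```

To feed this into PapaParse, read the upload with file.arrayBuffer(), decode the bytes, and pass the resulting string to Papa.parse instead of the File object.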
Delimiter detection failures
Not all CSVs use commas. European files often use semicolons, and tab-separated files (TSV) are common. PapaParse auto-detects delimiters, but you can be explicit:
Papa.parse(file, {
delimiter: '', // Empty string triggers auto-detection
delimitersToGuess: [',', ';', '\t', '|'],
// ...
});
Empty file handling
Always check for empty results before processing:
Papa.parse(file, {
header: true,
complete: (results) => {
if (!results.data || results.data.length === 0) {
setError('The file appears to be empty');
return;
}
// Proceed with data
},
});
SSR and hydration errors
If you see errors like "window is not defined" or hydration mismatches in Next.js, your CSV component is running on the server. Add 'use client' directive for App Router or use dynamic imports with ssr: false for Pages Router.
Complete example
Here's a full working component that combines file upload, parsing, validation, and data display:
'use client';
import { useState, type ChangeEvent } from 'react';
import Papa from 'papaparse';
interface ContactRow {
email: string;
name: string;
company?: string;
}
interface ValidationError {
row: number;
field: string;
message: string;
}
type ImportStatus = 'idle' | 'parsing' | 'validating' | 'ready' | 'error';
export function ContactCSVImporter() {
const [data, setData] = useState<ContactRow[]>([]);
const [errors, setErrors] = useState<ValidationError[]>([]);
const [status, setStatus] = useState<ImportStatus>('idle');
const [fileName, setFileName] = useState<string>('');
const validateRow = (row: ContactRow, index: number): ValidationError[] => {
const rowErrors: ValidationError[] = [];
const emailRegex = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;
if (!row.email?.trim()) {
rowErrors.push({ row: index + 1, field: 'email', message: 'Required' });
} else if (!emailRegex.test(row.email)) {
rowErrors.push({ row: index + 1, field: 'email', message: 'Invalid format' });
}
if (!row.name?.trim()) {
rowErrors.push({ row: index + 1, field: 'name', message: 'Required' });
}
return rowErrors;
};
const handleFileChange = (e: ChangeEvent<HTMLInputElement>) => {
const file = e.target.files?.[0];
if (!file) return;
setFileName(file.name);
setStatus('parsing');
setErrors([]);
Papa.parse<ContactRow>(file, {
header: true,
encoding: 'UTF-8',
skipEmptyLines: true,
complete: (results) => {
if (results.data.length === 0) {
setStatus('error');
setErrors([{ row: 0, field: 'file', message: 'File is empty' }]);
return;
}
setStatus('validating');
const validationErrors = results.data.flatMap((row, i) =>
validateRow(row, i)
);
if (validationErrors.length > 0) {
setErrors(validationErrors);
setStatus('error');
} else {
setData(results.data);
setStatus('ready');
}
},
error: (err) => {
setStatus('error');
setErrors([{ row: 0, field: 'file', message: err.message }]);
},
});
};
const handleImport = () => {
console.log('Importing', data.length, 'contacts');
// Send to your API
};
return (
<div style={{ maxWidth: '800px', margin: '0 auto', padding: '2rem' }}>
<h2>Import Contacts</h2>
<div style={{ marginBottom: '1rem' }}>
<input
type="file"
accept=".csv"
onChange={handleFileChange}
style={{ marginBottom: '0.5rem' }}
/>
<p style={{ fontSize: '0.875rem', color: '#666' }}>
Upload a CSV with email, name, and company columns
</p>
</div>
{status === 'parsing' && <p>Parsing file...</p>}
{status === 'validating' && <p>Validating data...</p>}
{errors.length > 0 && (
<div style={{
background: '#fee',
border: '1px solid #fcc',
padding: '1rem',
marginBottom: '1rem'
}}>
<strong>Errors found:</strong>
<ul>
{errors.slice(0, 10).map((err, i) => (
<li key={i}>
Row {err.row}: {err.field} - {err.message}
</li>
))}
{errors.length > 10 && (
<li>...and {errors.length - 10} more errors</li>
)}
</ul>
</div>
)}
{status === 'ready' && (
<div>
<p style={{ color: 'green' }}>
{data.length} contacts ready to import from {fileName}
</p>
<table style={{ width: '100%', borderCollapse: 'collapse' }}>
<thead>
<tr style={{ background: '#f5f5f5' }}>
<th style={{ padding: '0.5rem', textAlign: 'left' }}>Email</th>
<th style={{ padding: '0.5rem', textAlign: 'left' }}>Name</th>
<th style={{ padding: '0.5rem', textAlign: 'left' }}>Company</th>
</tr>
</thead>
<tbody>
{data.slice(0, 5).map((row, i) => (
<tr key={i} style={{ borderBottom: '1px solid #eee' }}>
<td style={{ padding: '0.5rem' }}>{row.email}</td>
<td style={{ padding: '0.5rem' }}>{row.name}</td>
<td style={{ padding: '0.5rem' }}>{row.company || '-'}</td>
</tr>
))}
</tbody>
</table>
{data.length > 5 && (
<p style={{ color: '#666' }}>Showing first 5 of {data.length} rows</p>
)}
<button
onClick={handleImport}
style={{
marginTop: '1rem',
padding: '0.75rem 1.5rem',
background: '#0070f3',
color: 'white',
border: 'none',
borderRadius: '4px',
cursor: 'pointer',
}}
>
Import {data.length} Contacts
</button>
</div>
)}
</div>
);
}
Managed alternative: ImportCSV
Building CSV import from scratch gives you full control, but it also means maintaining validation logic, handling edge cases, and building column mapping UI yourself.
If you want a managed solution, ImportCSV handles column mapping, validation, and error handling out of the box:
import { CSVImporter } from '@importcsv/react';
<CSVImporter
onComplete={(data) => {
console.log('Imported:', data);
}}
/>
Summary
Three approaches to CSV import in React:
- PapaParse: Full control, works server-side, requires building your own UI
- react-papaparse: React hooks with drag-drop, browser-only
- react-csv-importer: Complete UI with column mapping, best for complex imports
Choose based on how much control you need versus how much UI you want pre-built. For basic uploads with predictable column names, PapaParse is enough. For user-facing imports where column names vary, react-csv-importer or a managed solution saves significant development time.
Wrap-up
CSV imports shouldn't slow you down. ImportCSV is built to fit into your workflow, whether you're building data import flows, handling customer uploads, or processing large datasets.
If that sounds like the kind of tooling you want to use, try ImportCSV.