January 11, 2026

Airtable-style CSV import for your React app

Learn three methods to import CSV data into Airtable: native import, CSV Import Extension, and the Sync API. Includes complete code examples and rate limit handling.



Importing CSV data into Airtable is straightforward through their UI, but building a programmatic CSV import for your React app requires understanding Airtable's API limits, batching requirements, and field type constraints. This guide covers all three methods available for CSV import and shows you how to build a robust import feature.

Prerequisites

  • Airtable account (Free tier works for testing, paid plans required for some features)
  • Node.js 18+
  • A Personal Access Token from Airtable with data.records:write scope

Three methods for Airtable CSV import

Airtable provides three distinct ways to import CSV data, each with different limits and use cases.

Method 1: Native CSV import (all plans)

The built-in import feature is available on all plan types through the web browser, Mac app, or Windows app. According to Airtable's documentation:

"The limit for the native import feature is 5 MB for both CSVs and Excel files."

| Requirement | Details |
| --- | --- |
| File Size Limit | 5 MB |
| Plan Availability | All plans |
| Permissions Required | Owner or Creator |
| Platforms | Web, Mac app, Windows app |

This method works well for manual imports but cannot be automated through code.

Method 2: CSV Import Extension (paid plans)

For more control over imports, Airtable offers a CSV Import Extension available on all paid plans. From their documentation:

"The CSV import extension row limit is limited to 25,000, and CSV imports are limited to 5MB."

| Feature | Details |
| --- | --- |
| Row Limit | 25,000 rows |
| File Size Limit | 5 MB |
| Plan Availability | Paid plans only |
| Merge Capability | Yes, can merge with existing records |
| Field Mapping | Automatic with manual override |

The extension supports merging imported data with existing records, which helps avoid duplicates when you have a unique identifier field.

Method 3: Sync API (Pro and Enterprise)

For programmatic imports in a React application, the Sync API provides the most control. This endpoint is available on Pro, Enterprise (pre-2023.08), and Enterprise Scale plans.

| Feature | Details |
| --- | --- |
| Row Limit | 10,000 rows per request |
| Column Limit | 500 columns |
| Request Size | 2 MB max |
| Rate Limit | 20 requests per 5 minutes per base |
| Required Scopes | data.records:write, schema.bases:write |

API Endpoint:

POST https://api.airtable.com/v0/{baseId}/{tableIdOrName}/sync/{apiEndpointSyncId}

Airtable API rate limits

Before writing any code, understand Airtable's rate limits so you can avoid throttling:

| Plan | Monthly API Calls | Per-Second Limit |
| --- | --- | --- |
| Free | 1,000/month | 5 req/sec per base |
| Team | 100,000/month | 5 req/sec per base |
| Business | Unlimited | 5 req/sec per base |
| Enterprise | Unlimited | 5 req/sec per base |

All plans share the same 5 requests per second limit per base, so batching and rate limiting are essential regardless of your plan.
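When a burst does exceed the cap, Airtable responds with HTTP 429 and expects clients to back off (its docs say to wait 30 seconds before retrying). A small retry wrapper keeps one throttled batch from sinking the whole import. This is an illustrative sketch; the function names and parameters are my own, not part of any SDK:

```typescript
// Hypothetical helper: retry a request-producing function when it signals
// rate limiting. The delay defaults to Airtable's documented 30-second
// back-off, but is configurable.
async function withRateLimitRetry<T>(
  fn: () => Promise<T>,
  isRateLimited: (err: unknown) => boolean,
  maxRetries = 3,
  delayMs = 30_000,
): Promise<T> {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn();
    } catch (err) {
      // Rethrow non-throttling errors and exhausted retries immediately.
      if (!isRateLimited(err) || attempt >= maxRetries) throw err;
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
}
```

Errors thrown by the `airtable` package carry a `statusCode` field, so a predicate like `(err) => (err as any)?.statusCode === 429` is a reasonable check when wrapping each `create` call.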

Step 1: Install dependencies

npm install airtable papaparse
npm install --save-dev @types/papaparse

The airtable npm package provides browser support through build/airtable.browser.js and has included Promise support since v0.5.0.

Step 2: Configure the Airtable client

import Airtable from 'airtable';

const base = new Airtable({
  apiKey: process.env.AIRTABLE_API_KEY,
}).base(process.env.AIRTABLE_BASE_ID!);

Store your API key securely in environment variables. Never expose it in client-side code.
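For a Node-side import script, that usually means a `.env` file kept out of version control. The values below are placeholders, shown only to illustrate Airtable's `pat…` token and `app…` base ID formats:

```
# .env — never commit this file
AIRTABLE_API_KEY=patXXXXXXXXXXXXXX.XXXXXXXX
AIRTABLE_BASE_ID=appXXXXXXXXXXXXXX
```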

Step 3: Parse CSV and batch records

Airtable's API accepts a maximum of 10 records per create request. Here's how to parse a CSV file and send records in batches while respecting rate limits:

import Airtable from 'airtable';
import Papa from 'papaparse';

const base = new Airtable({ apiKey: process.env.AIRTABLE_API_KEY }).base(
  process.env.AIRTABLE_BASE_ID!,
);

interface ImportResult {
  success: number;
  failed: number;
  errors: string[];
}

async function importCSVToAirtable(
  csvString: string,
  tableName: string,
): Promise<ImportResult> {
  const BATCH_SIZE = 10;
  const RATE_LIMIT_DELAY = 200; // 5 req/sec = 200ms between requests

  const result: ImportResult = {
    success: 0,
    failed: 0,
    errors: [],
  };

  // Parse CSV
  const parsed = Papa.parse<Record<string, string>>(csvString, {
    header: true,
    skipEmptyLines: true,
  });

  if (parsed.errors.length > 0) {
    result.errors = parsed.errors.map((e) => e.message);
    return result;
  }

  const records = parsed.data;

  // Process in batches of 10
  for (let i = 0; i < records.length; i += BATCH_SIZE) {
    const batch = records.slice(i, i + BATCH_SIZE);

    try {
      // typecast lets Airtable coerce CSV strings into number, date, and
      // select fields instead of rejecting them
      await base(tableName).create(
        batch.map((record) => ({ fields: record })),
        { typecast: true },
      );
      result.success += batch.length;
    } catch (error) {
      result.failed += batch.length;
      result.errors.push(
        `Batch ${Math.floor(i / BATCH_SIZE) + 1}: ${error instanceof Error ? error.message : 'Unknown error'}`,
      );
    }

    // Rate limiting: stay under 5 req/sec
    if (i + BATCH_SIZE < records.length) {
      await new Promise((resolve) => setTimeout(resolve, RATE_LIMIT_DELAY));
    }
  }

  return result;
}

Step 4: Using the Sync API for larger imports

For larger imports, or when you want to send CSV directly instead of JSON records, use the Sync API:

interface SyncAPIResponse {
  success: boolean;
  recordsCreated?: number;
  error?: string;
}

async function importWithSyncAPI(
  baseId: string,
  tableName: string,
  syncId: string,
  csvString: string,
  apiKey: string,
): Promise<SyncAPIResponse> {
  // Sync API limit: 2 MB per request
  if (new Blob([csvString]).size > 2 * 1024 * 1024) {
    return {
      success: false,
      error: 'CSV exceeds 2 MB Sync API limit',
    };
  }

  try {
    const response = await fetch(
      `https://api.airtable.com/v0/${baseId}/${tableName}/sync/${syncId}`,
      {
        method: 'POST',
        headers: {
          Authorization: `Bearer ${apiKey}`,
          'Content-Type': 'text/csv',
        },
        body: csvString,
      },
    );

    if (!response.ok) {
      const error = await response.json();
      return {
        success: false,
        error: error.error?.message || `HTTP ${response.status}`,
      };
    }

    const data = await response.json();
    return {
      success: true,
      recordsCreated: data.records?.length,
    };
  } catch (error) {
    return {
      success: false,
      error: error instanceof Error ? error.message : 'Network error',
    };
  }
}

Note: The Sync API has a rate limit of 20 requests per 5 minutes per base, separate from the standard API rate limits.
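Because that window is five minutes long, a fixed per-request delay wastes time; a sliding-window check tells you exactly how long to wait before the next Sync API call is allowed. A sketch (the constant and function names here are illustrative, not from Airtable):

```typescript
const SYNC_WINDOW_MS = 5 * 60 * 1000; // 5-minute window
const SYNC_MAX_REQUESTS = 20;         // 20 requests per window per base

// Given the timestamps (ms) of prior requests and the current time,
// return how many ms to wait before the next request is permitted.
function msUntilNextSyncRequest(sentAt: number[], now: number): number {
  // Only timestamps inside the current window count against the limit.
  const recent = sentAt.filter((t) => now - t < SYNC_WINDOW_MS);
  if (recent.length < SYNC_MAX_REQUESTS) return 0;
  // The oldest in-window request must age out before another is allowed.
  const oldest = Math.min(...recent);
  return oldest + SYNC_WINDOW_MS - now;
}
```

Call this before each `importWithSyncAPI` invocation and `setTimeout` for the returned duration when it is non-zero.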

Complete working example

Here's a React component that handles CSV file upload and imports to Airtable. The API key is passed as a prop only to keep the example self-contained; in production, route these requests through a backend proxy so the key never reaches the browser.

import { useState, useCallback } from 'react';
import Airtable from 'airtable';
import Papa from 'papaparse';

interface ImportState {
  status: 'idle' | 'parsing' | 'importing' | 'complete' | 'error';
  progress: number;
  success: number;
  failed: number;
  errors: string[];
}

const BATCH_SIZE = 10;
const RATE_LIMIT_DELAY = 200;

export function AirtableCSVImporter({
  tableName,
  baseId,
  apiKey,
}: {
  tableName: string;
  baseId: string;
  apiKey: string;
}) {
  const [state, setState] = useState<ImportState>({
    status: 'idle',
    progress: 0,
    success: 0,
    failed: 0,
    errors: [],
  });

  const handleFileUpload = useCallback(
    async (event: React.ChangeEvent<HTMLInputElement>) => {
      const file = event.target.files?.[0];
      if (!file) return;

      // Check file size (5 MB limit for standard import)
      if (file.size > 5 * 1024 * 1024) {
        setState((prev) => ({
          ...prev,
          status: 'error',
          errors: ['File exceeds 5 MB limit'],
        }));
        return;
      }

      setState((prev) => ({ ...prev, status: 'parsing' }));

      const text = await file.text();
      const parsed = Papa.parse<Record<string, string>>(text, {
        header: true,
        skipEmptyLines: true,
      });

      if (parsed.errors.length > 0) {
        setState((prev) => ({
          ...prev,
          status: 'error',
          errors: parsed.errors.map((e) => e.message),
        }));
        return;
      }

      const records = parsed.data;
      const base = new Airtable({ apiKey }).base(baseId);

      setState((prev) => ({ ...prev, status: 'importing' }));

      let success = 0;
      let failed = 0;
      const errors: string[] = [];

      for (let i = 0; i < records.length; i += BATCH_SIZE) {
        const batch = records.slice(i, i + BATCH_SIZE);

        try {
          await base(tableName).create(
            batch.map((record) => ({ fields: record })),
            { typecast: true }, // coerce CSV strings into typed fields
          );
          success += batch.length;
        } catch (error) {
          failed += batch.length;
          errors.push(
            `Rows ${i + 1}-${i + batch.length}: ${error instanceof Error ? error.message : 'Unknown error'}`,
          );
        }

        setState((prev) => ({
          ...prev,
          progress: Math.round(((i + batch.length) / records.length) * 100),
          success,
          failed,
          errors,
        }));

        if (i + BATCH_SIZE < records.length) {
          await new Promise((resolve) => setTimeout(resolve, RATE_LIMIT_DELAY));
        }
      }

      setState((prev) => ({ ...prev, status: 'complete' }));
    },
    [tableName, baseId, apiKey],
  );

  return (
    <div>
      <input
        type="file"
        accept=".csv"
        onChange={handleFileUpload}
        disabled={state.status === 'parsing' || state.status === 'importing'}
      />

      {state.status === 'importing' && (
        <div>
          <progress value={state.progress} max={100} />
          <p>Importing... {state.progress}%</p>
        </div>
      )}

      {state.status === 'complete' && (
        <div>
          <p>
            Import complete: {state.success} succeeded, {state.failed} failed
          </p>
        </div>
      )}

      {state.errors.length > 0 && (
        <ul>
          {state.errors.map((error, i) => (
            <li key={i}>{error}</li>
          ))}
        </ul>
      )}
    </div>
  );
}

Unsupported field types

Certain Airtable field types cannot be populated through CSV import. According to Airtable's documentation, these include:

  • Attachment
  • Autonumber
  • Barcode
  • Button
  • Count
  • Created by
  • Created time
  • Formula
  • Last modified by
  • Last modified time
  • Long text (when "Enable rich text formatting" is on)
  • Rollup

Plan your table schema accordingly, or convert these fields temporarily before importing.
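If your code knows the table's field types (for example, fetched via Airtable's base schema metadata endpoint), a pre-import filter can drop the columns that can't be written. This is a hypothetical helper, not part of any SDK:

```typescript
// Airtable field types that cannot be populated by CSV import.
const UNSUPPORTED_FIELDS = new Set([
  'Attachment', 'Autonumber', 'Barcode', 'Button', 'Count',
  'Created by', 'Created time', 'Formula', 'Last modified by',
  'Last modified time', 'Rollup',
]);

// Drop columns whose mapped field type is unwritable, so the create
// request doesn't fail on them. fieldTypes maps column name -> field type.
function stripUnsupportedColumns(
  row: Record<string, string>,
  fieldTypes: Record<string, string>,
): Record<string, string> {
  return Object.fromEntries(
    Object.entries(row).filter(
      ([column]) => !UNSUPPORTED_FIELDS.has(fieldTypes[column] ?? ''),
    ),
  );
}
```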

Troubleshooting

Date format issues

Problem: Dates import incorrectly or display in the wrong format.

Solution: Use ISO format (YYYY-MM-DD) in your CSV. If dates still import incorrectly, temporarily convert the date field to "Single line text", import the data, then convert back to a date field. This workaround comes from the Airtable community forums.
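A small normalizer can do the ISO conversion before the data ever reaches Airtable. This sketch assumes US-style month/day ordering for slash dates; adjust the regex for your locale:

```typescript
// Normalize common date strings (e.g. "1/11/2026") to ISO YYYY-MM-DD.
// Returns null when the value can't be parsed as a date.
function toISODate(value: string): string | null {
  // Already ISO? Keep as-is.
  if (/^\d{4}-\d{2}-\d{2}$/.test(value)) return value;
  // US-style M/D/YYYY (assumption: month comes first).
  const mdy = value.match(/^(\d{1,2})\/(\d{1,2})\/(\d{4})$/);
  if (mdy) {
    const [, m, d, y] = mdy;
    return `${y}-${m.padStart(2, '0')}-${d.padStart(2, '0')}`;
  }
  // Fall back to the runtime's date parser for other formats.
  const parsed = new Date(value);
  if (Number.isNaN(parsed.getTime())) return null;
  return parsed.toISOString().slice(0, 10);
}
```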

Duplicate records during merge

Problem: Using the CSV Import Extension's merge feature creates duplicates instead of updating existing records.

Solution: Matching is case-sensitive. "email@example.com" and "Email@example.com" are treated as different values. Normalize your data before import, and use a truly unique identifier field as your merge key.
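Before relying on the extension's merge, you can normalize and de-duplicate merge keys client-side. A minimal sketch, with illustrative function names:

```typescript
// Normalize a merge-key value so case and stray whitespace don't defeat
// matching. Apply the same normalization to existing records' key field.
function normalizeMergeKey(value: string): string {
  return value.trim().toLowerCase();
}

// Keep only the first row for each normalized key value.
function dedupeByKey(
  rows: Record<string, string>[],
  keyField: string,
): Record<string, string>[] {
  const seen = new Set<string>();
  return rows.filter((row) => {
    const key = normalizeMergeKey(row[keyField] ?? '');
    if (seen.has(key)) return false;
    seen.add(key);
    return true;
  });
}
```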

Cannot import to synced tables

Problem: Import fails on tables that sync from external sources.

Solution: You cannot import directly into synced tables. Instead, import into the non-synced source table.

File size exceeds limits

Problem: Your CSV is larger than the 5 MB limit.

Solutions:

  • Split the CSV into multiple smaller files
  • Use the Sync API, which accepts up to 10,000 rows per request (but caps each request at 2 MB)
  • For very large datasets, implement chunked uploads with rate limiting
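The first option can be automated. This sketch splits serialized rows into header-prefixed chunks that each stay under a byte budget, measuring bytes as UTF-8 the way the API counts request size (the helper is illustrative, not from any library):

```typescript
// Split CSV rows into chunks that, together with the header row, stay
// under maxBytes when joined with newlines.
function splitCSV(header: string, rows: string[], maxBytes: number): string[] {
  const enc = new TextEncoder();
  const chunks: string[] = [];
  let current: string[] = [];
  for (const row of rows) {
    const candidate = [header, ...current, row].join('\n');
    if (enc.encode(candidate).length > maxBytes && current.length > 0) {
      // Flush the current chunk and start a new one with this row.
      chunks.push([header, ...current].join('\n'));
      current = [];
    }
    current.push(row);
  }
  if (current.length > 0) chunks.push([header, ...current].join('\n'));
  return chunks;
}
```

Each returned chunk is a complete CSV document (header included), so it can be sent to the Sync API or re-parsed independently.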

Empty columns causing errors

Problem: Empty cells in your CSV cause import failures.

Solution: In the CSV Import Extension, enable the "Skip blank or invalid CSV values" option.
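If you import through the API instead of the extension, the same idea is a small filter over each parsed row, applied before building the create request (an illustrative helper, not part of any SDK):

```typescript
// Remove empty-string fields from a parsed CSV row so Airtable doesn't
// reject typed fields (number, date, select) that can't accept "".
function dropBlankValues(
  row: Record<string, string>,
): Record<string, string> {
  return Object.fromEntries(
    Object.entries(row).filter(([, value]) => value.trim() !== ''),
  );
}
```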

When to use ImportCSV instead

Building a production-ready CSV import means handling edge cases: large files, validation, column mapping, and error recovery. ImportCSV provides these features out of the box:

  • No file size limits: ImportCSV handles large files that exceed Airtable's 5 MB limit
  • Built-in validation UI: Users can review and fix data before it reaches your database
  • Visual column mapping: Drag-and-drop interface for mapping CSV columns to your schema
  • Graceful error handling: Invalid rows are flagged individually without failing the entire import
  • Automatic rate limiting: Batching and delays are handled for you
  • Backend flexibility: Works with Airtable, Supabase, Firebase, or any custom API

If you need a CSV import feature that handles these complexities, ImportCSV provides a drop-in React component that manages the entire flow.

Next steps

  • Review Airtable's API documentation for additional endpoints
  • Test with the Free tier before scaling to paid plans
  • Consider field type limitations when designing your import schema
  • Explore ImportCSV docs for ready-made import components

Wrap-up

CSV imports shouldn't slow you down. ImportCSV is built to fit into your workflow — whether you're building data import flows, handling customer uploads, or processing large datasets.

If that sounds like the kind of tooling you want to use, try ImportCSV.