Blog
January 16, 2026

Import CSV to Google Sheets Programmatically (Node.js & Python)

Automate CSV imports to Google Sheets using the Sheets API. Complete Node.js and Python tutorials with authentication, batch updates, and error handling.

9 min read

Import CSV to Google Sheets programmatically

Most "import CSV to Google Sheets" guides show you File → Import → Upload. That's fine for manual one-offs. But when you need to automate it—sync daily reports, push data from your app, build a pipeline—you need the Sheets API.

This tutorial shows you how to programmatically import CSV data to Google Sheets using Node.js and Python. We'll cover service account authentication, batch updates for large files, and the gotchas that aren't obvious from the docs.

Method Overview

Method                 Best For                        Complexity
Manual File → Import   One-off uploads                 Easy
IMPORTDATA function    Public URL-based CSVs           Easy
Apps Script            Scheduled Drive imports         Medium
Google Sheets API      External automation, pipelines  Complex

When to use the API: When CSV data comes from outside Google's ecosystem—your database, a third-party service, a cron job. The API gives you programmatic control that the other methods don't.

Prerequisites

  • Node.js 18+ or Python 3.8+
  • Google Cloud project (free tier works)
  • A Google Sheet to import to

Step 1: Google Cloud Setup

First, create a service account. This lets your code authenticate without user interaction.

Create a Google Cloud Project

  1. Go to console.cloud.google.com
  2. Create a new project (or select existing)
  3. Note your project ID

Enable the Sheets API

  1. Navigate to APIs & Services → Library
  2. Search "Google Sheets API"
  3. Click Enable

Create Service Account

  1. Go to APIs & Services → Credentials
  2. Click "Create Credentials" → "Service Account"
  3. Name it something descriptive (e.g., "csv-importer")
  4. Grant no roles for now (not needed for Sheets)
  5. Click through to create

Download Credentials

  1. Click your new service account
  2. Go to Keys tab
  3. Add Key → Create new key → JSON
  4. Save the downloaded JSON file as service-account.json

Keep this file secure. It's essentially a password. Add it to .gitignore.
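Hard-coding the key path also makes it easy to commit by accident. One option is to resolve the path from an environment variable with a local fallback (a sketch; GOOGLE_SERVICE_ACCOUNT_FILE is an assumed variable name, not a Google convention):

```python
import os

def resolve_key_path(env=None):
    """Resolve the service-account key path from the environment,
    falling back to a local file next to the script."""
    env = os.environ if env is None else env
    return env.get("GOOGLE_SERVICE_ACCOUNT_FILE", "service-account.json")
```

In CI or a scheduler, set the variable to wherever your secrets manager mounts the file.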

Share the Spreadsheet

The service account email looks like: csv-importer@your-project.iam.gserviceaccount.com

Share your target spreadsheet with this email (Editor access). Without this step, you'll get a 403 error.

Step 2: Node.js Implementation

Install the Google APIs client:

npm install googleapis papaparse
npm install -D @types/node

Basic Import

// import-csv.ts
import { google, sheets_v4 } from 'googleapis';
import Papa from 'papaparse';
import { readFileSync } from 'fs';

const SCOPES = ['https://www.googleapis.com/auth/spreadsheets'];
const CREDENTIALS_PATH = './service-account.json';

async function getAuthClient() {
  const auth = new google.auth.GoogleAuth({
    keyFile: CREDENTIALS_PATH,
    scopes: SCOPES,
  });
  return auth.getClient();
}

async function importCSVToSheets(
  csvPath: string,
  spreadsheetId: string,
  sheetName: string = 'Sheet1'
) {
  const auth = await getAuthClient();
  const sheets = google.sheets({ version: 'v4', auth: auth as any });

  // Parse CSV
  const csvContent = readFileSync(csvPath, 'utf-8');
  const { data } = Papa.parse(csvContent, {
    header: false,  // Keep as 2D array for Sheets API
    skipEmptyLines: true,
  });

  // Update spreadsheet
  const range = `${sheetName}!A1`;

  const response = await sheets.spreadsheets.values.update({
    spreadsheetId,
    range,
    valueInputOption: 'RAW',
    requestBody: {
      values: data as string[][],
    },
  });

  console.log(`Updated ${response.data.updatedCells} cells`);
  return response.data;
}

// Usage
importCSVToSheets(
  './data.csv',
  'YOUR_SPREADSHEET_ID',  // From the URL
  'Sheet1'
);

The spreadsheet ID is in the URL: docs.google.com/spreadsheets/d/{SPREADSHEET_ID}/edit
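If you'd rather pass the full URL around, a small helper can pull the ID out. Here's a sketch in Python (the same regex works in Node); it assumes the standard docs.google.com URL shape:

```python
import re

def extract_spreadsheet_id(url: str) -> str:
    """Extract the spreadsheet ID from a full Google Sheets URL."""
    match = re.search(r"/spreadsheets/d/([a-zA-Z0-9_-]+)", url)
    if not match:
        raise ValueError(f"Not a Google Sheets URL: {url}")
    return match.group(1)
```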

Handling Large Files (Batch Updates)

Google Sheets API has limits:

  • 10 million cells per spreadsheet
  • Up to 100 requests per batch

For files with more than 50,000 rows, batch your updates:

async function importLargeCSV(
  csvPath: string,
  spreadsheetId: string,
  sheetName: string = 'Sheet1'
) {
  const auth = await getAuthClient();
  const sheets = google.sheets({ version: 'v4', auth: auth as any });

  const csvContent = readFileSync(csvPath, 'utf-8');
  const { data } = Papa.parse(csvContent, {
    header: false,
    skipEmptyLines: true,
  });

  const rows = data as string[][];
  const BATCH_SIZE = 10000;
  let updatedCells = 0;

  for (let i = 0; i < rows.length; i += BATCH_SIZE) {
    const batch = rows.slice(i, i + BATCH_SIZE);
    const startRow = i + 1;  // 1-indexed for Sheets

    await sheets.spreadsheets.values.update({
      spreadsheetId,
      range: `${sheetName}!A${startRow}`,
      valueInputOption: 'RAW',
      requestBody: {
        values: batch,
      },
    });

    updatedCells += batch.length * (batch[0]?.length || 0);
    console.log(`Processed rows ${i + 1} to ${i + batch.length}`);

    // Rate limiting: 300 write requests per minute
    if (i + BATCH_SIZE < rows.length) {
      await new Promise(resolve => setTimeout(resolve, 250));
    }
  }

  console.log(`Total cells updated: ${updatedCells}`);
}

Clearing Existing Data First

update only overwrites the cells it writes, so a smaller import leaves stale rows from a previous, larger one. To fully replace the sheet's contents, clear first:

async function clearAndImport(
  csvPath: string,
  spreadsheetId: string,
  sheetName: string
) {
  const auth = await getAuthClient();
  const sheets = google.sheets({ version: 'v4', auth: auth as any });

  // Clear existing content
  await sheets.spreadsheets.values.clear({
    spreadsheetId,
    range: `${sheetName}`,
  });

  // Then import
  // ... (same as above)
}

Step 3: Python Implementation

Install dependencies:

pip install google-api-python-client google-auth

Basic Import

# import_csv.py
from google.oauth2 import service_account
from googleapiclient.discovery import build
import csv

SCOPES = ['https://www.googleapis.com/auth/spreadsheets']
SERVICE_ACCOUNT_FILE = 'service-account.json'

def get_sheets_service():
    credentials = service_account.Credentials.from_service_account_file(
        SERVICE_ACCOUNT_FILE, scopes=SCOPES
    )
    return build('sheets', 'v4', credentials=credentials)

def import_csv_to_sheets(csv_path: str, spreadsheet_id: str, sheet_name: str = 'Sheet1'):
    service = get_sheets_service()

    # Read CSV
    with open(csv_path, 'r', encoding='utf-8') as f:
        reader = csv.reader(f)
        data = list(reader)

    # Update spreadsheet
    range_name = f'{sheet_name}!A1'

    result = service.spreadsheets().values().update(
        spreadsheetId=spreadsheet_id,
        range=range_name,
        valueInputOption='RAW',
        body={'values': data}
    ).execute()

    print(f"Updated {result.get('updatedCells')} cells")
    return result

# Usage
if __name__ == '__main__':
    import_csv_to_sheets(
        'data.csv',
        'YOUR_SPREADSHEET_ID',
        'Sheet1'
    )

Batch Updates for Large Files

import time

def import_large_csv(csv_path: str, spreadsheet_id: str, sheet_name: str = 'Sheet1'):
    service = get_sheets_service()
    BATCH_SIZE = 10000

    with open(csv_path, 'r', encoding='utf-8') as f:
        reader = csv.reader(f)
        rows = list(reader)

    total_cells = 0

    for i in range(0, len(rows), BATCH_SIZE):
        batch = rows[i:i + BATCH_SIZE]
        start_row = i + 1  # 1-indexed for Sheets

        service.spreadsheets().values().update(
            spreadsheetId=spreadsheet_id,
            range=f'{sheet_name}!A{start_row}',
            valueInputOption='RAW',
            body={'values': batch}
        ).execute()

        total_cells += len(batch) * (len(batch[0]) if batch else 0)
        print(f"Processed rows {i + 1} to {i + len(batch)}")

        # Rate limiting: stay under 300 write requests per minute
        if i + BATCH_SIZE < len(rows):
            time.sleep(0.25)

    print(f"Total cells: {total_cells}")

Step 4: Apps Script Alternative

For imports from Google Drive (not external sources), Apps Script is simpler:

// In Google Sheets → Extensions → Apps Script

function importCSVFromDrive() {
  const folderId = 'YOUR_DRIVE_FOLDER_ID';
  const folder = DriveApp.getFolderById(folderId);
  const files = folder.getFilesByType(MimeType.CSV);

  const sheet = SpreadsheetApp.getActiveSheet();

  while (files.hasNext()) {
    const file = files.next();
    const content = file.getBlob().getDataAsString();
    const rows = Utilities.parseCsv(content);

    // Note: each file writes from A1, so a later file overwrites an
    // earlier one. Append below the last row if you expect multiple files.
    sheet.getRange(1, 1, rows.length, rows[0].length).setValues(rows);

    // Move processed file
    file.moveTo(DriveApp.getFolderById('PROCESSED_FOLDER_ID'));
  }
}

// Set up time-driven trigger in Apps Script UI

This works for scheduled imports from Drive. For external data, use the API approach.

Step 5: The IMPORTDATA Function

For public CSV URLs, no code needed:

=IMPORTDATA("https://example.com/data.csv")

Limitations:

  • URL must be publicly accessible
  • 50 IMPORTDATA functions per spreadsheet
  • No authentication support
  • No error handling
  • Refreshes automatically (can't control timing)

Good for public datasets. Not for authenticated APIs or local files.

Common Issues and Fixes

"The caller does not have permission" (403)

The service account email doesn't have access to the spreadsheet. Share the sheet with the service account email as Editor.

"Quota exceeded"

Google Sheets API allows 300 write requests per minute per project. Add delays between requests:

await new Promise(resolve => setTimeout(resolve, 250));  // 250ms ≈ 240 requests/min max

"Invalid range"

Sheet name with spaces needs quotes: 'Sheet Name'!A1

Note that ranges for the values endpoints use the sheet name, not the GID. The numeric GID from the URL (#gid=0) is the sheetId used in batchUpdate requests.
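Quoting (and escaping embedded apostrophes by doubling them, per A1 notation) can be wrapped in a small helper; a sketch:

```python
def quote_sheet_name(name: str) -> str:
    """Quote a sheet name for A1 notation, doubling embedded apostrophes."""
    return "'" + name.replace("'", "''") + "'"

def a1_range(sheet_name: str, cell_range: str = "A1") -> str:
    """Build an A1-notation range like 'Monthly Report'!A1."""
    return f"{quote_sheet_name(sheet_name)}!{cell_range}"
```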

Character encoding issues

Force UTF-8 when reading:

// Node.js
const csvContent = readFileSync(csvPath, { encoding: 'utf-8' });

# Python — utf-8-sig also strips a leading BOM
with open(csv_path, 'r', encoding='utf-8-sig') as f:

Dates formatted wrong

Sheets interprets some values as dates. Use RAW valueInputOption (as shown) or format dates explicitly:

valueInputOption: 'USER_ENTERED',  // Sheets parses values
// vs
valueInputOption: 'RAW',  // Values stored as-is

Production Considerations

Error Handling

Wrap API calls in try-catch and handle specific error codes:

try {
  await sheets.spreadsheets.values.update({...});
} catch (error: any) {
  if (error.code === 429) {
    // Rate limited - wait and retry
    await new Promise(r => setTimeout(r, 60000));
    // Retry...
  } else if (error.code === 403) {
    console.error('Permission denied - check sheet sharing');
  } else {
    throw error;
  }
}
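The same pattern works in Python as a generic retry helper. A sketch with exponential backoff; it assumes the raised exception exposes the HTTP status the way googleapiclient's HttpError does (err.resp.status), so adapt the check for other clients:

```python
import time

def with_retries(fn, max_attempts=5, base_delay=1.0):
    """Call fn(), retrying rate-limit (429) errors with exponential backoff."""
    for attempt in range(max_attempts):
        try:
            return fn()
        except Exception as err:
            # googleapiclient's HttpError carries the status on err.resp.status
            status = getattr(getattr(err, "resp", None), "status", None)
            if status == 429 and attempt < max_attempts - 1:
                time.sleep(base_delay * (2 ** attempt))  # 1s, 2s, 4s, ...
                continue
            raise
```

Usage: wrap each batch call, e.g. `with_retries(lambda: request.execute())`.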

Logging

Track imports for debugging:

console.log({
  timestamp: new Date().toISOString(),
  spreadsheetId,
  rowsImported: data.length,
  columnsImported: data[0]?.length || 0,
});

Scheduling

For daily imports, use:

  • Node.js: cron jobs, GitHub Actions, or cloud schedulers
  • Python: Same options
  • Apps Script: Built-in time-driven triggers

When to Use Something Else

This approach works well for developer-to-Sheets pipelines. But it falls apart when:

End users need to upload CSVs: Building a file upload UI, column mapping interface, and validation rules is a separate project entirely.

CSV formats vary: Different column names, missing fields, inconsistent formatting—the Sheets API doesn't help with this.

You need validation before import: The API just writes data. Bad rows go in alongside good ones.

For user-facing imports with validation and column mapping, ImportCSV handles that complexity. You get clean, validated data via webhook—no upload UI to build.

FAQ

Can I import to a specific sheet tab?

Yes. Use the sheet name in the range: 'Monthly Report'!A1. (The numeric GID in the URL's #gid= fragment identifies the tab in batchUpdate requests, but values ranges always use the name.)

How many cells can I update at once?

The API allows up to 10 million cells per spreadsheet total, and 100 requests per batch. For practical purposes, batch 10,000-50,000 rows at a time.

Can I append instead of overwrite?

Use spreadsheets.values.append instead of update:

await sheets.spreadsheets.values.append({
  spreadsheetId,
  range: 'Sheet1!A1',
  valueInputOption: 'RAW',
  requestBody: { values: newRows },
});

Does this work with Google Workspace accounts?

Yes, but your organization might restrict API access. Check with your Workspace admin if you get permission errors.


The Sheets API gives you programmatic control that manual imports and IMPORTDATA can't match. The setup is more involved, but once it's working, you can automate imports from anywhere.

Wrap-up

CSV imports shouldn't slow you down. ImportCSV is built to fit into your workflow, whether you're building data import flows, handling customer uploads, or processing large datasets.

If that sounds like the kind of tooling you want to use, try ImportCSV.