Import CSV to Google Sheets Programmatically (Node.js & Python)
Automate CSV imports to Google Sheets using the Sheets API. Complete Node.js and Python tutorials with authentication, batch updates, and error handling.

Most "import CSV to Google Sheets" guides show you File → Import → Upload. That's fine for a one-off. But when you need to automate it -- sync daily reports, push data from your app, build a pipeline -- you need the Sheets API.
Here's how to programmatically import CSV data to Google Sheets using Node.js and Python. Service account auth, batch updates for large files, and the gotchas that Google's docs don't warn you about.
Method Overview
| Method | Best For | Complexity |
|---|---|---|
| Manual File → Import | One-off uploads | Easy |
| IMPORTDATA function | Public URL-based CSVs | Easy |
| Apps Script | Scheduled Drive imports | Medium |
| Google Sheets API | External automation, pipelines | Complex |
When to use the API: When your CSV data comes from outside Google's ecosystem -- your database, a third-party service, a cron job. The other methods can't touch this.
Prerequisites
- Node.js 18+ or Python 3.8+
- Google Cloud project (free tier works)
- A Google Sheet to import to
Step 1: Google Cloud Setup
First, create a service account. This is the annoying part -- Google makes you jump through hoops -- but you only do it once.
Create a Google Cloud Project
- Go to console.cloud.google.com
- Create a new project (or select existing)
- Note your project ID
Enable the Sheets API
- Navigate to APIs & Services → Library
- Search "Google Sheets API"
- Click Enable
Create Service Account
- Go to APIs & Services → Credentials
- Click "Create Credentials" → "Service Account"
- Name it something descriptive (e.g., "csv-importer")
- Grant no roles for now (not needed for Sheets)
- Click through to create
Download Credentials
- Click your new service account
- Go to Keys tab
- Add Key → Create new key → JSON
- Save the downloaded JSON file as service-account.json
Keep this file secure. It's basically a password. Add it to .gitignore immediately. I've seen people commit these to public repos. Don't be that person.
Share the Spreadsheet
The service account email looks like: csv-importer@your-project.iam.gserviceaccount.com
Share your target spreadsheet with this email (Editor access). Skip this step and you'll get a 403 error. Almost everyone hits this the first time.
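Can't find the email? It's the client_email field inside the key file you just downloaded. Here's a quick way to read it out (a Python sketch; service_account_email is a hypothetical helper, and it assumes the service-account.json filename used above):

```python
import json

def service_account_email(key_path: str) -> str:
    """Read the client_email field out of a downloaded service account key file."""
    with open(key_path, 'r', encoding='utf-8') as f:
        return json.load(f)['client_email']

# Usage (assumes the filename from the step above):
# print(service_account_email('service-account.json'))
```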
Step 2: Node.js Implementation
Install the Google APIs client:
npm install googleapis papaparse
npm install -D @types/node
Basic Import
// import-csv.ts
import { google, sheets_v4 } from 'googleapis';
import Papa from 'papaparse';
import { readFileSync } from 'fs';

const SCOPES = ['https://www.googleapis.com/auth/spreadsheets'];
const CREDENTIALS_PATH = './service-account.json';

async function getAuthClient() {
  const auth = new google.auth.GoogleAuth({
    keyFile: CREDENTIALS_PATH,
    scopes: SCOPES,
  });
  return auth.getClient();
}

async function importCSVToSheets(
  csvPath: string,
  spreadsheetId: string,
  sheetName: string = 'Sheet1'
) {
  const auth = await getAuthClient();
  const sheets = google.sheets({ version: 'v4', auth: auth as any });

  // Parse CSV
  const csvContent = readFileSync(csvPath, 'utf-8');
  const { data } = Papa.parse(csvContent, {
    header: false, // Keep as 2D array for Sheets API
    skipEmptyLines: true,
  });

  // Update spreadsheet
  const range = `${sheetName}!A1`;
  const response = await sheets.spreadsheets.values.update({
    spreadsheetId,
    range,
    valueInputOption: 'RAW',
    requestBody: {
      values: data as string[][],
    },
  });

  console.log(`Updated ${response.data.updatedCells} cells`);
  return response.data;
}

// Usage
importCSVToSheets(
  './data.csv',
  'YOUR_SPREADSHEET_ID', // From the URL
  'Sheet1'
);
The spreadsheet ID is in the URL: docs.google.com/spreadsheets/d/{SPREADSHEET_ID}/edit
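Rather than copying the ID out by hand, you can extract it from a pasted URL. A small helper -- shown in Python for brevity; the same regex works in JS. spreadsheet_id_from_url is a hypothetical name, and the pattern assumes the standard /spreadsheets/d/{id}/ URL shape:

```python
import re

def spreadsheet_id_from_url(url: str) -> str:
    """Pull the spreadsheet ID out of a docs.google.com URL."""
    match = re.search(r'/spreadsheets/d/([a-zA-Z0-9_-]+)', url)
    if not match:
        raise ValueError(f'Not a Google Sheets URL: {url}')
    return match.group(1)
```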
Handling Large Files (Batch Updates)
Google Sheets API has limits. The big ones:
- 10 million cells per spreadsheet
- Up to 100 requests per batch
If you're pushing more than 50,000 rows, you'll want to batch your updates. Otherwise it gets slow and flaky:
async function importLargeCSV(
  csvPath: string,
  spreadsheetId: string,
  sheetName: string = 'Sheet1'
) {
  const auth = await getAuthClient();
  const sheets = google.sheets({ version: 'v4', auth: auth as any });

  const csvContent = readFileSync(csvPath, 'utf-8');
  const { data } = Papa.parse(csvContent, {
    header: false,
    skipEmptyLines: true,
  });
  const rows = data as string[][];

  const BATCH_SIZE = 10000;
  let updatedCells = 0;

  for (let i = 0; i < rows.length; i += BATCH_SIZE) {
    const batch = rows.slice(i, i + BATCH_SIZE);
    const startRow = i + 1; // 1-indexed for Sheets

    await sheets.spreadsheets.values.update({
      spreadsheetId,
      range: `${sheetName}!A${startRow}`,
      valueInputOption: 'RAW',
      requestBody: {
        values: batch,
      },
    });

    updatedCells += batch.length * (batch[0]?.length || 0);
    console.log(`Processed rows ${i + 1} to ${i + batch.length}`);

    // Rate limiting: 300 write requests per minute
    if (i + BATCH_SIZE < rows.length) {
      await new Promise(resolve => setTimeout(resolve, 250));
    }
  }

  console.log(`Total cells updated: ${updatedCells}`);
}
Clearing Existing Data First
One thing that trips people up: update overwrites cells starting at your range, but it doesn't clear anything beyond the new data. Old data can linger. Clear first:
async function clearAndImport(
  csvPath: string,
  spreadsheetId: string,
  sheetName: string
) {
  const auth = await getAuthClient();
  const sheets = google.sheets({ version: 'v4', auth: auth as any });

  // Clear existing content
  await sheets.spreadsheets.values.clear({
    spreadsheetId,
    range: `${sheetName}`,
  });

  // Then import
  // ... (same as above)
}
Step 3: Python Implementation
Install dependencies:
pip install google-api-python-client google-auth
Basic Import
# import_csv.py
from google.oauth2 import service_account
from googleapiclient.discovery import build
import csv
SCOPES = ['https://www.googleapis.com/auth/spreadsheets']
SERVICE_ACCOUNT_FILE = 'service-account.json'
def get_sheets_service():
    credentials = service_account.Credentials.from_service_account_file(
        SERVICE_ACCOUNT_FILE, scopes=SCOPES
    )
    return build('sheets', 'v4', credentials=credentials)

def import_csv_to_sheets(csv_path: str, spreadsheet_id: str, sheet_name: str = 'Sheet1'):
    service = get_sheets_service()

    # Read CSV
    with open(csv_path, 'r', encoding='utf-8') as f:
        reader = csv.reader(f)
        data = list(reader)

    # Update spreadsheet
    range_name = f'{sheet_name}!A1'
    result = service.spreadsheets().values().update(
        spreadsheetId=spreadsheet_id,
        range=range_name,
        valueInputOption='RAW',
        body={'values': data}
    ).execute()

    print(f"Updated {result.get('updatedCells')} cells")
    return result

# Usage
if __name__ == '__main__':
    import_csv_to_sheets(
        'data.csv',
        'YOUR_SPREADSHEET_ID',
        'Sheet1'
    )
Batch Updates for Large Files
def import_large_csv(csv_path: str, spreadsheet_id: str, sheet_name: str = 'Sheet1'):
    service = get_sheets_service()
    BATCH_SIZE = 10000

    with open(csv_path, 'r', encoding='utf-8') as f:
        reader = csv.reader(f)
        rows = list(reader)

    total_cells = 0
    for i in range(0, len(rows), BATCH_SIZE):
        batch = rows[i:i + BATCH_SIZE]
        start_row = i + 1

        service.spreadsheets().values().update(
            spreadsheetId=spreadsheet_id,
            range=f'{sheet_name}!A{start_row}',
            valueInputOption='RAW',
            body={'values': batch}
        ).execute()

        total_cells += (len(batch) * len(batch[0])) if batch else 0
        print(f"Processed rows {i + 1} to {i + len(batch)}")

    print(f"Total cells: {total_cells}")
Step 4: Apps Script Alternative
Honestly, if your CSVs already live in Google Drive, Apps Script is way simpler than the full API setup:
// In Google Sheets → Extensions → Apps Script
function importCSVFromDrive() {
  const folderId = 'YOUR_DRIVE_FOLDER_ID';
  const folder = DriveApp.getFolderById(folderId);
  const files = folder.getFilesByType(MimeType.CSV);
  const sheet = SpreadsheetApp.getActiveSheet();

  while (files.hasNext()) {
    const file = files.next();
    const content = file.getBlob().getDataAsString();
    const rows = Utilities.parseCsv(content);
    sheet.getRange(1, 1, rows.length, rows[0].length).setValues(rows);

    // Move processed file
    file.moveTo(DriveApp.getFolderById('PROCESSED_FOLDER_ID'));
  }
}

// Set up time-driven trigger in Apps Script UI
This works great for scheduled imports from Drive. For external data, stick with the API approach above.
Step 5: The IMPORTDATA Function
For public CSV URLs, you don't need any code at all:
=IMPORTDATA("https://example.com/data.csv")
Limitations:
- URL must be publicly accessible
- 50 IMPORTDATA functions per spreadsheet
- No authentication support
- No error handling
- Refreshes automatically (can't control timing)
Fine for public datasets. Useless for authenticated APIs or local files.
Common Issues and Fixes
"The caller does not have permission" (403)
You probably forgot to share the spreadsheet with the service account email. Go share it as Editor. This is the #1 mistake I see.
"Quota exceeded"
Google Sheets API allows 300 write requests per minute per project. You'll hit this faster than you think. Add delays between requests:
await new Promise(resolve => setTimeout(resolve, 250)); // 250ms = 240/min max
"Invalid range"
Sheet names with spaces need single quotes in the range: 'Sheet Name'!A1
Note that the #gid=0 value in the URL identifies the tab but can't be used as a range -- the values endpoints expect the sheet name in A1 notation.
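If tab names come from users, quoting them programmatically saves headaches. A Python sketch -- a1_range is a hypothetical helper, and the doubled-apostrophe escaping follows the usual A1-notation convention, so verify it against your own sheet names:

```python
import re

def a1_range(sheet_name: str, cell_ref: str = 'A1') -> str:
    """Build an A1-notation range, quoting the sheet name only when needed."""
    # Plain alphanumeric/underscore names don't need quotes
    if re.fullmatch(r'[A-Za-z0-9_]+', sheet_name):
        return f'{sheet_name}!{cell_ref}'
    # Otherwise wrap in single quotes, doubling any embedded apostrophes
    escaped = sheet_name.replace("'", "''")
    return f"'{escaped}'!{cell_ref}"
```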
Character encoding issues
Force UTF-8 when reading:
const csvContent = readFileSync(csvPath, { encoding: 'utf-8' });
with open(csv_path, 'r', encoding='utf-8-sig') as f:  # handles BOM
Dates formatted wrong
This is frustrating. Sheets guesses that certain values are dates and reformats them silently. Use RAW valueInputOption (as shown) to prevent this, or format dates explicitly:
valueInputOption: 'USER_ENTERED', // Sheets parses values
// vs
valueInputOption: 'RAW', // Values stored as-is
Making It Production-Ready
Error Handling
The API will fail on you. Wrap calls in try-catch and handle the error codes that actually matter:
try {
  await sheets.spreadsheets.values.update({...});
} catch (error: any) {
  if (error.code === 429) {
    // Rate limited - wait and retry
    await new Promise(r => setTimeout(r, 60000));
    // Retry...
  } else if (error.code === 403) {
    console.error('Permission denied - check sheet sharing');
  } else {
    throw error;
  }
}
Logging
You'll want this when something breaks at 3am:
console.log({
  timestamp: new Date().toISOString(),
  spreadsheetId,
  rowsImported: data.length,
  columnsImported: data[0]?.length || 0,
});
Scheduling
For daily imports, pick whatever you already use:
- Node.js / Python: cron jobs, GitHub Actions, or any cloud scheduler
- Apps Script: Built-in time-driven triggers (honestly the easiest option if you're already in Google's world)
When This Approach Falls Short
This works well for dev-to-Sheets pipelines. It falls apart in other scenarios though.
End users need to upload CSVs. Building a file upload UI, column mapping, and validation? That's a whole separate project. I haven't even covered the edge cases around user-uploaded files -- encoding issues, missing headers, extra commas. It gets messy fast.
CSV formats keep changing. Different column names, missing fields, inconsistent formatting. The Sheets API doesn't help here at all.
You need validation before import. The API just writes data. Bad rows go right in alongside good ones. No warnings.
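A minimal pre-flight check is easy to bolt on yourself, though. The sketch below (validate_rows and its two rules are hypothetical -- adjust to your schema) splits parsed rows into good and bad before anything touches the API:

```python
def validate_rows(rows, expected_columns):
    """Split parsed CSV rows into (valid, rejected) before upload.

    Rejected entries carry the row index and a reason, so they can
    be logged or surfaced instead of silently written to the sheet.
    """
    valid, rejected = [], []
    for i, row in enumerate(rows):
        if len(row) != expected_columns:
            rejected.append((i, row, 'wrong column count'))
        elif any(cell.strip() == '' for cell in row):
            rejected.append((i, row, 'empty cell'))
        else:
            valid.append(row)
    return valid, rejected
```

Upload only the valid list, and write the rejected list somewhere a human will see it.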
For user-facing imports with validation and column mapping, ImportCSV handles that complexity. You get clean, validated data via webhook -- no upload UI to build.
FAQ
Can I import to a specific sheet tab?
Yes. Use the sheet name in the range: 'Monthly Report'!A1. The GID in the URL (#gid=123456) identifies the tab, but the values endpoints take the sheet name, not the GID.
How many cells can I update at once?
The API allows up to 10 million cells per spreadsheet total, and 100 requests per batch. In practice, batch 10,000-50,000 rows at a time. Beyond that, things start getting unreliable in my experience.
Can I append instead of overwrite?
Use spreadsheets.values.append instead of update:
await sheets.spreadsheets.values.append({
  spreadsheetId,
  range: 'Sheet1!A1',
  valueInputOption: 'RAW',
  requestBody: { values: newRows },
});
Does this work with Google Workspace accounts?
Yes, but your org might restrict API access. If you're getting weird permission errors, check with your Workspace admin before debugging further -- it could save you an hour of frustration.
The Sheets API setup is more involved than manual imports or IMPORTDATA. No question. But once it's running, you can automate imports from anywhere -- and that's worth the upfront pain.
Wrap-up
CSV imports shouldn't slow you down. ImportCSV is built to slot into your workflow -- whether you're building data import flows, handling customer uploads, or processing large datasets.
If that sounds like the kind of tooling you want to use, try ImportCSV.