CSVtoJSON


Convert CSV files to JSON with no dependencies. Supports Node.js (sync and async) and browser environments with full RFC 4180 compliance, plus memory-efficient streaming for processing large files without loading them entirely into memory.

Overview

Transform CSV data into JSON with a simple, chainable API. Choose your implementation style:

  • Synchronous API - Blocking operations for simple workflows
  • Asynchronous API - Promise-based for modern async/await patterns with memory-efficient streaming for large files
  • Browser API - Client-side CSV parsing for web applications

Demo and JSDoc

Features

  • RFC 4180 Compliant - Proper handling of quoted fields, delimiters, newlines, and escape sequences
  • Zero Dependencies - No external packages required
  • Full TypeScript Support - Type definitions included for all APIs
  • Flexible Configuration - Custom delimiters, encoding, trimming, and more
  • Method Chaining - Fluent API for readable code
  • Memory-Efficient Streaming - Process large files without loading them entirely into memory
  • Comprehensive Error Handling - Detailed, actionable error messages with solutions (see ERROR_HANDLING.md); a minimal example is sketched below
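
A minimal sketch of catching a parsing failure from the async API (assuming failures reject the returned promise with a standard Error; see ERROR_HANDLING.md for the exact error types):

const csvToJson = require('convert-csv-to-json');

async function safeParse(filePath) {
  try {
    return await csvToJson.getJsonFromCsvAsync(filePath);
  } catch (err) {
    // err.message carries the detailed, actionable description
    console.error(`Failed to parse ${filePath}: ${err.message}`);
    return [];
  }
}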

RFC 4180 Standard

RFC 4180 is the IETF standard specification for CSV (Comma-Separated Values) files. This library is fully compliant with RFC 4180, ensuring proper handling of:

Aspect              RFC 4180 Specification
Default Delimiter   Comma (,)
Record Delimiter    CRLF (\r\n) or LF (\n)
Quote Character     Double-quote (")
Quote Escaping      Double quotes ("")

RFC 4180 Example

firstName,lastName,email
"Smith, John",Smith,john@example.com
Jane,Doe,jane@example.com
"Cooper, Andy",Cooper,andy@company.com

Note that the quoted fields containing commas are handled correctly. See RFC4180_MIGRATION_GUIDE.md for breaking changes and migration details.
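
With quoted-field support enabled (supportQuotedField(true)), the example above parses to JSON along these lines (values are assumed to remain strings unless formatValueByType() is used):

[
  { "firstName": "Smith, John", "lastName": "Smith", "email": "john@example.com" },
  { "firstName": "Jane", "lastName": "Doe", "email": "jane@example.com" },
  { "firstName": "Cooper, Andy", "lastName": "Cooper", "email": "andy@company.com" }
]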

Quick Start

Installation

npm install convert-csv-to-json

Synchronous (Simple)

const csvToJson = require('convert-csv-to-json');
const json = csvToJson.getJsonFromCsv('input.csv');

Asynchronous (Modern)

const csvToJson = require('convert-csv-to-json');
const json = await csvToJson.getJsonFromCsvAsync('input.csv');

Browser

const convert = require('convert-csv-to-json');
// `file` is a File object, e.g. taken from an <input type="file"> element
const json = await convert.browser.parseFile(file);

Documentation

Implementation   Use Case                             Learn More
Sync API         Simple, blocking operations          Read SYNC.md
Async API        Concurrent operations, large files   Read ASYNC.md
Browser API      Client-side file parsing             Read BROWSER.md

Common Tasks

Parse CSV String

const json = csvToJson.csvStringToJson('name,age\nAlice,30');
// → [{ name: 'Alice', age: '30' }]

Custom Delimiter

const json = csvToJson
  .fieldDelimiter(';')
  .getJsonFromCsv('input.csv');

Format Values

const json = csvToJson
  .formatValueByType()
  .getJsonFromCsv('input.csv');
// Converts "30" → 30, "true" → true, etc.

Handle Quoted Fields

const json = csvToJson
  .supportQuotedField(true)
  .getJsonFromCsv('input.csv');

Batch Process Files (Async)

const files = ['file1.csv', 'file2.csv', 'file3.csv'];
const results = await Promise.all(
  files.map(f => csvToJson.getJsonFromCsvAsync(f))
);

Configuration Options

All APIs (Sync, Async, and Browser) support the same configuration and parsing methods:

  • fieldDelimiter(char) - Set field delimiter (default: ,)
  • formatValueByType() - Auto-convert numbers, booleans
  • supportQuotedField(bool) - Handle quoted fields with embedded delimiters
  • indexHeader(num) - Specify header row (default: 0)
  • trimHeaderFieldWhiteSpace(bool) - Remove spaces from headers
  • parseSubArray(delim, sep) - Parse delimited arrays
  • mapRows(fn) - Transform, filter, or enrich each row
  • getJsonFromStreamAsync(stream) - Process CSV from Readable streams (Node.js and Browser)
  • getJsonFromFileStreamingAsync(filePath) - Stream processing for large files (Node.js and Browser)
  • getJsonFromFileStreamingAsyncWithCallback(filePath, options = {}) - Streaming with progress callbacks for large files
  • utf8Encoding(), latin1Encoding(), etc. - Set file encoding

Examples

fieldDelimiter(char) - Set field delimiter (default: ,)

// Semicolon-delimited
csvToJson.fieldDelimiter(';').getJsonFromCsv('data.csv');

// Tab-delimited
csvToJson.fieldDelimiter('\t').getJsonFromCsv('data.tsv');

// Pipe-delimited
csvToJson.fieldDelimiter('|').getJsonFromCsv('data.psv');

formatValueByType() - Auto-convert numbers, booleans

// Input: name,age,active
//        John,30,true
csvToJson.formatValueByType().getJsonFromCsv('data.csv');
// Output: { name: 'John', age: 30, active: true }

supportQuotedField(bool) - Handle quoted fields with embedded delimiters

// Input: name,description
//        "Smith, John","He said ""Hello"""
csvToJson.supportQuotedField(true).getJsonFromCsv('data.csv');
// Output: { name: 'Smith, John', description: 'He said "Hello"' }

indexHeader(num) - Specify header row (default: 0)

// If headers are in row 2 (3rd line):
csvToJson.indexHeader(2).getJsonFromCsv('data.csv');

trimHeaderFieldWhiteSpace(bool) - Remove spaces from headers

// Input: " First Name ", " Last Name "
csvToJson.trimHeaderFieldWhiteSpace(true).getJsonFromCsv('data.csv');
// Output: { FirstName: 'John', LastName: 'Doe' }

parseSubArray(delim, sep) - Parse delimited arrays

// Input: name,tags
//        John,*javascript,nodejs,typescript*
csvToJson.parseSubArray('*', ',').getJsonFromCsv('data.csv');
// Output: { name: 'John', tags: ['javascript', 'nodejs', 'typescript'] }

mapRows(fn) - Transform, filter, or enrich each row

// Filter out rows that don't match a condition
const result = csvToJson
  .fieldDelimiter(',')
  .mapRows((row) => {
    // Only keep rows where age >= 30
    if (parseInt(row.age) >= 30) {
      return row;
    }
    return null; // Filters out this row
  })
  .getJsonFromCsv('input.csv');

See mapRows Feature - Usage Guide.

utf8Encoding(), latin1Encoding(), etc. - Set file encoding

// UTF-8 encoding
csvToJson.utf8Encoding().getJsonFromCsv('data.csv');

// Latin-1 encoding
csvToJson.latin1Encoding().getJsonFromCsv('data.csv');

// Custom encoding
csvToJson.customEncoding('ucs2').getJsonFromCsv('data.csv');

getJsonFromStreamAsync(stream) - Process CSV from Readable streams

const fs = require('fs');
const csvToJson = require('convert-csv-to-json');

// Process large files without loading them entirely into memory
async function processLargeCSV() {
  const stream = fs.createReadStream('large-dataset.csv');
  const jsonData = await csvToJson
    .fieldDelimiter(';')
    .supportQuotedField(true)
    .getJsonFromStreamAsync(stream);
    
  console.log(`Processed ${jsonData.length} records efficiently`);
  return jsonData;
}

getJsonFromFileStreamingAsync(filePath) - Stream processing for large files

const csvToJson = require('convert-csv-to-json');

// Most efficient way to process large CSV files
async function processLargeCSV(filePath) {
  const jsonData = await csvToJson
    .fieldDelimiter(',')
    .formatValueByType()
    .getJsonFromFileStreamingAsync(filePath);
    
  console.log(`Streamed and processed ${jsonData.length} records`);
  return jsonData;
}

// Usage - handles files of any size without memory constraints
const data = await processLargeCSV('massive-dataset.csv');

getJsonFromFileStreamingAsyncWithCallback(filePath, options = {}) - Parse CSV from a File object using streaming with progress callbacks for large files

const csvToJson = require('convert-csv-to-json');
const fileInput = document.querySelector('#csvfile').files[0];

csvToJson.browser.getJsonFromFileStreamingAsyncWithCallback(fileInput, {
  chunkSize: 500,
  onChunk: (rows, processed, total) => {
    console.log(`Processed ${processed}/${total} rows`);
    // Handle chunk of rows here
  },
  onComplete: (allRows) => {
    console.log('Processing complete!');
  },
  onError: (error) => {
    console.error('Error:', error);
  }
});

See SYNC.md, ASYNC.md or BROWSER.md for complete configuration details.

Example: Complete Workflow

const csvToJson = require('convert-csv-to-json');

async function processCSV() {
  const data = await csvToJson
    .fieldDelimiter(',')
    .formatValueByType()
    .supportQuotedField(true)
    .getJsonFromCsvAsync('data.csv');
  
  console.log(`Parsed ${data.length} records`);
  return data;
}

Migration Guides

See RFC4180_MIGRATION_GUIDE.md for breaking changes and upgrade steps.

Development

Install dependencies:

npm install

Run tests:

npm test

Debug tests:

npm run test-debug

CI/CD GitHub Action

See CI/CD GitHub Action.

Release

When pushing to the master branch:

  • Include [MAJOR] in commit message for major release (e.g., v1.0.0 → v2.0.0)
  • Include [PATCH] in commit message for patch release (e.g., v1.0.0 → v1.0.1)
  • Minor release is applied by default (e.g., v1.0.0 → v1.1.0)
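
For example (hypothetical commit messages; only the [MAJOR]/[PATCH] tag matters):

# Patch release: v1.0.0 → v1.0.1
git commit -m "[PATCH] Fix quoted-field edge case"

# Major release: v1.0.0 → v2.0.0
git commit -m "[MAJOR] Change default parsing behavior"

# Minor release by default: v1.0.0 → v1.1.0
git commit -m "Add new encoding option"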

License

CSVtoJSON is licensed under the MIT License.


Support

Found a bug or need a feature? Open an issue on GitHub.

Follow me and consider starring the project to show your support ⭐

Buy Me a Coffee

If you find this project helpful and would like to support its development:

BTC: 37vdjQhbaR7k7XzhMKWzMcnqUxfw1njBNk