How to Convert CSV to JSON in Node.js

In this article, we will learn how to convert CSV to JSON using Node.js. CSV (Comma-Separated Values) and JSON (JavaScript Object Notation) are two of the most commonly used data formats in backend development.

  • CSV is a simple, plain-text format used for storing tabular data, making it the standard for spreadsheet exports and database dumps.
  • JSON is a structured data format widely used in REST APIs, web applications, and NoSQL databases because it is incredibly easy for both humans and machines to read and parse.

Converting CSV to JSON is a frequent and essential task for backend developers. Many external systems (like legacy databases or third-party vendors) provide data in CSV format, while most modern JavaScript applications require JSON for processing, API responses, or database storage.

In this tutorial, we will learn two different methods to convert CSV into JSON using Node.js:

  1. Manual method using built-in Node modules.
  2. Using an NPM library (highly recommended for real-world projects).

Prerequisites

Before starting, make sure you have the following ready:

  • Node.js installed on your system.
  • A code editor (VS Code is highly recommended).

To check if Node.js and NPM (Node Package Manager) are installed, open your terminal and run:

node -v
npm -v
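If both commands print a version number, you are ready to go. The exact numbers depend on your installation; the output will look something like this:

v20.11.0
10.2.4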

Create a Sample CSV File

To test our code, let’s create some dummy data. In your project folder, create a file named data.csv and add the following content.

name,age,city
Chaitanya,21,Pune
Sahil,25,Mumbai
Rahul,32,Delhi

Approach 1 – Using Native Node.js (fs module)

This method converts a CSV file manually using only the built-in Node.js tools. It is an excellent way to understand how file parsing works under the hood.

Step 1 – Create the Script File

Create a new file in your project folder named convert.js. We will write our manual parsing logic here.

Step 2 – Read the CSV File

First, we need to import the File System (fs) module to read our data.csv file.

const fs = require("fs");
const csvData = fs.readFileSync("data.csv", "utf8");

fs.readFileSync reads the entire CSV file synchronously and stores it in the csvData variable as a giant string.
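At this point, csvData is just the raw text of the file. For the sample data.csv above, it looks roughly like this (whether the string ends with a trailing newline depends on your editor):

console.log(csvData);
// "name,age,city\nChaitanya,21,Pune\nSahil,25,Mumbai\nRahul,32,Delhi"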

Step 3 – Split Rows Using .split()

Next, we need to break that giant string into individual rows.

const lines = csvData.split("\n");

How .split() Works:

The .split("\n") function breaks the text into an array wherever a newline character (\n) occurs.
For example, the string "name,age,city\nJohn,25,NY" becomes an array:

[
"name,age,city",
"John,25,NY"
]

Step 4 – Extract Headers

The first row of a CSV file always contains the column names (headers). We need to extract these so we can use them as keys for our JSON objects.

const headers = lines[0].split(",");

Resulting array: ["name", "age", "city"]

Step 5 – Convert Rows to JSON Objects

Now we will loop through the remaining rows, match the values to the headers, and push them into an array.

const result = [];

// Start loop at index 1 to skip the header row
for (let i = 1; i < lines.length; i++) {
    // Skip empty lines to prevent errors
    if (!lines[i]) continue;

    const values = lines[i].split(",");
    const obj = {};

    // Match each header with its corresponding value
    headers.forEach((header, index) => {
        obj[header.trim()] = values[index].trim();
    });

    result.push(obj);
}

Explanation:

  1. Skip the first row: We start the loop at i = 1 because index 0 is the header row.
  2. Split by comma: We split the current row’s string into an array of individual values.
  3. Match header to value: We use forEach to assign each value to its corresponding header key on the empty object obj. We call .trim() to remove any accidental whitespace.
  4. Push into the array: We add the completed object to our result array.
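With the sample data.csv from earlier, result should now hold one plain object per data row. Note that every value is a string, because CSV carries no type information:

console.log(result);
// [
//   { name: 'Chaitanya', age: '21', city: 'Pune' },
//   { name: 'Sahil', age: '25', city: 'Mumbai' },
//   { name: 'Rahul', age: '32', city: 'Delhi' }
// ]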

Step 6 – Save the JSON File

Finally, we write the JavaScript array to a new JSON file.

fs.writeFileSync("output.json", JSON.stringify(result, null, 2));
console.log("JSON file created!");

JSON.stringify(result, null, 2) converts our JavaScript array into a beautifully formatted JSON string with 2-space indentation.
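To see what those extra arguments do, compare the two calls below. The second argument is the optional replacer (null keeps every property) and the third is the indentation width; without them, the whole array lands on one hard-to-read line:

// Compact, single-line output
JSON.stringify(result);
// '[{"name":"Chaitanya","age":"21","city":"Pune"},{"name":"Sahil","age":"25","city":"Mumbai"},...]'

// Pretty-printed with 2-space indentation (what we write to output.json)
JSON.stringify(result, null, 2);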

Step 7 – Complete Code

const fs = require('fs');

function csvToJson(csvFilePath) {
    // 1. Read the CSV file synchronously
    const csvData = fs.readFileSync(csvFilePath, 'utf-8');

    // 2. Split the data into an array of rows
    const rows = csvData.split('\n');

    // 3. Extract the headers (first row) and split them by comma
    const headers = rows[0].split(',');

    const jsonArray = [];

    // 4. Loop through the remaining rows
    for (let i = 1; i < rows.length; i++) {
        const rowData = rows[i];
        
        // Skip empty lines
        if (!rowData) continue;

        const values = rowData.split(',');
        const obj = {};

        // 5. Map each value to its corresponding header
        for (let j = 0; j < headers.length; j++) {
            // Trim whitespace just in case
            obj[headers[j].trim()] = values[j].trim();
        }

        jsonArray.push(obj);
    }

    return jsonArray;
}

const result = csvToJson('data.csv');

// Write the result to a new JSON file
fs.writeFileSync('output.json', JSON.stringify(result, null, 2));
console.log('Successfully converted CSV to JSON using native fs!');


Run your script:

node convert.js
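If everything worked, the terminal should print the success message from the script, and a new output.json file will appear next to data.csv:

Successfully converted CSV to JSON using native fs!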

Approach 2 – Using csvtojson NPM Package (Recommended)

While manual parsing is great for learning, it breaks easily in real-world scenarios. What happens if a user's address contains a comma (e.g., "Mumbai, India")? The .split(",") call will cut that single field in two!

Libraries solve these edge cases automatically. The csvtojson package handles quoted fields, large files, and varying encodings effortlessly.
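Once the package is installed (next section), you can verify this yourself. Here is a minimal sketch using csvtojson's fromString method; the name and quoted address below are made up purely for illustration:

const csv = require('csvtojson');

// The "city" field contains a comma but is wrapped in quotes
const csvString = 'name,age,city\nAsha,30,"Mumbai, India"';

csv()
  .fromString(csvString)
  .then((rows) => {
      console.log(rows);
      // [ { name: 'Asha', age: '30', city: 'Mumbai, India' } ]
  });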

Install the Package

Initialize your project and install the library via your terminal:

npm init -y
npm install csvtojson

Code Using the Library

Create a new file (e.g., library-convert.js) and add the following code:

const csv = require('csvtojson');
const fs = require('fs');

const csvFilePath = 'data.csv';

csv()
  .fromFile(csvFilePath)
  .then((jsonObj) => {
      // jsonObj is an array of JSON objects representing the CSV data
      console.log(jsonObj);
      
      // Save it to a file
      fs.writeFileSync('output.json', JSON.stringify(jsonObj, null, 2));
      console.log('Successfully converted using csvtojson!');
  })
  .catch((err) => {
      console.error('Error parsing CSV:', err);
  });


That’s it! No manual splitting or looping required.
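Since csv().fromFile() returns a promise, you can also write the same conversion with async/await if you prefer that style. A minimal sketch:

const csv = require('csvtojson');
const fs = require('fs');

async function convert() {
    // fromFile() resolves to the parsed array of row objects
    const jsonObj = await csv().fromFile('data.csv');
    fs.writeFileSync('output.json', JSON.stringify(jsonObj, null, 2));
    console.log('Successfully converted using csvtojson!');
}

convert().catch((err) => console.error('Error parsing CSV:', err));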

Why the Library Method Is Easier

The library automatically:

  • Detects headers from the first row.
  • Handles tricky spacing and commas nested inside quoted fields.
  • Uses Node.js streams, so it works with very large files without exhausting your server’s memory (see the streaming sketch below).
  • Prevents common parsing mistakes.
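Here is a minimal sketch of that streaming usage, built on csvtojson's fromStream and subscribe methods. Each parsed row is handed to a callback as soon as it is ready, so the whole file never has to sit in memory at once:

const csv = require('csvtojson');
const fs = require('fs');

csv()
  .fromStream(fs.createReadStream('data.csv'))
  .subscribe(
      (row) => {
          // Called once per parsed row; process or store it here
          console.log(row);
      },
      (err) => console.error('Error parsing CSV:', err),
      () => console.log('Finished streaming the CSV.')
  );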

Output

After running either script, the newly generated output.json file will appear in your folder and look like this:

[
  {
    "name": "Chaitanya",
    "age": "21",
    "city": "Pune"
  },
  {
    "name": "Sahil",
    "age": "25",
    "city": "Mumbai"
  },
  {
    "name": "Rahul",
    "age": "32",
    "city": "Delhi"
  }
]
