
If I have a file like the one below:

key1 key2
data1 data2
data3 data4
...

Is there an easy way to read this file and convert it to a JSON object array? The final output I want is:

[{key1:data1, key2: data2},{key1: data3, key2: data4}, ...]
Kerwen
  • There is no such thing as a "json array": build a normal JS array, and then if you need the JSON string representation of that, you just use `JSON.stringify`. So your question is about how to turn that data into JS, not into JSON (the JSON part is essentially an irrelevant last step =). So: what have you tried already to load in that data and turn it into a normal array of plain JS objects? – Mike 'Pomax' Kamermans Jul 10 '23 at 03:15
  • This looks like a kind of space-separated values file. There are many parsers out there (most CSV ones offer a `delimiter` option). If you are not the one producing these files, I'd highly recommend you use such a parser. Files in the wild come with a lot of weirdness that is hard to handle. For example: could `data1` itself contain a space? Is it supposed to be escaped with a `\` character, or is the whole value supposed to be inside `"` quotes? Same for newline characters, and by the way, which line ending are you expecting? LF? CRLF? A mix? And I haven't even touched encodings... – Kaiido Jul 10 '23 at 05:34
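
As the second comment suggests, an existing parser can do the splitting for you. A minimal sketch using the csv-parse package (an assumption: install it with npm install csv-parse), and assuming single-space delimiters and LF line endings:

import fs from 'fs';
import { parse } from 'csv-parse/sync';

const content = fs.readFileSync('./text.txt', 'utf8');
// columns: true takes the key names from the first line,
// so the result is already an array of plain objects
const records = parse(content, { delimiter: ' ', columns: true, trim: true, skip_empty_lines: true });

console.log(JSON.stringify(records));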

3 Answers

import fs from 'fs';

// read the file, split it into rows, and extract the cells; drop empty rows (e.g. a trailing newline)
const data = fs.readFileSync('./text.txt').toString().split('\n').map(row => row.match(/\w+/g)).filter(Boolean);
// get the first row with cells as key names
const head = data.shift();

// map the rest of the rows into objects with keys from the first row
const result = data.map(row => Object.fromEntries(row.map((cell, idx) => [head[idx], cell])));
console.log(JSON.stringify(result));

The result:

/usr/local/bin/node ./read-csv.mjs
[{"key1":"data1","key2":"data2"},{"key1":"data3","key2":"data4"}]

If you need a speed boost with a big file, use Array.prototype.reduce():

import fs from 'fs';

const data = fs.readFileSync('./text.txt').toString().split('\n').map(row => row.match(/\w+/g)).filter(Boolean);
const head = data.shift();

const result = data.map(row => row.reduce((obj, cell, idx) => {
    obj[head[idx]] = cell;
    return obj;
}, {}));
console.log(JSON.stringify(result));

If you need the maximum speed, use plain for loops:

import fs from 'fs';

const data = fs.readFileSync('./text.txt').toString().split('\n').map(row => row.match(/\w+/g)).filter(Boolean);
const head = data.shift();

const result = [];

for (let i = 0; i < data.length; i++) {
    const row = data[i];
    const item = result[result.length] = {};
    for (let j = 0; j < row.length; j++) {
        item[head[j]] = row[j];
    }
}

console.log(JSON.stringify(result));
Alexander Nenashev

Are you using vanilla JavaScript or Node.js? Node has the built-in fs module, which provides utility functions for filesystem operations.

Irrespective of whether you are using Node or Vanilla JS, you could consider the following approach.

Read the file into a variable, say fileContent (the exact code for that will vary depending on whether you are using Node or vanilla JS); for the Node case, see the sketch below.
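
A minimal sketch for Node, assuming the file is named text.txt and is UTF-8 encoded:

import fs from 'fs';

// read the whole file into a string
const fileContent = fs.readFileSync('./text.txt', 'utf8');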

let lines = fileContent.split("\n");

const result = [];

// skip the first line, which contains the key names
lines.slice(1).forEach(line => {
  if (!line.trim()) return; // ignore empty lines (e.g. a trailing newline)
  let dataValues = line.split(" "); // assuming the values are separated by a single space
  result.push({key1: dataValues[0], key2: dataValues[1]});
});

result will hold the required array. If you need the JSON text rather than the array itself, stringify it as shown below.
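
For example, with the sample file from the question:

console.log(JSON.stringify(result));
// prints: [{"key1":"data1","key2":"data2"},{"key1":"data3","key2":"data4"}]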


Hope this helps.

CodeBird

This Python code will do it for you:

import json

def convert_file_to_json(filename):
    with open(filename, 'r') as file:
        lines = file.readlines()
    
    # Extract keys from the first line
    keys = lines[0].split()

    # Create JSON object array
    json_objects = []
    for line in lines[1:]:
        values = line.split()
        if not values:
            continue  # skip blank lines, e.g. a trailing newline
        json_object = {keys[i]: values[i] for i in range(len(keys))}
        json_objects.append(json_object)

    return json.dumps(json_objects)

# Usage example
filename = 'your_file.txt'
json_data = convert_file_to_json(filename)
print(json_data)