If I have a file like below:
key1 key2
data1 data2
data3 data4
...
Is there an easy way to read this file and convert it to a JSON object array? The final output I want is:
[{key1:data1, key2: data2},{key1: data3, key2: data4}, ...]
import fs from 'fs';
// read the file and split into rows and cells
const data = fs.readFileSync('./text.txt').toString().split('\n').map(row => row.match(/\w+/g)).filter(row => row); // match() returns null for blank lines, so drop them
// get the first row with cells as key names
const head = data.shift();
// map the rest of the rows into objects with keys from the first row
const result = data.map(row => Object.fromEntries(row.map((cell, idx) => [head[idx], cell])));
console.log(JSON.stringify(result));
The result:
/usr/local/bin/node ./read-csv.mjs
[{"key1":"data1","key2":"data2"},{"key1":"data3","key2":"data4"}]
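If the file is too large to load into memory at once, the same mapping can be done line by line with Node's readline module. This is a sketch under that assumption, not part of the answer above, and parseStream is a name I made up:

```javascript
import fs from 'fs';
import readline from 'readline';

// Parse a stream of "header row, then data rows" into an array of objects.
async function parseStream(input) {
  const rl = readline.createInterface({ input, crlfDelay: Infinity });
  let head = null;
  const result = [];
  for await (const line of rl) {
    const cells = line.match(/\w+/g);
    if (!cells) continue;                  // skip blank lines
    if (!head) { head = cells; continue; } // first row holds the key names
    result.push(Object.fromEntries(cells.map((cell, idx) => [head[idx], cell])));
  }
  return result;
}

// Usage (hypothetical path, same file layout as above):
// parseStream(fs.createReadStream('./text.txt'))
//   .then(result => console.log(JSON.stringify(result)));
```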
If you need a speed boost with a big file, use Array::reduce():
import fs from 'fs';
const data = fs.readFileSync('./text.txt').toString().split('\n').map(row => row.match(/\w+/g)).filter(row => row); // drop blank lines (match() returns null)
const head = data.shift();
const result = data.map(row => row.reduce((obj, cell, idx) => {
obj[head[idx]] = cell;
return obj;
}, {}));
console.log(JSON.stringify(result));
If you need maximum speed, use for loops:
import fs from 'fs';
const data = fs.readFileSync('./text.txt').toString().split('\n').map(row => row.match(/\w+/g)).filter(row => row); // drop blank lines (match() returns null)
const head = data.shift();
const result = [];
for (let i = 0; i < data.length; i++) {
const row = data[i];
const item = result[result.length] = {};
for (let j = 0; j < row.length; j++) {
item[head[j]] = row[j];
}
}
console.log(JSON.stringify(result));
Are you using vanilla JavaScript or Node.js? Node has the fs module, which provides utility functions for filesystem operations.
Irrespective of whether you are using Node or Vanilla JS, you could consider the following approach.
Read the file into a variable (not adding the code for that here, as the method will vary depending on whether you are using Node or vanilla JS), say fileContent.
let lines = fileContent.split("\n");
lines.shift(); // drop the header line that holds the key names
const result = [];
lines.forEach(line => {
    let dataValues = line.split(" "); // assuming the values are separated by spaces
    result.push({key1: dataValues[0], key2: dataValues[1]});
});
result will hold the required output.
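If you'd rather not hardcode the key names, the same idea can be wrapped in a small helper that derives them from the first line. parseFileContent is a hypothetical name, and this assumes whitespace-separated values:

```javascript
// Derive the key names from the first line instead of hardcoding key1/key2.
function parseFileContent(fileContent) {
  const lines = fileContent.trim().split("\n");
  const keys = lines.shift().split(/\s+/); // first line holds the key names
  return lines.map(line => {
    const values = line.split(/\s+/);
    const obj = {};
    keys.forEach((key, idx) => { obj[key] = values[idx]; });
    return obj;
  });
}

// Usage with the sample data from the question:
const sample = "key1 key2\ndata1 data2\ndata3 data4";
console.log(JSON.stringify(parseFileContent(sample)));
// → [{"key1":"data1","key2":"data2"},{"key1":"data3","key2":"data4"}]
```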
Another post that might be useful for you.
Hope this helps.
This Python code will do it for you:
import json

def convert_file_to_json(filename):
    with open(filename, 'r') as file:
        lines = file.readlines()
    # Extract keys from the first line
    keys = lines[0].split()
    # Create JSON object array
    json_objects = []
    for line in lines[1:]:
        values = line.split()
        if not values:  # skip blank lines
            continue
        json_object = {keys[i]: values[i] for i in range(len(keys))}
        json_objects.append(json_object)
    return json.dumps(json_objects)
# Usage example
filename = 'your_file.txt'
json_data = convert_file_to_json(filename)
print(json_data)