I need to read a very large location history file, extract some data, and write it to a file as JSON. How can I do that? The following code doesn't generate any output. Edit: I expect string output in the file, because the stream is piped into fileOutputStream.

const fs = require('fs')
var JSONStream = require('JSONStream');
var es = require('event-stream');
const filePath = './location-history.json'
const fileOutputPath = './transform-location-history.json'

fileStream = fs.createReadStream(filePath);
fileOutputStream = fs.createWriteStream(fileOutputPath)

const  transformer = (data) => {
  const location = {
        latitude: data.latitudeE7 / 10000000,
        longitude: data.longitudeE7 / 10000000
    }
  return JSON.stringify(location);
}

fileStream
.pipe(JSONStream.parse('locations.*'))
.pipe(es.through(transformer))
.pipe(fileOutputStream)
att
  • Possible duplicate of [Parse large JSON file in Nodejs](https://stackoverflow.com/questions/11874096/parse-large-json-file-in-nodejs) – Veve Aug 16 '17 at 11:58
  • Why it doesn't write to output stream? – att Aug 16 '17 at 12:08
  • Everything looks fine.Will you mention from where your are getting the value into data variable – Syed Ayesha Bebe Aug 16 '17 at 12:38
  • https://gist.github.com/tuncatunc/d4670b73032d4473a7236ac95514c3d0 sample input. The original file has more than 200K points. – att Aug 17 '17 at 04:51

1 Answer

This is my solution to my problem. JSONStream parses the input file and emits the JSON objects one by one. The es.through(transformer) callback takes each JSON object and writes it to the output file as a string. To make the output file importable in ES6, 'export default locationHistory' is appended at the end. https://gist.github.com/tuncatunc/35e5449905159928e718d82c06bc66da

const fs = require('fs')
const JSONStream = require('JSONStream');
const es = require('event-stream');
const filePath = './location-history.json'
const fileOutputPath = './transform-location-history.js'

const fileStream = fs.createReadStream(filePath);
const fileOutputStream = fs.createWriteStream(fileOutputPath)

let index = 0;
const transformer = (data) => {
  // latitudeE7/longitudeE7 are degrees scaled by 10^7
  const location = {
    latitude: data.latitudeE7 / 10000000,
    longitude: data.longitudeE7 / 10000000
  };
  let result = JSON.stringify(location) + ',';
  if (index === 0) {
    result = 'const locationHistory = [' + result;
  }
  index++;
  fileOutputStream.write(result);
}

const end = () => {
  const finish = ']; export default locationHistory\n';
  // log only after the final chunk has been flushed
  fileOutputStream.write(finish, () => {
    fileOutputStream.end();
    console.log(`${index} objects were written to the file`);
  })
}

fileStream
.pipe(JSONStream.parse('locations.*'))
.pipe(es.through(transformer, end))