I have a log file with about 14,000 aircraft position datapoints captured from a system called Flarm. It looks like this:

{"addr":"A","time":1531919658.578100,"dist":902.98,"alt":385,"vs":-8}
{"addr":"A","time":1531919658.987861,"dist":914.47,"alt":384,"vs":-7}
{"addr":"A","time":1531919660.217471,"dist":925.26,"alt":383,"vs":-7}
{"addr":"A","time":1531919660.623466,"dist":925.26,"alt":383,"vs":-7}

What I need to do is find a way to 'play' this file back in real time (as if it were happening right now, even though it's pre-recorded) and emit an event whenever a log entry 'occurs'. The file is not being added to; it's pre-recorded, and the playback would happen at a later stage.

The reason for doing this is that I don't have access to the receiving equipment when I'm developing.

The only way I can think of is to set a timeout for every log entry, but that doesn't seem like the right approach. The process would also have to scale to longer recordings (this one was only an hour long).

Are there other ways of doing this?

Phasy

3 Answers

If you want to "play them back" with the actual time difference, a setTimeout is pretty much what you have to do.

const processEntry = (entry, index) => {
  index++;
  const nextEntry = getEntry(index);
  if (nextEntry == null) return;

  // The log timestamps are in seconds; setTimeout expects milliseconds
  const timeDiff = (nextEntry.time - entry.time) * 1000;
  emitEntryEvent(entry);
  setTimeout(processEntry, timeDiff, nextEntry, index);
};

processEntry(getEntry(0), 0);

This emits the current entry and then sets a timeout based on the difference until the next entry. getEntry could either fetch lines from a prefilled array or fetch lines individually based on the index. In the latter case, only two lines of data would be in memory at the same time.
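
For the prefilled-array variant, getEntry and emitEntryEvent could look something like this (a minimal sketch; the file path and the emitter are illustrative assumptions):

const fs = require('fs');
const EventEmitter = require('events');

const emitter = new EventEmitter();

// Read the whole log up front and parse each non-empty line into an object
const entries = fs.readFileSync('./data/2018-07-18_1509log.json', 'utf8')
  .split('\n')
  .filter((line) => line.trim() !== '')
  .map((line) => JSON.parse(line));

const getEntry = (index) => entries[index] || null;
const emitEntryEvent = (entry) => emitter.emit('data', entry);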

Lucas S.
  • as I commented on the author's answer, recursive timeouts might be problematic according to [this](https://medium.com/@devinmpierce/recursive-settimeout-8eb953b02b98); is there a better way for large files? – numan May 10 '22 at 14:44
  • @numan There is no such problem. Call stacks don't exist across asynchronous calls. What the author of that blog post gets confused by is the _virtual_ stack that modern browser dev tools display (with clear `async` markers) to aid in debugging async code, but those have no connection to the actual stack. – Lucas S. May 15 '22 at 14:19
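
A quick sketch to illustrate that last point: synchronous recursion grows the call stack and eventually overflows, while recursive setTimeout does not, because every callback starts on a fresh, empty stack.

// Synchronous recursion: each call adds a stack frame and eventually overflows
const recurseSync = (n) => (n > 0 ? recurseSync(n - 1) : 'done');
try {
  recurseSync(1e6);
} catch (e) {
  console.log(e.message); // "Maximum call stack size exceeded"
}

// "Recursive" setTimeout: every callback runs on a fresh stack,
// so the reported stack depth stays constant no matter how long it runs
let iteration = 0;
const tick = () => {
  iteration++;
  if (iteration % 1000 === 0) {
    const depth = new Error().stack.split('\n').length - 1;
    console.log(`iteration ${iteration}, stack frames: ${depth}`);
  }
  if (iteration < 3000) setTimeout(tick, 0);
};
tick();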

Got it working in the end! setTimeout turned out to be the answer, and combined with the input of Lucas S., this is what I ended up with:

const EventEmitter = require('events');
const fs = require('fs');

const readable = fs.createReadStream("./data/2018-07-18_1509log.json", {
  encoding: 'utf8',
  fd: null
});

function read_next_line() {
  var chunk;
  var line = '';
  // Read the stream one character at a time until its buffer is exhausted
  while ((chunk = readable.read(1)) !== null) {
    // If chunk is a newline character, return the line
    if (chunk === '\n'){
      return JSON.parse(line);
    } else {
      line += chunk;
    }
  }
  // Stream buffer empty: treat as end of input
  return false;
}

var lines = [];
var nextline;

const processEntry = () => {
  // If lines is empty, read a line
  if (lines.length === 0) lines.push(read_next_line());

  // Quit here if we've reached the last line
  if ((nextline = read_next_line()) === false) return true;

  // Else push the just read line into our array
  lines.push(nextline);

  // Get the time difference in milliseconds
  var delay = Number(lines[1].time - lines[0].time) * 1000;

  // Remove the first line
  lines.shift();

  // Emit the entry that is now due
  module.exports.emit('data', lines[0]);

  // Repeat after the calculated delay
  setTimeout(processEntry, delay);
}

var ready_to_start = false;

// When the stream becomes readable, allow starting
readable.on('readable', function() {
  ready_to_start = true;
});


module.exports = new EventEmitter();
module.exports.start = function() {
  if (!ready_to_start) return false;
  processEntry();
};
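
For reference, a sketch of how the module might be consumed (assuming it is saved as ./playback; the name is illustrative):

// Hypothetical consumer of the emitter module above
const playback = require('./playback');

playback.on('data', (entry) => {
  console.log(`alt ${entry.alt}, dist ${entry.dist}, vs ${entry.vs}`);
});

// start() returns false while the stream isn't readable yet,
// so a real consumer might retry after a short delay
playback.start();
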
Phasy
  • I am not an expert, but according to [this](https://medium.com/@devinmpierce/recursive-settimeout-8eb953b02b98), recursive timeouts should be used carefully – numan May 10 '22 at 14:41
  • Lucas S. [said](https://stackoverflow.com/questions/51440217/replay-a-log-file-with-nodejs-as-if-it-were-happening-in-real-time/51444263?noredirect=1#comment127645844_51444263) that it is not a problem – numan May 16 '22 at 06:45

Assuming you want to visualize the flight logs, you can use fs.watch as below to watch the log file for changes:

fs.watch('somefile', function (event, filename) {
    console.log('event is: ' + event);
    if (filename) {
        console.log('filename provided: ' + filename);
    } else {
        console.log('filename not provided');
    }
});

The code excerpt is from here. For more information on fs.watch(), check out here.

Then, for seamless updates on the frontend, you can set up a WebSocket to your server, watch the log file there, and send each newly added row over that socket to the frontend.
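
A minimal sketch of the server-side push (assuming the third-party ws package; here it forwards entries from a hypothetical emitter module like the one posted in the answer above, but forwarding rows picked up by fs.watch would look the same):

const WebSocket = require('ws');
const replay = require('./playback'); // hypothetical: the emitter module from the answer above

const wss = new WebSocket.Server({ port: 8080 });

// Forward every replayed log entry to all connected browsers
replay.on('data', (entry) => {
  const payload = JSON.stringify(entry);
  wss.clients.forEach((client) => {
    if (client.readyState === WebSocket.OPEN) client.send(payload);
  });
});

// Kick off playback once the first client connects
wss.on('connection', () => replay.start());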

After you get the data on the frontend you can visualize it there. While I haven't done a flight visualization project before, I've used D3.js to visualize other things (sound, numerical data, metric analysis, etc.) a couple of times and it did the job every time.

0xmtn
  • I should clarify: I'm not trying to get lines as they are added, like you might with tail -F; the log file is fully written and all lines are in there. It's a recording of something that happened in real time. What I need to do is play back the recording _as if_ it were happening live, thereby emulating a system I don't have access to when developing. Thanks for the link to D3js! I will surely be using that in the next steps :) – Phasy Jul 20 '18 at 11:37
  • Oh gotcha. Then parse the log file, divide it into chunks (maybe line by line), send those chunks to the browser with `delay = t_nextLine - t_lastLineSent` in between, and finally use D3js to visualize. There is this repo that I found, maybe it'll be useful for visualization: https://github.com/koutst/flight-visualization. You can long-poll each line from the browser, but this won't give the 'sense' of real time, or you can push those chunks via WebSocket, which will give a proper sense of real time. ;) – 0xmtn Jul 23 '18 at 11:14