I'm trying to capture frames from a live video stream (h.264) and pipe the resulting JPG images to a Node JS script, instead of saving these individual frames directly to .jpg files.
As a test, I created the following Node JS script, to simply capture the incoming piped data, then dump it to a file:
// pipe.js - test pipe output
var fs = require('fs');
var data = '';
process.stdin.resume();
process.stdin.setEncoding('utf8');
var filename = process.argv[2];
process.stdin.on('data', (chunk) => {
  console.log('Received data chunk via pipe.');
  data += chunk;
});
process.stdin.on('end', () => {
  console.log('Data ended.');
  fs.writeFile(filename, data, err => {
    if (err) {
      console.log('Error writing file: error #', err);
    }
  });
  console.log('Saved file.');
});
console.log('Started... Filename = ' + filename);
Here's the ffmpeg command I used:
ffmpeg -vcodec h264_mmal -i "rtsp://[stream url]" -vframes 1 -f image2pipe - | node pipe.js test.jpg
This generated the following output, and also produced a 175 kB file containing garbage (unreadable as a JPG, at any rate). For comparison, using ffmpeg to export directly to a .jpg file produced files around 25 kB in size.
...
Press [q] to stop, [?] for help
[h264_mmal @ 0x130d3f0] Changing output format.
Input stream #0:0 frame changed from size:1280x720 fmt:yuvj420p to size:1280x720 fmt:yuv420p
[swscaler @ 0x1450ca0] deprecated pixel format used, make sure you did set range correctly
Received data chunk via pipe.
Received data chunk via pipe.
frame= 1 fps=0.0 q=7.7 Lsize= 94kB time=00:00:00.40 bitrate=1929.0kbits/s speed=1.18x
video:94kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.000000%
Received data chunk via pipe.
Data ended.
Saved file.
You can see that the Node JS script is receiving piped data (per the "Received data chunk via pipe" messages above). However, it doesn't seem to be outputting a valid JPG file. I can't find a way to specifically request that ffmpeg output JPG format, since there is no -vcodec option for JPG. I tried using -vcodec png and outputting to a .png file, but the resulting file was about 2 MB in size and also unreadable as a PNG.
Is this a problem caused by using utf8 encoding, or am I doing something else wrong?
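For reference, here is what I think a binary-safe version of the same buffering approach would look like, if the utf8 encoding really is the problem. This is an untested sketch: it just drops the setEncoding() call and collects raw Buffer chunks instead of concatenating strings, which should avoid mangling the binary JPEG data.

// pipe.js - buffer binary stdin and write it out once the stream ends
var fs = require('fs');

var filename = process.argv[2];
var chunks = [];

// No setEncoding() call, so chunks arrive as raw Buffers
process.stdin.on('data', (chunk) => {
  console.log('Received data chunk via pipe.');
  chunks.push(chunk);
});

process.stdin.on('end', () => {
  console.log('Data ended.');
  fs.writeFile(filename, Buffer.concat(chunks), (err) => {
    if (err) {
      console.log('Error writing file:', err);
      return;
    }
    console.log('Saved file.');
  });
});

console.log('Started... Filename = ' + filename);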
Thanks for any advice.
UPDATE: OK I got it to send a single jpg image correctly. The issue was in the way Node JS was capturing the stream data. Here's a working script:
// pipe.js - capture piped binary input and write to file
var fs = require('fs');
var filename = process.argv[2];
console.log("Opening " + filename + " for binary writing...");
var wstream = fs.createWriteStream(filename);
process.stdin.on('readable', () => {
  var chunk;
  while ((chunk = process.stdin.read()) !== null) {
    wstream.write(chunk); // Write the binary data to file
    console.log("Writing chunk to file...");
  }
});
process.stdin.on('end', () => {
  // Close the file
  wstream.end();
});
However, now the problem is this: when piping the output of ffmpeg to this script, how can I tell when one JPG file ends and another one begins?
ffmpeg command:
ffmpeg -vcodec h264_mmal -i "[my rtsp stream]" -r 1 -q:v 2 -f singlejpeg - | node pipe.js test_output.jpg
The test_output.jpg file continues to grow as long as the script runs. How can I instead know when the data for one jpg is complete and another one has started? According to this, jpeg files always start with FF D8 FF and end with FF D9, so I guess I can check for this ending signature and start a new file at that point... any other suggestions?
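Here's roughly what I have in mind (an untested sketch; the script name, the prefix argument and the frame_N.jpg numbering are just for illustration): accumulate the piped bytes in a Buffer and, every time the FF D9 end-of-image marker turns up, write everything up to and including it out as one complete frame.

// split.js - split a piped JPEG stream into numbered files on the FF D9 marker
var fs = require('fs');

var prefix = process.argv[2] || 'frame';   // output name prefix, e.g. frame_0.jpg
var EOI = Buffer.from([0xFF, 0xD9]);       // JPEG end-of-image marker
var buffer = Buffer.alloc(0);
var count = 0;

process.stdin.on('data', (chunk) => {
  buffer = Buffer.concat([buffer, chunk]);

  // Every complete frame ends with FF D9; write out each one we find
  // and keep whatever follows it as the start of the next frame.
  var eoi;
  while ((eoi = buffer.indexOf(EOI)) !== -1) {
    var frame = buffer.slice(0, eoi + 2);
    buffer = buffer.slice(eoi + 2);
    fs.writeFileSync(prefix + '_' + count + '.jpg', frame);
    console.log('Wrote ' + prefix + '_' + count + '.jpg (' + frame.length + ' bytes)');
    count++;
  }
});

It would be run the same way, e.g. ... | node split.js frame. One caveat I can see: a JPEG carrying an embedded EXIF thumbnail would contain an extra FF D9 inside its APP1 segment, which would make this split too early, so a plain marker scan may not be bulletproof.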