I have a file named 'images.txt' which contains URLs to images, like this:
http://www.foobar.com/assets/foobar/abc.jpg
http://www.foobar.com/assets/foobar/xyz.jpg
http://www.foobar.com/assets/foobar/aaa.jpg
http://www.foobar.com/assets/foobar/bbb.jpg
http://www.foobar.com/assets/foobar/rrr.jpg
I am trying to iterate through the file and download the images one by one, but for some reason only the last image, rrr.jpg, gets downloaded, and it is 0 KB, which is incorrect.
I am using the Node.js code below.
var wget = require('node-wget');
var fs = require('fs');

fs.readFile(__dirname + '\\images.txt', 'utf8', function (err, data) {
    if (err) {
        return console.log(err);
    } else {
        wget(data);
    }
});
I have both Node.js and wget installed and working without any issues.
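My guess is that readFile passes the entire file contents to wget as a single multi-line string, so it never gets one valid URL at a time. This is only a rough, untested sketch of what I mean by splitting the data into lines first (it assumes node-wget is happy with a plain URL string per call, which it seems to be):

var wget = require('node-wget');
var fs = require('fs');

fs.readFile(__dirname + '\\images.txt', 'utf8', function (err, data) {
    if (err) {
        return console.log(err);
    }
    // one URL per line; skip any blank lines
    data.split(/\r?\n/).filter(Boolean).forEach(function (url) {
        wget(url);
    });
});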
UPDATE
I have tried this script:
var wget = require('node-wget');

var lineReader = require('readline').createInterface({
    input: require('fs').createReadStream(__dirname + '\\images.txt')
});

lineReader.on('line', function (line) {
    wget(line);
});
It downloads all the images, but they seem to be corrupted: none of them can be opened or deleted. Even though Node.js has finished running the script, it seems something was opened by Node.js but never closed.
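For what it's worth, the next thing I was planning to try is skipping node-wget and piping each response into a write stream myself, so I can see exactly when every file has finished writing and been closed. This is only a rough sketch using Node's built-in http, fs, and path modules (deriving the local file name from the end of the URL is my own assumption):

var http = require('http');
var fs = require('fs');
var path = require('path');
var readline = require('readline');

var lineReader = readline.createInterface({
    input: fs.createReadStream(__dirname + '\\images.txt')
});

lineReader.on('line', function (url) {
    url = url.trim();
    if (!url) return; // skip blank lines

    // local file name taken from the last part of the URL (my assumption)
    var fileName = path.basename(url);
    var file = fs.createWriteStream(path.join(__dirname, fileName));

    http.get(url, function (response) {
        response.pipe(file);
        file.on('finish', function () {
            file.close(); // make sure the file handle is released
            console.log('Done: ' + fileName);
        });
    }).on('error', function (err) {
        fs.unlink(file.path, function () {}); // remove the partial file
        console.log(err.message);
    });
});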