
Hey, I am a real beginner with Node.js, so bear with me. I am trying to download a file (an image). This is the code I have:

var fs = require('fs');
var path = require('path');
var request = require('request');

function downloadFileFromURL( url, callback )
{
    var file_name = path.basename(url);

    var wstream = fs.createWriteStream(file_name);

    wstream.on('error', function (err) {
        console.log(err, url);
    });

    wstream.on( 'close', function(){

        console.log( "finished downloading: ", url, this.path );

    });

    request(url).pipe( wstream );
}

If I parse a blog feed with my app, this function downloads only about half of the images, even though I can view all of them fine from a browser. The files get created, but some of them stay at 0 bytes.

the example feed I am parsing is: http://feeds.feedburner.com/ButDoesItFloat?format=xml

I saw this question on here: Writing image to local server, which is similar, and would love to see how it's done with node-request.

Thomas Traum
  • Sorry if this wasn't clear: my question is why some files get fully downloaded with the node-request module and some don't, and whether anyone has had similar issues. In the meantime I worked a bit more on the code and replaced it with the code here: http://stackoverflow.com/questions/5294470/node-js-writing-image-to-local-server and it works fine now, not using the request module at all. Maybe this is a bug and I should rather ask on GitHub... – Thomas Traum Jul 21 '12 at 15:42

1 Answer


Take a look at http-get to achieve this. You just need to pass two parameters: first the URL, second the path where the file will be saved. Pretty simple. The callback function returns the saved file name as "result.file". Check the code below and give it a try:

var http = require('http-get');
http.get('http://www.ohmydear.com/thesite.png', '/path/to/thesite.png', function (error, result) {
    if (error) {
        console.error(error);
    } else {
        console.log('File downloaded at: ' + result.file);
    }
});
Ito