I'm currently building a JavaScript Chrome extension, and to do so, I need to scrape data from some sites.
Based on this SO question, I found that I could achieve that using request with Browserify.
I installed both using npm and wrote a small Browserify script to generate my bundle.js file (because, for permissions reasons, running terminal commands is not working for me), so I can use Node.js require's in the client, i.e. my browser.
Ok, so I finally managed to create the bundle.js file and tried to run it on my local server, but it keeps giving me a CORS error and doesn't return the desired response:
Fetch API cannot load https://somesite/index.html. No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'http://localhost:8080' is therefore not allowed access. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled.
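In case it's relevant: from what I've read, a Chrome extension can be allowed to make cross-origin requests by declaring host permissions in manifest.json. Mine looks roughly like this sketch (the extension name and site URL are placeholders, not my real values):

```json
{
  "manifest_version": 2,
  "name": "my-scraper-extension",
  "version": "0.1",
  "permissions": [
    "https://somesite/*"
  ],
  "background": {
    "scripts": ["bundle.js"]
  }
}
```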
One strange thing is that if I run the "unbundled" file directly from the terminal using node:
$ node myFileWithRequires.js
it works as intended and returns the scraped data.
What am I doing wrong? How can I scrape data in the client using request and browserify?
CODE:
myBrowserifySnippet.js
// Bundle myFileWithRequires.js and its require()'d dependencies into bundle.js
var browserify = require('browserify');
var fs = require('fs');

var b = browserify();
b.add('myFileWithRequires.js');
b.bundle().pipe(fs.createWriteStream('bundle.js'));
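For completeness, this is roughly how the generated bundle ends up running in the browser (a simplified sketch; my actual page has more in it):

```html
<!-- Load the Browserify bundle so the require()'d code runs in the browser -->
<script src="bundle.js"></script>
```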
myFileWithRequires.js
var request = require('request');
request('http://www.google.com', function (error, response, body) {
console.log('error:', error); // Print the error if one occurred
console.log('statusCode:', response && response.statusCode); // Print the response status code if a response was received
console.log('body:', body); // Print the HTML for the Google homepage.
});