
I've seen many questions about scraping HTML with jQuery + Node.js + YQL, but none of them mention retrieving the CSS and JavaScript from the webpage.

Is there any way to get the HTML, CSS and JavaScript of an external website without using server-side techniques?

I need this to happen in code so I can use the results in a web app.

– ceptno

1 Answer


Possible duplicate of this question; your answer lies there.

You should check out jQuery. It has a rich set of AJAX functionality that gives you the power to do all of this. You can load an external page and parse its HTML content with intuitive CSS-like selectors.

An example using $.get():

$.get("anotherPage.html", {}, function(results){
  alert(results); // will show the HTML from anotherPage.html
  alert($(results).find("div.scores").html()); // show "scores" div in results
});
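Since the question also asks about CSS and JavaScript: once you have the raw HTML, you can pull out the stylesheet and script URLs it references and fetch each of those with a further $.get() call. A minimal sketch of the extraction step, where extractAssets is a made-up helper and a crude regex stands in for real HTML parsing (in the browser you could instead use $(results).find("link[rel=stylesheet]") and friends):

```javascript
// Hypothetical helper: pull stylesheet and script URLs out of raw HTML.
// The regexes only handle double-quoted attributes; they are a sketch,
// not a robust parser.
function extractAssets(html) {
  var cssLinks = [];
  var scripts = [];
  var linkRe = /<link[^>]+href="([^"]+\.css)"/g;
  var scriptRe = /<script[^>]+src="([^"]+)"/g;
  var m;
  while ((m = linkRe.exec(html)) !== null) cssLinks.push(m[1]);   // stylesheet hrefs
  while ((m = scriptRe.exec(html)) !== null) scripts.push(m[1]);  // script srcs
  return { cssLinks: cssLinks, scripts: scripts };
}
```

Each returned URL can then be fetched with another $.get() to obtain the CSS or JavaScript source itself.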

For external domains I've had to author a local PHP script that acts as a middle-man: jQuery calls the local PHP script, passing another server's URL as an argument; the PHP script gathers the data; and jQuery then reads the data back from the local script.

$.get("middleman.php", {"site":"http://www.google.com"}, function(results){
  alert(results); // middleman gives Google's HTML to jQuery
});

Note: the PHP middle-man is only needed for different domains, because the browser's same-origin policy blocks direct cross-domain AJAX requests.

– Bhushan Firake
  • Sorry for not specifying: I want to retrieve the info on an external site with only JavaScript. – ceptno Feb 12 '13 at 06:26