13

Is it possible to fully download a website, or view all of its code? For example, I know you can view the page source in a browser, but is there a way to download all of a website's code (HTML, CSS, and JavaScript), and then either run it on my own server or modify it and run that?

user3678528
Ryan Brienza
  • If it's running in your browser then it *has* been downloaded. – nnnnnn Sep 01 '16 at 02:13
  • Possible duplicate of [download webpage and dependencies, including css images](http://stackoverflow.com/questions/1581551/download-webpage-and-dependencies-including-css-images) – Makyen Sep 01 '16 at 02:33
  • A [Google search of your question's title](https://www.google.com/search?q=download+a+websites+entire+code%2C+HTML%2C+CSS+and+JavaScript+files) provides multiple options. – Makyen Sep 01 '16 at 02:34
  • Just use Ctrl+S; then you can grab this page from the website. – Horken Sep 01 '16 at 03:32

5 Answers

14

Hit Ctrl+S and save the page as an HTML file (not MHTML). Then, in the <head> tag, add a <base href="http://downloaded_site's_address.com"> tag. For this webpage, for example, it would be <base href="http://stackoverflow.com">.

This makes sure that all relative links point back to where they're supposed to instead of to the folder you saved the HTML file in, so all of the resources (CSS, images, JavaScript, etc.) load correctly instead of leaving you with just HTML. See MDN for more details on the <base> tag.
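For example, the saved file's <head> might end up looking like this (a minimal sketch; the stylesheet path is hypothetical, and the href is whatever site you saved the page from):

```html
<!DOCTYPE html>
<html>
<head>
    <!-- Added by hand after saving: relative URLs now resolve against
         the original site instead of your local folder -->
    <base href="http://stackoverflow.com">
    <meta charset="utf-8">
    <title>Saved page</title>
    <!-- This relative link now loads from stackoverflow.com/assets/site.css -->
    <link rel="stylesheet" href="/assets/site.css">
</head>
<body>
    <!-- saved page content -->
</body>
</html>
```

Note that <base> only affects URLs that appear after it, so putting it first in <head> is safest.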

Michael Kolber
7

The HTML, CSS, and JavaScript are sent to your computer when you request them over HTTP (for instance, when you enter a URL in your browser); therefore, you already have those parts and can replicate them on your own PC or server. But if the website has server-side code (database access, some type of authentication, etc.), you will not have access to it, and therefore won't be able to replicate that on your own PC/server.
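You can see exactly what the server hands over by fetching a page outside the browser. A minimal sketch using curl (example.com and the file paths are stand-ins):

```sh
# Fetch the page the way a browser would; what comes back is the generated
# HTML, never the PHP/Python/etc. source that produced it on the server.
curl -sL https://example.com/ -o page.html

# Referenced stylesheets and scripts can be fetched the same way
curl -sL https://example.com/css/main.css -o main.css
curl -sL https://example.com/js/app.js -o app.js
```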

Renan Ben Moshe
5

Sure. There are tools/scrapers for this, such as SurfOffline and A1 Website Download. I've used both. They'll allow you to scrape a URL for all its files, including html/css, etc. Tools like this were invented to view websites while offline, hence the names.

However, keep in mind that these can only download front-end files (what the browser receives), so they can't download back-end scripts, like PHP files, etc.
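If you prefer the command line, wget (not one of the tools above, but freely available on most platforms) can do the same kind of mirroring. A minimal sketch, with example.com as a stand-in:

```sh
# Mirror a site for offline use:
#   --mirror            recursive download with timestamping
#   --convert-links     rewrite links so the local copy works offline
#   --adjust-extension  save files with matching extensions (.html, .css)
#   --page-requisites   also grab the images, CSS, and JS each page needs
#   --no-parent         don't climb above the starting directory
wget --mirror --convert-links --adjust-extension --page-requisites --no-parent https://example.com/
```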

5

You can use the HTTrack tool to grab a website's entire content: all of the HTML, CSS, JavaScript, and images.

You can download HTTrack from [httrack.com](https://www.httrack.com/).
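HTTrack also ships a command-line version alongside the GUI. A minimal sketch of the command-line form (example.com is a stand-in):

```sh
# Mirror example.com into ./example-mirror:
#   -O                  output directory
#   "+*.example.com/*"  filter: only follow links that stay on this domain
#   -v                  verbose output
httrack "https://www.example.com/" -O "./example-mirror" "+*.example.com/*" -v
```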

Muhammad Ibnuh
  • Just tried HTTrack, it sucks: it didn't download all files of the main domain despite there being many links pointing to the missing files. – Sebastian Nielsen Nov 20 '20 at 22:28
3

In Chrome, go to File -> Save Page As... and choose "Webpage, Complete".

That will download the full contents of that page, including its images, CSS, and scripts (though only that one page, not the whole site).

nixkuroi