
I'm trying to save the html source of a fetched web page into a file and I want to do this using only the CMD prompt on Windows.

I could do this with wget, but that requires using the wget program.

Is there any easy way to do this?

Padawan

6 Answers


You can use a Batch-JScript hybrid script to do this in a very easy way. JScript support is installed with every Windows version from XP onward.

@if (@CodeSection == @Batch) @then

@echo off
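rem Run the JScript section of this file with the URL in the first parameter; the output file name is taken from the last path segment of the URL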
CScript //nologo //E:JScript "%~F0" "%~1" > "%~N1.html"
goto :EOF

@end
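// JScript section: download the URL passed as the first argument and write the response to stdout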

var http = WScript.CreateObject('Msxml2.XMLHTTP.6.0');
http.open("GET", WScript.Arguments.Item(0), false);
http.send();

if (http.status == 200) {
   WScript.StdOut.Write(http.responseText);
} else {
   WScript.StdOut.WriteLine("Error: Status "+http.status+" returned on download.");
}

Save the previous code in a .BAT file, for example GetHtmlCode.bat, and execute it with the web address as the first parameter:

GetHtmlCode.bat "https://stackoverflow.com/questions/24975698/get-code-html-to-a-file-whit-cmd"
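
With this example the page is saved to get-code-html-to-a-file-whit-cmd.html, because the %~N1 expansion in the batch line derives the output file name from the last path segment of the URL.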

I borrowed the previous code from the accepted answer to this question and just made a couple of small changes.

Aacini

I'm 99% sure that there is no CMD internal command to achieve this, which means looking at other utilities installed with the OS. There are several options there.

Have a look at this Stack Overflow question: Windows batch file file download from a URL

You could possibly use BITS or VBScript. Both are built into XP.

Find more BITS examples here.
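
For example, a one-shot BITS download from the command line might look like this (the URL and the local path are placeholders; bitsadmin needs an absolute path for the destination file):

bitsadmin /transfer htmljob /download /priority normal "http://example.com/page" "C:\temp\page.html"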

If you can't use something built in, another possibility is to use AutoIt.

It's been a while since I used AutoIt, but it is quite powerful. You can write your script, compile it to a standalone .EXE, and then deploy it; no runtime is required.

It is also possible to include additional files in the compiled EXE, so you could deploy wget as part of the package.

andyb

With Xidel it's as simple as:

xidel.exe -s "<url>" -e "$raw" > output.html
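
Here -s suppresses Xidel's status output, and the -e "$raw" expression returns the retrieved page source unchanged.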
Reino

Can you use PowerShell?

iex ((new-object net.webclient).DownloadString('http://example.com/page')) | out-file -FilePath page.html
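
Note that iex (Invoke-Expression) tries to run the downloaded string as PowerShell code, which fails on most real pages (see the comments below). If the goal is only to save the page, the iex wrapper can be dropped:

(new-object net.webclient).DownloadString('http://example.com/page') | out-file -FilePath page.html
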
Darren Kopp
  • it's a good answer but doesn't work with XP. Thanks. (maybe this is not possible with just CMD) – Padawan Jul 26 '14 at 21:43
  • right, you need to run it in PowerShell, and I don't remember whether that's available on XP or not – Darren Kopp Jul 26 '14 at 22:05
  • Does not work: PS C:\Windows\System32\WindowsPowerShell\v1.0> iex ((new-object net.webclient).DownloadString('http://google.de')) | out-file -FilePath O:\tmp\page.html throws a stream of parse errors such as "An expression was expected after '('." and "Missing argument in parameter list." while iex tries to interpret the page's JavaScript. – Denny Weinberg Oct 20 '17 at 09:12
  • @DennyWeinberg seems like that is an issue with iex attempting to execute the scripts in the html rather than with this answer. – Darren Kopp Nov 21 '17 at 02:04

You can download a file from the internet using Windows PowerShell with the following:

$webClient = new-object System.Net.WebClient
$webClient.DownloadFile( $url, $path )
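
For example, with placeholder values for the two variables (DownloadFile expects the full local file name, not just a folder):

$url = 'http://example.com/page'
$path = 'C:\temp\page.html'
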
Hossam Barakat

As of the Windows 10 May 2019 update (1903), you can use curl to get the HTML from the site and store it in an HTML file like this:

curl <your link> -o <filename>.html
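
For example, assuming the page is reachable over HTTPS and may redirect (the -L flag makes curl follow redirects):

curl -L "https://www.example.com" -o page.html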

hugh19se
  • Curl is not a CMD internal command and it may not be available in some Windows installations. Maybe you should clarify this. – Ivan Mar 28 '22 at 15:25