I'm trying to save the html source of a fetched web page into a file and I want to do this using only the CMD prompt on Windows.
I could do this with wget, but that requires using the wget program.
Is there any easy way to do this?
You may use a Batch-JScript hybrid script to do that in a very easy way. The JScript support is installed with all Windows versions from XP on.
@if (@CodeSection == @Batch) @then
@echo off
CScript //nologo //E:JScript "%~F0" "%~1" > "%~N1.html"
goto :EOF
@end
var http = WScript.CreateObject('Msxml2.XMLHTTP.6.0');
// Fetch the URL given as the first argument, synchronously
http.open("GET", WScript.Arguments.Item(0), false);
http.send();
if (http.status == 200) {
WScript.StdOut.Write(http.responseText);
} else {
// Report failures on stderr so they don't end up in the redirected output file
WScript.StdErr.WriteLine("Error: Status " + http.status + " returned on download.");
}
Save the previous code in a .BAT file, for example GetHtmlCode.bat, and execute it with the web address as the first parameter:
GetHtmlCode.bat "https://stackoverflow.com/questions/24975698/get-code-html-to-a-file-whit-cmd"
I borrowed the previous code from the accepted answer to this question and made a couple of small changes.
I'm 99% sure there is no CMD internal command to achieve this, which means looking at other utilities installed with the OS. There are several options.
Have a look at this Stack Overflow question: Windows batch file file download from a URL
You could possibly use BITS or VBScript. Both are built into Windows from XP on.
Find more BITS examples here.
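As a sketch, a BITS download from the command line might look like the following. Note that bitsadmin is deprecated in favor of the BITS PowerShell cmdlets, and the job name, URL, and target path here are placeholders, not values from the question:

```bat
rem Download a page via the Background Intelligent Transfer Service.
rem "myDownload" is an arbitrary job name; the URL and local path are examples.
rem BITS requires an absolute local path for the destination file.
bitsadmin /transfer myDownload /download /priority normal ^
    http://example.com/page.html C:\temp\page.html
```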
If you can't use something built in, another possibility is to use AutoIt.
It's been a while since I used AutoIt, but it is quite powerful. You can write your script, compile it to a standalone .EXE and then deploy it. No runtime is required.
It is also possible to include additional files in the compiled EXE, so you could deploy WGET as part of the package.
Can you use PowerShell?
(New-Object Net.WebClient).DownloadString('http://example.com/page') | Out-File -FilePath page.html
Note that you don't want iex (Invoke-Expression) here: that would execute the downloaded string as PowerShell code instead of saving it.
You can download the file from the internet using Windows PowerShell with the following:
$webClient = New-Object System.Net.WebClient
$webClient.DownloadFile($url, $path)
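For a self-contained sketch, the two variables need to be defined first; the URL and output path below are placeholder examples, not values from the question:

```powershell
# Example values -- substitute your own URL and destination path.
$url  = 'http://example.com/page'
$path = "$env:TEMP\page.html"       # DownloadFile expects a full local path

$webClient = New-Object System.Net.WebClient
$webClient.DownloadFile($url, $path)
```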
As of the Windows 10 May 2019 update (1903) you can use curl to get the HTML from the site and store it as an HTML file like this:
curl <your link> -o <filename>.html