I have trimmed a large file down to just the URLs I want, and I want to run them through PowerShell to check each one's status code and report it. I am new to PowerShell, so I am having a few issues with my code. The file could also contain duplicate URLs.
I have this working when I hard-code a single URL. I have tried a few variations with ForEach-Object, but I feel like I am close.
$urlArray = Import-Csv -Path "Scripts\test.csv" | Select-Object -ExpandProperty urls

Function Get-WebStatus($url) {
    foreach ($url in $urlArray) {
        # First we create the request.
        $HTTP_Request = [System.Net.WebRequest]::Create($url)
        # Then we get the response from the site.
        $HTTP_Response = $HTTP_Request.GetResponse()
        # We then get the HTTP code as an integer.
        $HTTP_Status = [int]$HTTP_Response.StatusCode
        If ($HTTP_Status -eq 200) {
            Write-Host "Site is Ok!"
        } Else {
            Write-Host "$url $HTTP_Status"
        }
    }
}
I just need a list of the sites that don't work; ideally it would be saved to its own .txt file.
Example output for a failing site: google.com 404
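To show the end result I'm after, here is a rough sketch of what I think I need, though I'm not sure it's the right approach. I'm assuming the URLs in the CSV include the scheme (https://...), and "failed_sites.txt" is just a name I picked. Since Invoke-WebRequest throws an error for 4xx/5xx responses, I think the failures have to be collected in a catch block:

# Read the URL column and drop duplicates.
$urlArray = Import-Csv -Path "Scripts\test.csv" |
    Select-Object -ExpandProperty urls |
    Sort-Object -Unique

$failedSites = foreach ($url in $urlArray) {
    try {
        # Invoke-WebRequest throws for 4xx/5xx, so reaching the next line means the site responded OK.
        $response = Invoke-WebRequest -Uri $url -UseBasicParsing -TimeoutSec 10 -ErrorAction Stop
        Write-Host "$url is OK ($($response.StatusCode))"
    }
    catch {
        # Pull the status code out of the error if there is one; otherwise keep the exception message.
        $status = $_.Exception.Response.StatusCode.value__
        if (-not $status) { $status = $_.Exception.Message }
        Write-Host "$url failed: $status"
        "$url $status"    # this string ends up in $failedSites
    }
}

# Save only the failing sites to their own text file.
$failedSites | Set-Content -Path "Scripts\failed_sites.txt"

Is wrapping Invoke-WebRequest in try/catch like this the right way to collect the failing sites, or is there a cleaner way to do it?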