We have some resources that contain links to external sites. To avoid dead links, we have implemented a ping routine in C# (.NET 6).
We loop through all links and issue a HEAD and a GET request with HttpClient. Most sites return 200 OK, but some return 400 Bad Request, 403 Forbidden, and so forth. Yet if we open the same link in a browser, the site works as expected.
If we get a 404, we mark the link as dead so that someone can review and update it manually. We have already added a User-Agent header to the HttpClient.
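For reference, a minimal sketch of the check routine described above (class and method names are placeholders, not our actual code):

```csharp
using System;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;

public static class LinkChecker
{
    private static readonly HttpClient Client = new()
    {
        Timeout = TimeSpan.FromSeconds(10)
    };

    static LinkChecker()
    {
        // Browser-like User-Agent; some sites reject the default client string.
        Client.DefaultRequestHeaders.UserAgent.ParseAdd(
            "Mozilla/5.0 (Windows NT 10.0; Win64; x64)");
    }

    public static async Task<HttpStatusCode> CheckAsync(Uri url)
    {
        // Try HEAD first, since it is cheap.
        using var headRequest = new HttpRequestMessage(HttpMethod.Head, url);
        using var headResponse = await Client.SendAsync(headRequest);
        if (headResponse.IsSuccessStatusCode)
            return headResponse.StatusCode;

        // Fall back to GET: some servers reject or mishandle HEAD.
        // ResponseHeadersRead avoids downloading the full body.
        using var getResponse = await Client.GetAsync(
            url, HttpCompletionOption.ResponseHeadersRead);
        return getResponse.StatusCode;
    }
}
```

A 404 from this routine is what we treat as "dead"; other non-success codes are the puzzling cases.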
How can we avoid these false bad-request/forbidden responses when checking links with HttpClient, so that only genuinely dead links are flagged?