
We have some resources that contain links to external sites. To avoid dead links, we have implemented a ping routine written in C# (.NET 6).

We loop through all links and issue a HEAD and a GET request with HttpClient. Most sites return 200 OK, but some return 400 Bad Request, 403 Forbidden, and so forth. However, if we open the same link in a browser, the site works as expected.

If we get a 404, we mark the link as dead so that someone can review and update it manually. We have already added a User-Agent header to the HttpClient.

How can we avoid these Bad Request/Forbidden responses being returned to the HttpClient?
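
For reference, a minimal sketch of the routine described above. The class and method names (`LinkChecker`, `ProbeAsync`), the timeout, and the User-Agent string are assumptions, not the original code:

```csharp
using System;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;

public static class LinkChecker
{
    private static readonly HttpClient Client = new()
    {
        Timeout = TimeSpan.FromSeconds(10)
    };

    static LinkChecker()
    {
        // The custom User-Agent string mentioned in the question.
        Client.DefaultRequestHeaders.UserAgent.ParseAdd("LinkChecker/1.0");
    }

    // Returns the final status code, or null if the host was unreachable.
    public static async Task<HttpStatusCode?> ProbeAsync(string url)
    {
        // HEAD is cheap, but some servers reject it, so fall back to GET.
        foreach (var method in new[] { HttpMethod.Head, HttpMethod.Get })
        {
            try
            {
                using var request = new HttpRequestMessage(method, url);
                using var response = await Client.SendAsync(request);

                // Accept any 2xx immediately; otherwise let GET give the
                // final verdict (404 => dead, 400/403 => the odd cases).
                if (response.IsSuccessStatusCode || method == HttpMethod.Get)
                    return response.StatusCode;
            }
            catch (Exception e) when (e is HttpRequestException or TaskCanceledException)
            {
                if (method == HttpMethod.Get)
                    return null; // connection failure or timeout on the final try
            }
        }
        return null; // not reached: the GET iteration always returns
    }
}
```

With something like this, a 404 (or null) result triggers the manual review, while the 400/403 results are exactly the ones that disagree with the browser.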

  • It is possible that they inspect the User-Agent and detect that the request doesn't come from a browser. Check this question https://stackoverflow.com/questions/19715970/remote-server-403-forbidden-error-while-using-webclient, where there was a similar issue. – Tasos K. May 06 '22 at 06:29
  • We have added a user agent, but it was a custom string. I will give it a go and try a more browser-specific user agent; see the sketch below. – Bjarke May 06 '22 at 07:38
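
Following up on the comment thread, a sketch of the suggested fix: send headers that look like a real browser's instead of a custom string. The specific header values below are illustrative assumptions, not known-good values for any particular site:

```csharp
using System.Net.Http;

var client = new HttpClient();

// A browser-style User-Agent. Some sites reject requests whose UA
// does not look like a real browser.
client.DefaultRequestHeaders.UserAgent.ParseAdd(
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 " +
    "(KHTML, like Gecko) Chrome/100.0.4896.127 Safari/537.36");

// Some sites also check Accept/Accept-Language before serving
// non-browser clients.
client.DefaultRequestHeaders.Accept.ParseAdd("text/html,application/xhtml+xml");
client.DefaultRequestHeaders.AcceptLanguage.ParseAdd("en-US,en;q=0.9");
```

If a site still rejects the request with these headers, it may be using bot detection beyond header inspection, in which case header tweaks alone will not help.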

0 Answers