
I want to check whether the URL of a large file exists. I'm using the code below, but it is too slow:

public static bool TryGet(string url)
{
    try
    {
        GetHttpResponseHeaders(url);
        return true;
    }
    catch (WebException)
    {
    }

    return false;
}

public static Dictionary<string, string> GetHttpResponseHeaders(string url)
{
    Dictionary<string, string> headers = new Dictionary<string, string>();
    WebRequest webRequest = WebRequest.Create(url);
    using (WebResponse webResponse = webRequest.GetResponse())
    {
        foreach (string header in webResponse.Headers)
        {
            headers.Add(header, webResponse.Headers[header]);
        }
    }

    return headers;
}
Endy Tjahjono
Jader Dias
  • Do you really have to loop through all the headers? – DOK Jun 04 '11 at 15:52
  • 5
    @DOK Nope, but I doubt the headers are responsible this performance hit – Jader Dias Jun 04 '11 at 15:53
  • This may not be the problem either, but I seen in [MSDN}(http://msdn.microsoft.com/en-us/library/system.net.httpwebresponse.aspx) that You must call either the Stream.Close or the HttpWebResponse.Close method to close the response and release the connection for reuse. – DOK Jun 04 '11 at 15:56
  • 4
    @DOK I believe the `using` directive already does that for me. – Jader Dias Jun 04 '11 at 15:59

1 Answer


You need to set:

webRequest.Method = "HEAD";

This way the server responds with the header information only (no content), which is much faster for a large file. It is also useful for checking whether the server supports certain features (e.g. compressed transfer).
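For example, a minimal sketch of the question's `TryGet` rewritten to use a HEAD request (this keeps the `WebRequest` API from the question; the status-code check is an addition, since `GetResponse` already throws `WebException` for error statuses):

```csharp
using System;
using System.Net;

public static class UrlChecker
{
    public static bool TryGet(string url)
    {
        try
        {
            var request = (HttpWebRequest)WebRequest.Create(url);
            request.Method = "HEAD"; // ask for headers only; the body is not transferred

            using (var response = (HttpWebResponse)request.GetResponse())
            {
                // GetResponse throws WebException for 4xx/5xx, so reaching
                // here normally means success; the explicit check is belt-and-braces.
                return response.StatusCode == HttpStatusCode.OK;
            }
        }
        catch (WebException)
        {
            return false;
        }
    }
}
```

Note that some servers reject or mishandle HEAD; if you hit one, a fallback is a GET whose response you close immediately without reading the body.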

Teoman Soygul