
I am developing an application that reads data from an XML feed. The feed contains a large amount of data, roughly 100 MB, so a session timeout occurs partway through reading it.

Can anyone suggest how I can avoid the timeout?

I have already tried extending the execution timeout and the maximum request length, but the issue is still not resolved:

    <httpRuntime executionTimeout="100000000" maxRequestLength="2097151"
        useFullyQualifiedRedirectUrl="false" minFreeThreads="8"
        minLocalRequestFreeThreads="4" appRequestQueueLimit="100"
        enableVersionHeader="true" />

Code to read the XML data from the URL:

    WebRequest wrGETURL = WebRequest.Create(sUrl);

    HttpWebResponse wr = (HttpWebResponse)wrGETURL.GetResponse();
    StringBuilder sb = new StringBuilder();
    byte[] buf = new byte[8192];

    if (wr.StatusCode == HttpStatusCode.OK)
    {
        using (Stream resStream = wr.GetResponseStream())
        {
            int count;
            // Read until the stream is exhausted (the original loop was
            // missing its while clause and did not dispose the stream).
            do
            {
                count = resStream.Read(buf, 0, buf.Length);
                if (count != 0)
                {
                    // Decoding chunk-by-chunk is only safe for a
                    // single-byte encoding such as ASCII.
                    sb.Append(Encoding.ASCII.GetString(buf, 0, count));
                }
            } while (count > 0);
        }
    }
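Since the buffered approach above still holds the whole 100 MB in memory, one alternative worth considering (a sketch, not from the question) is to stream-parse with `XmlReader` and raise the client-side timeouts on the request itself. The `FeedReader` class and the `"item"` element name are illustrative assumptions; substitute the feed's actual record element:

```csharp
using System;
using System.IO;
using System.Net;
using System.Threading;
using System.Xml;

class FeedReader
{
    // Counts occurrences of a record element without buffering the document.
    public static int CountItems(XmlReader reader, string elementName)
    {
        int count = 0;
        while (reader.Read())
        {
            if (reader.NodeType == XmlNodeType.Element && reader.Name == elementName)
                count++;
        }
        return count;
    }

    // Opens the feed over HTTP with generous client-side timeouts,
    // then hands the response stream straight to the parser.
    public static int CountItemsFromUrl(string url, string elementName)
    {
        var request = (HttpWebRequest)WebRequest.Create(url);
        request.Timeout = Timeout.Infinite;   // wait as long as needed for headers
        request.ReadWriteTimeout = 300000;    // 5 minutes per read; tune to taste

        using (var response = (HttpWebResponse)request.GetResponse())
        using (var stream = response.GetResponseStream())
        using (var reader = XmlReader.Create(stream))
        {
            return CountItems(reader, elementName);
        }
    }
}
```

This keeps memory flat regardless of feed size, but note that the ASP.NET `executionTimeout` still has to cover the total transfer time.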
R.D.
    What *method* is used to read XML file? Are you *uploading* a file? – KV Prajapati Sep 03 '12 at 04:58
  • @AVD: I edited the post to show the method I am using to read data from the feed. – R.D. Sep 03 '12 at 05:06
  • Take a look at SO threads [1](http://stackoverflow.com/questions/676274/what-is-the-best-way-to-parse-big-xml-in-c-sharp-code), [2](http://stackoverflow.com/questions/468948/in-c-sharp-what-is-the-best-way-to-parse-large-xml-size-of-1gb), [3](http://stackoverflow.com/questions/55828/best-practices-to-parse-xml-files),[4](http://stackoverflow.com/questions/7671958/reading-large-xml-documents-in-net) – KV Prajapati Sep 03 '12 at 05:22

2 Answers


You can prevent some of the latency by keeping a local copy. How often does the feed change? There is no use hitting the XML feed frequently if the underlying data rarely changes. A scheduled process could "freshen" the local copy of the data. If you are trying to download 100 MB each time the page is hit, well, you are experiencing first-hand the problems with that scenario. What do you do with the data once you have it? Perhaps you can run an XPath or LINQ to XML query on the data to present just the part you want. Some food for thought...
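A minimal sketch of that local-copy idea, assuming a writable cache path; `FeedCache` and the element name `item` are illustrative names, not from the answer:

```csharp
using System;
using System.IO;
using System.Net;
using System.Xml.Linq;

class FeedCache
{
    // Re-download only when the cached copy is missing or older than maxAge;
    // otherwise serve the feed from disk.
    public static XDocument Load(string url, string cachePath, TimeSpan maxAge)
    {
        bool stale = !File.Exists(cachePath) ||
                     DateTime.UtcNow - File.GetLastWriteTimeUtc(cachePath) > maxAge;
        if (stale)
        {
            using (var client = new WebClient())
                client.DownloadFile(url, cachePath); // refresh the local copy
        }
        return XDocument.Load(cachePath);
    }
}
```

Once cached, a LINQ to XML query such as `doc.Descendants("item").Take(20)` presents just the slice you need instead of all 100 MB per page hit.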

DaveB

I don't know your exact requirements, but as @DaveB said - if you have to pull 100MB then you have to wait for 100MB.

Is the feed to be read by a user? If so, you could pull the XML in parts using AJAX: get 5 MB worth and display it, then get the next 5 MB, and so on. Or perhaps a Twitter-style AJAX call triggered by the scroll bar (if that's how it's done) would work.

If the feed is to be used in some processing capacity and not read by a user, then don't tie up a request thread: have a WCF service or a Web API retrieve the 100 MB in the background.
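The background-retrieval idea could look something like this sketch; `BackgroundFeed` and its members are illustrative names (assuming .NET 4.5's `Task.Run`), not an API from the question:

```csharp
using System;
using System.Net;
using System.Threading.Tasks;

class BackgroundFeed
{
    // Latest snapshot of the feed; pages read this instead of
    // downloading 100 MB on every hit.
    private static volatile string _latest = "";

    public static string Latest { get { return _latest; } }

    // Kick off the long download on a background thread so no
    // request thread is blocked waiting on the 100 MB transfer.
    public static Task Refresh(string url)
    {
        return Task.Run(() =>
        {
            using (var client = new WebClient())
                _latest = client.DownloadString(url); // the long pull happens here
        });
    }
}
```

A scheduled job (or a timer at application start) would call `Refresh` periodically, so the timeout-prone download never happens inside a page request.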

From your post it's hard to tell the requirements, but perhaps these ideas might help?

Neil Thompson