I'm finishing (QA testing) a web parser built in C# that extracts specific data from a web site loaded into a WebBrowser control in a WFA (Windows Forms Application) program.
The weird behavior happens when I kill the internet connection... The program is designed to navigate recursively through the site, and at each step it waits for the WebBrowser's DocumentCompleted event (a WebBrowserDocumentCompletedEventHandler) to be raised. Besides that, there is a Forms Timer set, and if the handler is not triggered within a specific interval, the entire procedure is reloaded.
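Roughly, the pattern looks like this (a simplified sketch with placeholder names such as StartStep and startUrl, not my actual code):

    using System;
    using System.Windows.Forms;

    public class ParserForm : Form
    {
        private readonly WebBrowser browser = new WebBrowser();
        private readonly Timer watchdogTimer = new Timer();           // System.Windows.Forms.Timer
        private const string startUrl = "http://example.com/start";   // placeholder for the real entry page

        public ParserForm()
        {
            browser.Dock = DockStyle.Fill;
            Controls.Add(browser);
            browser.DocumentCompleted += Browser_DocumentCompleted;

            watchdogTimer.Interval = 30000;                // reload if no DocumentCompleted within 30 s
            watchdogTimer.Tick += WatchdogTimer_Tick;
        }

        private void StartStep(string url)
        {
            watchdogTimer.Stop();
            watchdogTimer.Start();                         // arm the watchdog for this step
            browser.Navigate(url);
        }

        private void Browser_DocumentCompleted(object sender, WebBrowserDocumentCompletedEventArgs e)
        {
            watchdogTimer.Stop();                          // the page arrived in time
            // ... parse browser.Document and call StartStep() for the next page ...
        }

        private void WatchdogTimer_Tick(object sender, EventArgs e)
        {
            watchdogTimer.Stop();
            // ... DocumentCompleted never fired in time: call the reload routine (sketched further below) ...
        }
    }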
Everything works fine even if I manually prevent the handler from triggering: as I said, the timer kicks in, restarts the operation, and retries the next value successfully.
When I shut down the internet connection manually while the procedure is running, I can see the page showing the Internet Explorer message "This page can't be displayed" (for some reason DocumentCompleted is not triggered). Then I immediately reconnect the internet and wait for the timer to kick in. As expected, it fires the reload function, but this time everything goes wild! The functions are being fired out of order and it looks like there are 100 threads running at the same time; total chaos.
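Continuing the sketch above, the reload that the timer triggers is conceptually something like this (again a simplified placeholder, not the real method):

    private void ReloadProcedure()
    {
        watchdogTimer.Stop();
        browser.Stop();                    // cancel whatever the control is still trying to load
        // ... reset the parsing state here ...
        StartStep(startUrl);               // start over from the entry page
    }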
I know it's not easy to answer this question without having experienced it and without seeing the code, but if I pasted the entire code it would simply be too long (it spans 5 different classes), and I really can't see where the problem is... I'll try to simplify the question:
- Why doesn't the DocumentCompleted handler fire when the connection is lost?
- Has anyone experienced an application going wild only after the WebBrowser control loses its connection?
Thanks