So!
For a fansite I run, I also run a website scraper/XML reader that pulls information from a secured part of a game's website. It works perfectly as it is, but I want to make it better and, above all, faster.
The first problem I faced was how to maintain a session that can handle a ton of requests (around 1 to 10 every 30 seconds) while staying logged in. A plain HttpWebRequest didn't really work, because the login is secured with a token that has to be submitted together with my login credentials. My solution was as follows: I placed a WebBrowser control on a Form, and when the login page has loaded (the DocumentCompleted event) I fill my credentials into the document and simply submit the form.
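In code, that part looks roughly like this (a trimmed-down sketch; the URL, element ids and form name are made up, the real ones come from the game's login page):

    using System;
    using System.Windows.Forms;

    public class LoginForm : Form
    {
        private readonly WebBrowser webBrowser1 = new WebBrowser { Dock = DockStyle.Fill };

        public LoginForm()
        {
            Controls.Add(webBrowser1);
            webBrowser1.DocumentCompleted += LoginPageLoaded;
            webBrowser1.Navigate("https://game.example.com/login"); // made-up URL
        }

        private void LoginPageLoaded(object sender, WebBrowserDocumentCompletedEventArgs e)
        {
            HtmlDocument doc = webBrowser1.Document;
            if (doc == null) return;

            // The hidden token field is already part of the page, so submitting
            // the real form sends it along with my credentials automatically.
            doc.GetElementById("username").SetAttribute("value", "myUser");
            doc.GetElementById("password").SetAttribute("value", "myPass");
            doc.Forms["loginForm"].InvokeMember("submit");

            // Only fill the form once; later DocumentCompleted events are the
            // logged-in pages.
            webBrowser1.DocumentCompleted -= LoginPageLoaded;
        }
    }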
Now I can access all the secured pages I want, but not with an HttpWebRequest created in code. However, when I placed multiple WebBrowser controls on the same form, all of them could access the secured part of the site (they seem to share the session within the process). So I placed six of them to do kind-of-parallel requests (for XML and HTML) and pull information from my account quickly.
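The pool is set up something like this (again just a sketch; the pool size and the handler body are illustrative):

    using System;
    using System.Collections.Generic;
    using System.Windows.Forms;

    public class BrowserPoolForm : Form
    {
        private readonly List<WebBrowser> pool = new List<WebBrowser>();

        public BrowserPoolForm()
        {
            // WebBrowser controls in the same process share the WinINET
            // session/cookies, so once one of them has logged in, the rest
            // are authenticated too.
            for (int i = 0; i < 6; i++)
            {
                var browser = new WebBrowser { ScriptErrorsSuppressed = true };
                browser.DocumentCompleted += PageLoaded;
                Controls.Add(browser);
                pool.Add(browser);
            }
        }

        private void PageLoaded(object sender, WebBrowserDocumentCompletedEventArgs e)
        {
            var browser = (WebBrowser)sender;
            // iframes raise DocumentCompleted too; only handle the top-level page.
            if (e.Url != browser.Url) return;

            // ... analyse browser.Document here ...
        }
    }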
This actually works like a charm: you can watch seven browsers happily browse away while I analyse each DOM document. But naturally it creates a lot of overhead, since I don't need the images and all the Flash etc. to load (or the iframes, which cause very annoying extra DocumentCompleted events). So I want to log in once and then make requests in code with HttpWebRequest, using the session/cookie information of the WebBrowser (or log in some other way).
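What I'm hoping for is something along these lines (a sketch only; it assumes the session is cookie-based and the cookies are not HttpOnly, because Document.Cookie can't see HttpOnly cookies; those would need a WinINET call like InternetGetCookieEx instead):

    using System.IO;
    using System.Net;
    using System.Windows.Forms;

    static class SessionFetcher
    {
        // Reuse the WebBrowser's logged-in session from plain code.
        public static string FetchPage(WebBrowser loggedInBrowser, string url)
        {
            var request = (HttpWebRequest)WebRequest.Create(url);

            // Copy the browser's cookie header onto the request. This only
            // works while request.CookieContainer is left null.
            request.Headers.Add("Cookie", loggedInBrowser.Document.Cookie);

            using (var response = (HttpWebResponse)request.GetResponse())
            using (var reader = new StreamReader(response.GetResponseStream()))
            {
                return reader.ReadToEnd();
            }
        }
    }

That would let the six extra browsers go away entirely, since plain requests skip the images, Flash and iframes.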
So how do I do this? Is this even possible, or should I approach it completely differently?
(P.S. I write everything in C#.)