
I am currently working on a project where I want to manipulate a website with the help of a small program, preferably in C#. The idea is to go to Facebook, XING, etc. and copy out all messages I received. I can then write an answer without having to open the website and navigate through it. It's more of a programming practice exercise than anything useful.

Now my question: I have programmed something similar using cursor positions via VBA. As you can imagine, that's very fragile. I'd like to reference the HTML elements directly via their ID. I tried a macro add-on (iMacros), but that doesn't really meet my requirements. Do you guys have any good ideas?

Thanks ahead!

  • Welcome to SO. Please visit the [help] to see what and how to ask. HINT: Post effort and code – mplungjan Feb 26 '17 at 17:24
  • This question is very broad and therefore not a good fit for SO, and @mplungjan gives you a good resource to read, but you probably want to take a look at the APIs available at the sites you want to aggregate; meeting your requirement via HTML parsing will also be a very fragile solution. – lukkea Feb 26 '17 at 17:26
  • Possible duplicate of http://stackoverflow.com/questions/56107/what-is-the-best-way-to-parse-html-in-c – Thomas Weller Feb 26 '17 at 17:32

1 Answer


The websites you have mentioned all have some form of API; to what extent you will be able to utilise them, I do not know. (Facebook: https://developers.facebook.com/?locale=en_US)

Most modern websites will attempt to make it difficult for you to fiddle around with their products. Facebook, LinkedIn, XING, etc. all employ security technology, and parsing their source code is no fun either.
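If you do go the HTML-parsing route, a library such as HtmlAgilityPack (my suggestion, not something the answer names) is far less fragile than cursor positions, since it lets you address elements by their ID as you wanted. A minimal sketch, using a hard-coded page and a hypothetical `message-list` id:

```csharp
using HtmlAgilityPack; // NuGet package: HtmlAgilityPack

class ParseDemo
{
    static void Main()
    {
        // A tiny hard-coded page standing in for a real site's HTML.
        string html = "<html><body><div id='message-list'>" +
                      "<p class='msg'>Hello</p><p class='msg'>Hi there</p>" +
                      "</div></body></html>";

        var doc = new HtmlDocument();
        doc.LoadHtml(html);

        // Look up an element directly by its id, as the question asks.
        HtmlNode list = doc.GetElementbyId("message-list");

        // XPath query for all message paragraphs inside that element.
        foreach (HtmlNode msg in list.SelectNodes(".//p[@class='msg']"))
            System.Console.WriteLine(msg.InnerText);
    }
}
```

Bear in mind that sites like Facebook generate their markup dynamically and change IDs frequently, so this only gets you so far; the official APIs remain the more stable option.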

I suggest you play around with the following (I cannot post more links because I have a low reputation level, but generally speaking): HttpClient, WebClient, HttpWebRequest, CookieContainer, and authentication.
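To connect those pieces: an HttpClient built on an HttpClientHandler with a CookieContainer will persist session cookies across requests, which is what you need once a site requires a login. A minimal sketch (the URL is a placeholder, not a real endpoint):

```csharp
using System;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;

class FetchDemo
{
    static async Task Main()
    {
        // The CookieContainer stores any cookies the server sets
        // (e.g. a session cookie after a login POST) and sends them
        // back automatically on subsequent requests.
        var cookies = new CookieContainer();
        using var handler = new HttpClientHandler { CookieContainer = cookies };
        using var client = new HttpClient(handler);

        // Placeholder URL; substitute the page you actually want to read.
        string html = await client.GetStringAsync("https://example.com/");
        Console.WriteLine($"Fetched {html.Length} characters of HTML.");
    }
}
```

From there you would feed the returned HTML into whatever parsing approach you choose, or better, skip the scraping entirely where the site offers an API.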

Good luck.

wp78de
Aiolos