I want to create a Tampermonkey script that is registered on one page (call it `A`). From this page (it is an overview page), it extracts a series of links (say `[B, C, D]`). This is working so far.
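The extraction part currently looks roughly like this (a minimal sketch; the `@match` URL, the selector, and the `out` name are placeholders for whatever the real overview page uses):

```js
// ==UserScript==
// @name   A overview collector (sketch)
// @match  https://example.com/overview*
// @grant  GM_setClipboard
// ==/UserScript==

(function () {
    'use strict';

    // Collect the detail links (B, C, D) from the overview page A.
    // Placeholder selector; the real page uses something else.
    const links = Array.from(document.querySelectorAll('a.detail-link'))
        .map(a => a.href);

    // The parsed results are supposed to end up here eventually.
    const out = [];

    console.log('Found links:', links);
})();
```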
Now, I want to do the following:
1. Navigate to location `B`.
2. Wait for the DOM to become ready, so I can extract further information.
3. Parse some information from the page and store it in some object/array, say `out`.
4. Repeat steps 1 through 3 with the URLs `C` and `D`.
5. Go back to address `A`.
6. Copy the content of `out` to the clipboard (see the small sketch after this list).
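For step 6, I assume Tampermonkey's `GM_setClipboard` will do once `out` has been filled (this requires `@grant GM_setClipboard` in the script header):

```js
// Step 6: serialize the collected results and put them on the clipboard.
// Needs @grant GM_setClipboard in the userscript header.
GM_setClipboard(JSON.stringify(out, null, 2));
```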
Task 1 I can achieve with `window.open` or `window.location`.
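Roughly, either of these (a minimal sketch; `links` is the array of URLs collected on `A`):

```js
// Navigate the current tab to B ...
window.location.href = links[0];

// ... or open B in a new tab/window instead.
const detailWindow = window.open(links[0], '_blank');
```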
But I am currently failing at steps 2 and 3.
Is this even possible? I am unsure whether navigating away to another page will terminate and unload the currently running script.
Can you point me in the right direction to solve this?
If you have any better ideas, I am willing to hear them. The reason I am using the browser with Tampermonkey is that the pages use some sort of CSRF protection that prevents me from using e.g. curl to extract the relevant data.
I have seen this answer. As far as I understand it, that approach starts a new script instance on each page load, and I would have to pass all information along manually via URL parameters. It might be doable (unless the server messes with the params), but it seems like quite some effort. Is there a simpler solution?
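For reference, this is roughly how I picture that URL-parameter hand-off working on each detail page (a rough sketch with hypothetical names; `parsePage` and `nextUrl` stand in for whatever actually extracts the data and picks the next target):

```js
// Runs on B, C and D. Read the data accumulated so far from a query
// parameter (hypothetical name "tmdata"), add this page's result,
// then hop to the next URL carrying everything along.
const params  = new URLSearchParams(window.location.search);
const carried = JSON.parse(params.get('tmdata') || '[]');

carried.push(parsePage(document));   // hypothetical per-page parser

const target = nextUrl(carried);     // hypothetical: next detail page, or back to A when done
window.location.href =
    target + (target.includes('?') ? '&' : '?') +
    'tmdata=' + encodeURIComponent(JSON.stringify(carried));
```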