So I have a question: how does one get the list of files a webpage loads, along with their URLs? For example, google.com.
We go to google.com, open Firebug (or the developer tools in Firefox/Chrome), and switch to the "Network" tab. There we can see the location of every file the page requests, along with each file's extension.
How do I do this in Python?
For URL work I usually reach for urllib, mechanize, or selenium, but none of them seem to support this directly, or at least I don't know which calls would do it.
I'm using Python 2.7 on Linux. Any help or answers would be awesome; thanks to anyone attempting to answer this.
Edit: I don't know how the back-end servers generate these requests, but Firebug shows the information in its "Net" (or "Network") panel. I'm wondering whether the same thing could be implemented in Python somehow.
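One partial approach I've been sketching: Firebug's Network panel shows requests the browser actually makes, and plain Python can at least approximate the static part of that by parsing the page's HTML for `src`/`href` attributes. This is only a sketch using the standard library (the class name and example HTML are my own), and it will miss requests triggered by JavaScript:

```python
try:  # Python 2
    from HTMLParser import HTMLParser
    from urlparse import urljoin
except ImportError:  # Python 3
    from html.parser import HTMLParser
    from urllib.parse import urljoin


class ResourceLister(HTMLParser):
    """Collect absolute URLs of resources a page references via src/href."""

    def __init__(self, base_url):
        HTMLParser.__init__(self)
        self.base_url = base_url
        self.resources = []

    def handle_starttag(self, tag, attrs):
        # Any tag with a src or href points at another file the
        # browser would fetch (scripts, stylesheets, images, ...).
        for name, value in attrs:
            if name in ('src', 'href') and value:
                self.resources.append(urljoin(self.base_url, value))


# Example page instead of a live fetch; in practice the HTML would
# come from urllib2.urlopen(url).read() (or requests).
html = ('<html><head><link href="/style.css">'
        '<script src="/app.js"></script></head>'
        '<body><img src="logo.png"></body></html>')

parser = ResourceLister('http://example.com/')
parser.feed(html)
print(parser.resources)
# -> ['http://example.com/style.css', 'http://example.com/app.js',
#     'http://example.com/logo.png']
```

For the dynamic requests (the ones the back-end/JavaScript generates, which is what the Net panel really captures), parsing HTML is not enough; as far as I can tell you'd need to drive a real browser with selenium and/or route it through a capturing proxy such as browsermob-proxy to record the traffic.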