I've been trying to scrape some raw XML data from an internal company site (URL excluded for security purposes). I am currently using Selenium and BeautifulSoup to do so, but am open to any other options. When accessing the site manually, I am prompted with a JavaScript-style browser alert for a username and password (see picture). My attempt to pass the credentials automatically is below (it does not get past authentication):
from selenium import webdriver
from bs4 import BeautifulSoup

def main(username, password):
    # Gets the specified list of direct reports, embedding the
    # credentials in the URL (http://username:password@host form).
    url = f"http://{username}:{password}@myURL.com"
    driver = webdriver.Chrome()
    driver.get(url)
    html = driver.page_source
    soup = BeautifulSoup(html, "lxml")
    # parsing logic follows ...
However, when the script runs I still have to manually enter the username and password in the browser window controlled by chromedriver; after that, the rest of the program runs as expected.
Is there a way to avoid this manual entry? I've also tried solutions around driver.alert and sending the keys and credentials to the browser, to no avail. (I know this may be difficult to reproduce because the site is not accessible outside of the network; any insight is appreciated!)
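Since I'm open to other options, one direction I'm considering is skipping the prompt entirely by injecting an Authorization header through the Chrome DevTools Protocol, on the assumption that the prompt is ordinary HTTP Basic auth. This is only a sketch (the function names are mine, and it needs Selenium 4+ with chromedriver):

```python
import base64

def basic_auth_header(username: str, password: str) -> dict:
    # Build the HTTP Basic Authorization header that Chrome would
    # normally derive from the user:password@host URL form.
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    return {"Authorization": f"Basic {token}"}

def open_with_auth(driver, url: str, username: str, password: str):
    # Inject the header via the Chrome DevTools Protocol so the
    # credential prompt should never appear (untested on my site).
    driver.execute_cdp_cmd("Network.enable", {})
    driver.execute_cdp_cmd(
        "Network.setExtraHTTPHeaders",
        {"headers": basic_auth_header(username, password)},
    )
    driver.get(url)
```

The idea is that every request the driver makes then carries the credentials, regardless of how Chrome treats credentials embedded in the URL.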
Edit: I should mention this method was working a couple of weeks ago, but following a Chrome update it no longer does.
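Since the data is raw XML and I don't strictly need a browser, I've also been looking at fetching it without Selenium at all, using only the standard library. Again just a sketch, assuming the prompt is HTTP Basic auth and the XML is served directly rather than rendered by JavaScript (myURL.com is the placeholder from above):

```python
import urllib.request

def make_opener(url: str, username: str, password: str):
    # Register the credentials for any realm at this URL so urllib
    # answers the Basic-auth challenge automatically.
    mgr = urllib.request.HTTPPasswordMgrWithDefaultRealm()
    mgr.add_password(None, url, username, password)
    handler = urllib.request.HTTPBasicAuthHandler(mgr)
    return urllib.request.build_opener(handler)

def fetch_xml(url: str, username: str, password: str) -> str:
    # Fetch the page body as text; no JavaScript runs this way.
    opener = make_opener(url, username, password)
    with opener.open(url) as resp:
        charset = resp.headers.get_content_charset() or "utf-8"
        return resp.read().decode(charset)
```

If this works, the result could still be handed to BeautifulSoup for the same parsing logic as before.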