
I'm dockerizing my Python-Selenium application, and have these three lines in my Dockerfile:

RUN wget -q -O - https://dl-ssl.google.com/linux/linux_signing_key.pub | apt-key add -
RUN sh -c 'echo "deb http://dl.google.com/linux/chrome/deb/ stable main" >> /etc/apt/sources.list.d/google.list'
RUN apt-get update -qqy --no-install-recommends && apt-get install -qqy --no-install-recommends google-chrome-stable
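
For reference, these steps are often collapsed into a single layer so the package lists added by `apt-get update` stay fresh for the install that follows. A minimal sketch, assuming `google-chrome-stable` is the package being installed (the original line is cut off after the `&&`):

```dockerfile
# Add Google's signing key and repo, then install Chrome in one layer
RUN wget -q -O - https://dl-ssl.google.com/linux/linux_signing_key.pub | apt-key add - \
    && echo "deb http://dl.google.com/linux/chrome/deb/ stable main" > /etc/apt/sources.list.d/google.list \
    && apt-get update -qqy --no-install-recommends \
    && apt-get install -qqy --no-install-recommends google-chrome-stable \
    && rm -rf /var/lib/apt/lists/*
```

Note that none of this is needed inside the app image when the browser runs in the separate `selenium/standalone-chrome` container, as in the compose setup below.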

Following this answer, I first launched a standalone chrome browser with this command:

docker run -d -p 4444:4444 selenium/standalone-chrome

Added this to docker-compose.yml:

  selenium:
    image: selenium/standalone-chrome
    ports:
    - 4444:4444
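
With Compose, the app container can reach the browser container by its service name, so a hard-coded bridge IP like `172.18.0.3` is unnecessary (and fragile, since it can change between runs). A minimal sketch of both services; the `app` service name and `SELENIUM_URL` variable are assumptions, not from the question:

```yaml
services:
  app:
    build: .
    depends_on:
      - selenium
    environment:
      # resolvable as http://selenium:4444/wd/hub from inside the app container
      - SELENIUM_URL=http://selenium:4444/wd/hub
  selenium:
    image: selenium/standalone-chrome
    shm_size: 2g          # Chrome tends to crash with the default 64 MB /dev/shm
    ports:
      - "4444:4444"
```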

And edited my scraping function to the following:

from selenium import webdriver
from selenium.webdriver.chrome.options import Options

def scrap_function(url):
    chrome_options = Options()
    chrome_options.add_argument("--headless")
    chrome_options.add_argument("--no-sandbox")
    chrome_options.add_argument("--disable-dev-shm-usage")
    # Disable image loading to speed up scraping
    chrome_options.add_experimental_option(
        "prefs", {"profile.default_content_settings": {"images": 2}}
    )
    driver = webdriver.Remote("http://172.18.0.3:4444/wd/hub", options=chrome_options)
    try:
        driver.get(url)
        return driver.page_source  # without an explicit return the function yields None
    finally:
        driver.quit()
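
One common cause of silent failures here is connecting before the Selenium container is ready, since it takes a few seconds to start. A minimal sketch of a readiness check that polls the Grid's `/status` endpoint before the `Remote` driver is created; the default `grid_url`, the timeout, and the injectable `fetch` parameter are assumptions for illustration, not part of the original code:

```python
import json
import time
from urllib.request import urlopen
from urllib.error import URLError


def wait_for_grid(grid_url="http://selenium:4444/wd/hub", timeout=30, fetch=None):
    """Poll the Selenium Grid /status endpoint until it reports ready.

    `fetch` is injectable for testing; by default it performs the HTTP call
    and returns the decoded JSON payload.
    """
    if fetch is None:
        def fetch(url):
            with urlopen(url, timeout=5) as resp:
                return json.load(resp)

    status_url = grid_url.rstrip("/") + "/status"
    deadline = time.time() + timeout
    while time.time() < deadline:
        try:
            payload = fetch(status_url)
            if payload.get("value", {}).get("ready"):
                return True
        except (URLError, OSError, ValueError):
            pass  # grid not up yet; retry
        time.sleep(1)
    return False
```

Calling `wait_for_grid()` before `webdriver.Remote(...)` and raising if it returns `False` would at least surface a connection problem instead of hanging quietly.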

I got the remote URL from the container log, and after these changes I ran docker-compose build and docker-compose up. Execution seems to reach the function correctly, but it returns no results and prints no error messages. Is the webdriver configuration okay?

rolandist_scim

0 Answers