
As a group project, I want to build a Python Selenium bot that uses geckodriver and does not get detected by Firefox. I have already written a Selenium Python script using chromedriver that goes undetected on Chrome.

In the Chrome script I added the parameter ('excludeSwitches', ['enable-automation']) using the chrome_options.add_experimental_option() function. If that single line is commented out, Chrome detects the Selenium bot and shows the notification 'Chrome is being controlled by automated test software' at the top of the browser.
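
For reference, this is roughly how I set it up on Chrome (other options and the driver path are trimmed here):

from selenium import webdriver
from selenium.webdriver.chrome.options import Options

chrome_options = Options()
# this single option hides the "controlled by automated test software" infobar
chrome_options.add_experimental_option('excludeSwitches', ['enable-automation'])
driver = webdriver.Chrome(options=chrome_options)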

I checked the geckodriver documentation and found that the Firefox options class does not have an add_experimental_option() function. The ('excludeSwitches', ['enable-automation']) parameter is what keeps the bot from being flagged by the browser. add_argument() takes only one argument, and passing ('excludeSwitches', ['enable-automation']) gives the error "add_argument() takes 2 positional arguments but 3 were given".
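
Isolated from the full script below, the failing call is just:

from selenium.webdriver.firefox.options import Options

options = Options()
# Firefox's Options has no add_experimental_option(), and add_argument()
# accepts a single string, so this line raises:
# TypeError: add_argument() takes 2 positional arguments but 3 were given
options.add_argument('excludeSwitches', ['enable-automation'])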

Can anyone help me set the equivalent of this parameter so that I can make geckodriver undetectable? My sample code is attached below.

Also, let me know if I am doing anything wrong or unnecessary.

from selenium import webdriver
from selenium.webdriver.firefox.options import Options
import time
import random

profile = webdriver.FirefoxProfile()

def delayed_input(text, query):
    # type the text one character at a time with a random pause, then submit
    for letter in text:
        query.send_keys(letter)

        ran = random.uniform(0.01,1)
        time.sleep(ran)  # sleep between 10 milliseconds and 1 second
    query.submit()

options = Options()
options.add_argument('--no-sandbox')
options.add_argument('--disable-dev-shm-usage')
options.add_argument('--user-agent=Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_3) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/13.0.5 Safari/605.1.15')
options.add_argument('--disable-plugins-discovery')
options.add_argument('referer=https://www.google.com/')
options.add_argument('excludeSwitches', ['enable-automation'])  # this is the line that raises the add_argument() error described above
options.add_argument('--disable-extensions')
options.add_argument('--profile-directory=Default')
options.add_argument('--disable-blink-features=AutomationControlled')

profile.set_preference('excludeSwitches', 'enable-automation')
profile.set_preference("dom.webdriver.enabled", False)
profile.set_preference('useAutomationExtension', False)
profile.update_preferences()

options.add_argument('--disable-blink-features')
d = webdriver.Firefox(firefox_profile=profile, firefox_options=options, executable_path='/usr/local/bin/geckodriver')

d.get('https://www.google.com/')
d.set_window_size(1920, 1080)

ran = random.uniform(0.01,1)
time.sleep(ran)  # sleep between 10 milliseconds and 1 second
text = "Manchester United"
query = d.find_element_by_name("q")
delayed_input(text, query)

  • The browser always has knowledge of being controlled by a webdriver. If it didn't "hear" the webdriver's commands, it wouldn't know to respond. – pcalkins Mar 03 '20 at 19:18
  • @pcalkins While using chromedriver, I was able to make my bot undetectable on Chrome using the parameter mentioned above. I want to replicate that on Firefox, but geckodriver does not have the add_experimental_option() function. – Ashish Nimonkar Mar 03 '20 at 19:51
  • @AshishNimonkar It's not undetectable; it just disables the [Chrome infobar (read here)](https://peter.sh/experiments/chromium-command-line-switches/#enable-automation). Firefox shows no such bar; it just shades the address bar differently. If you want your bot to be truly undetectable, that does not seem to be possible. – Dev Mar 03 '20 at 20:00
  • I'm sure it's possible for this site, especially if setting that one flag worked in Chromedriver... and any site really, but this is a moving target. You may want to just ask the site owner if you can run your bot on their site or if they have an API available. – pcalkins Mar 03 '20 at 20:43
  • @Dev So basically, you are saying that my browser will know it is being controlled by a bot. But will my bot be able to deceive the website I am trying to visit without the above-mentioned flag? – Ashish Nimonkar Mar 04 '20 at 21:36
  • @AshishNimonkar That's where CAPTCHAs come into the picture, and they are specific to websites, not browsers. CAPTCHAs exist to stop bots from running on, or scraping, their sites. – Dev Mar 05 '20 at 05:52

0 Answers