
I've found some outdated statistics here:

In Selenium WebDriver, which is better in terms of performance: link text or CSS?

I'd be curious whether any website publishes details like this; I wonder whether the gap has closed and whether finding by text, for example, is much more performant than it used to be.

I guess it would also be useful to see a comparison between different technologies, i.e. whether Cypress handles this a bit better, or whether Capybara's find-by-text is oddly non-performant versus the others.

    The fact that different types of searches have different things they can match on is part of what leads to performance differences. If you can locate an element purely by CSS, that will generally be the fastest in modern browsers, since they're optimized for CSS processing. However, CSS has no text-content matching, so in that case things will generally devolve to XPath. If you do text matching and you're looking for the best performance, you should be scoping things to areas of the page using CSS selectors and then using text matching inside those areas. – Thomas Walpole Sep 11 '22 at 23:10
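A minimal sketch of the scoping approach described in the comment above, assuming a hypothetical container selector and link text (neither is taken from a real page): narrow the search with a fast CSS lookup first, then run the slower text match only inside that element via a relative XPath.

```python
def scoped_text_xpath(text):
    # Relative XPath: the leading "." makes it search only within the
    # element it is called on, not the whole document.
    return f".//a[contains(normalize-space(.), '{text}')]"

# Usage with a WebDriver instance (placeholder selector and text):
# container = driver.find_element(By.CSS_SELECTOR, "nav.site-header")
# link = container.find_element(By.XPATH, scoped_text_xpath("Sign in"))
```

Calling `find_element` on `container` rather than on `driver` is what restricts the XPath text match to the already-located region.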

1 Answer


To make some measurements I have used the following simple Selenium code:

import time

from selenium import webdriver
from selenium.webdriver.chrome.service import Service
from selenium.webdriver.chrome.options import Options
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.common.by import By
from selenium.webdriver.support import expected_conditions as EC


options = Options()
options.add_argument("start-maximized")


# raw string so the backslashes in the Windows path are not treated as escapes
webdriver_service = Service(r'C:\webdrivers\chromedriver.exe')
driver = webdriver.Chrome(service=webdriver_service, options=options)
url = "https://www.bbc.com/"
driver.get(url)
wait = WebDriverWait(driver, 10)
# used this line to start the following measurements when the page is loaded
wait.until(EC.visibility_of_element_located((By.CSS_SELECTOR, ".media.media--overlay.block-link")))

t0 = time.time()
e = driver.find_element(By.CSS_SELECTOR, "a#homepage-link")
print("By.CSS_SELECTOR " + str(time.time()-t0))

t0 = time.time()
e = driver.find_element(By.ID, "page")
print("By.ID " + str(time.time()-t0))

t0 = time.time()
e = driver.find_element(By.XPATH, "//a[@id='homepage-link']")
print("By.XPATH " + str(time.time()-t0))

t0 = time.time()
e = driver.find_element(By.PARTIAL_LINK_TEXT, "coffin travelling from Balmoral")
print("By.PARTIAL_LINK_TEXT " + str(time.time()-t0))

I have run it several times, the results are varying from run to run, but the typical output is as following:

By.CSS_SELECTOR 0.021941661834716797
By.ID 0.018949031829833984
By.XPATH 0.028922557830810547
By.PARTIAL_LINK_TEXT 0.6597516536712646

I.e. By.CSS_SELECTOR, By.ID and By.XPATH take relatively similar access times, while By.PARTIAL_LINK_TEXT mostly takes much more time.

Tested on Google Chrome Version 105.0.5195.102 (Official Build) (64-bit) with the latest Selenium 4 version on a Windows 10 PC with 16 GB RAM.
When tested with the latest Firefox version I saw similar values.
And again, each measurement gives different values for each run, but mostly the picture is as above.
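Since the single timings above vary from run to run, one way to firm up the numbers (a sketch, not part of the original measurement) is to repeat each lookup several times and report the median, which is less sensitive to one-off spikes than a single sample:

```python
import time
import statistics

def time_lookup(fn, repeats=20):
    """Run fn() `repeats` times and return the median elapsed seconds."""
    samples = []
    for _ in range(repeats):
        t0 = time.perf_counter()
        fn()
        samples.append(time.perf_counter() - t0)
    return statistics.median(samples)

# Usage with the driver from the answer (not run here):
# css_median = time_lookup(lambda: driver.find_element(By.CSS_SELECTOR, "a#homepage-link"))
# print("By.CSS_SELECTOR median", css_median)
```

`time.perf_counter()` is used instead of `time.time()` because it is a monotonic high-resolution clock intended for interval timing.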

    Any particular reason you're searching for different things in the searches, when they could all be for the same thing? I.e., searching for id 'homepage-link', css '#homepage-link', xpath "//*[@id='homepage-link']"? You're currently comparing apples to oranges. – Thomas Walpole Sep 11 '22 at 23:05
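Following the comment above, a hedged sketch of an apples-to-apples variant of the benchmark: every strategy targets the same `homepage-link` anchor, so only the lookup mechanism differs. The locator strings are the standard string values of Selenium's `By` constants; the partial link text value is a placeholder assumption, since the anchor's actual text is not known here.

```python
# One target element, four lookup strategies. The by-strings match
# Selenium's By.ID, By.CSS_SELECTOR, By.XPATH and By.PARTIAL_LINK_TEXT.
locators = [
    ("By.ID",                "id",                "homepage-link"),
    ("By.CSS_SELECTOR",      "css selector",      "a#homepage-link"),
    ("By.XPATH",             "xpath",             "//a[@id='homepage-link']"),
    ("By.PARTIAL_LINK_TEXT", "partial link text", "BBC Homepage"),  # placeholder text
]

# Timing loop, using the driver from the answer (not run here):
# import time
# for name, by, value in locators:
#     t0 = time.perf_counter()
#     driver.find_element(by, value)
#     print(name, time.perf_counter() - t0)
```

With all four locators resolving to the same node, any remaining difference in the printed times reflects the lookup strategy itself rather than the element being searched for.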