from selenium import webdriver
import requests


driver = webdriver.Chrome()

#login to website using selenium and get cookies

cookievar = driver.get_cookies()


#send requests using cookies scraped from selenium webdriver

r = requests.post(url, data=formData, headers=headers0, proxies=proxies, verify=False)
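
For reference, `driver.get_cookies()` returns a list of dicts (each with `name`, `value`, `domain`, etc.), so they have to be loaded into `requests` before they are usable. A minimal sketch of that conversion (the cookie values here are made-up examples):

```python
import requests

def session_from_selenium(selenium_cookies):
    """Build a requests.Session from the list of cookie dicts
    returned by driver.get_cookies()."""
    s = requests.Session()
    for c in selenium_cookies:
        # copy each selenium cookie into the session's cookie jar
        s.cookies.set(c["name"], c["value"], domain=c.get("domain"))
    return s
```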

My code works fine for one account, but this script can only handle one account at a time. What I am trying to do is run 3-4 accounts at the same time using the webdriver and requests.

Can anyone help? Suggestions are welcome. Thank you.

  • You can look at [this](https://docs.python.org/3/library/multiprocessing.html); if you only use Selenium to get the cookies, I think it is a great fit. – KC. Oct 07 '18 at 03:25
  • Is it possible to run multiple processes and get different account cookies using the same webdriver object? – elrich bachman Oct 07 '18 at 09:03
  • No, you only have one webdriver, which cannot do the job in parallel, and opening multiple webdrivers is not a good choice. But multiple `requests.Session` objects work well. – KC. Oct 07 '18 at 09:25
  • Then how can I use one webdriver to log in to different accounts? At the moment I am only able to use 1 webdriver = 1 account. – elrich bachman Oct 07 '18 at 10:19
  • Clear the cookies and log in again ([how to clear cookies](https://stackoverflow.com/questions/46529761/python-selenium-clearing-cache-and-cookies)); then you need a dict to save the different accounts' cookies. – KC. Oct 07 '18 at 10:29

1 Answer


This is what I suggested in the comments, but it is just a demo. Apart from multiprocessing and threading, you can also try concurrent.futures.

from selenium import webdriver
import requests
from multiprocessing import Pool

driver = webdriver.Chrome()
clist = {}  # dict to save each account's cookies

# log in to the website using selenium and collect cookies per account
def relogin(user, pwd):
    ...  # perform the login steps for this account
    cookievar = driver.get_cookies()
    clist[user] = cookievar
    ...
    driver.delete_all_cookies()  # reset so the next account gets a fresh session

userlist = [("userA", "pwdA"), ("userB", "pwdB")]

for user, pwd in userlist:
    relogin(user, pwd)

driver.close()

# send requests using the cookies scraped from the selenium webdriver
def post_data(url, formData, proxies, headers=None):
    r = requests.post(url, data=formData, headers=headers, proxies=proxies, verify=False)
    ...

def start(number=len(userlist)):
    p = Pool(number)
    # starmap_async unpacks each tuple into post_data's arguments
    result = p.starmap_async(post_data, [(url, formData, proxies),
                                         (url2, formData2, proxies2, headers2)])
    print(result.get())

if __name__ == '__main__':
    start()
KC.
  • Using this, will it submit 2 requests at the same time, or does it wait for one request to complete before executing the next? I want it to submit multiple requests at the same time. Btw, thanks for your answer. – elrich bachman Oct 08 '18 at 08:56
  • It submits 2 requests at the same time, but you can only get the results after all of them have finished. So I suggest adding a `Queue` to collect results, or using `ProcessPoolExecutor` from `concurrent.futures`. @elrichbachman – KC. Oct 08 '18 at 09:01
  • Can you recommend a tutorial or guide about this? I want to be able to use 100 accounts and send 100 requests at the same time. Is that possible? – elrich bachman Oct 08 '18 at 09:28
  • For 100 requests I prefer `from concurrent.futures import ThreadPoolExecutor`, because it can send them at the same time and get each result as it completes (100 processes would waste much more memory than threads). But I think you need to build up several webdrivers to collect the cookies; multiple threads or processes can both do this. @elrichbachman – KC. Oct 08 '18 at 09:37
  • Can I add you on Skype, bro, please? – elrich bachman Oct 08 '18 at 09:40
  • As my name on SO, I think there would not be a second one. – KC. Oct 08 '18 at 10:07
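
The `ThreadPoolExecutor` approach suggested in these comments can be sketched as below. The cookie store, URL, and form data are placeholders standing in for the values collected earlier; the point of the pattern is one `requests.Session` per account, one submitted job per account, and `as_completed` to handle each result as soon as it finishes:

```python
import requests
from concurrent.futures import ThreadPoolExecutor, as_completed

# placeholder cookie store: account name -> list of selenium cookie dicts
clist = {
    "userA": [{"name": "sid", "value": "aaa"}],
    "userB": [{"name": "sid", "value": "bbb"}],
}

def session_for_account(cookies):
    """Build a requests.Session preloaded with one account's cookie dicts."""
    s = requests.Session()
    for c in cookies:
        s.cookies.set(c["name"], c["value"])
    return s

def post_for_account(user, cookies, url, formData):
    # each account posts through its own session, so cookies never mix
    s = session_for_account(cookies)
    r = s.post(url, data=formData, verify=False)
    return user, r.status_code

def run_all(url, formData, max_workers=100):
    results = {}
    with ThreadPoolExecutor(max_workers=max_workers) as ex:
        futures = [ex.submit(post_for_account, u, ck, url, formData)
                   for u, ck in clist.items()]
        for fut in as_completed(futures):  # results arrive as each finishes
            user, status = fut.result()
            results[user] = status
    return results
```

Threads work well here because the work is I/O-bound (waiting on HTTP responses), so 100 threads cost far less memory than 100 processes.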