
I currently have code that uses multi-threading and urllib2 to fuzz a web server (GET and POST), but the problem is that every thread waits for the response to its request before sending the next one.

import urllib
import urllib2
from threading import Thread

def open_website(opener):
    # POST the same form data to the target in an endless loop.
    formdata = {"udsan": "fdsf",
                "width": "1200",
                "height": "1920",
                "param": "32",
                "rememberUn": "on"}
    data_encoded = urllib.urlencode(formdata)
    while True:
        # opener.open() blocks until the server's response arrives.
        response = opener.open("https://example.com/", data_encoded)

opener = urllib2.build_opener()
opener.addheaders = [("Connection", "keep-alive"),
    ("Cache-Control", "max-age=0"),
    ("Accept", "text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8"),
    ("Accept-Language", "en-US,en;q=0.8,es;q=0.6")]

THREADS = 40

for i in range(THREADS):
    t = Thread(target=open_website, args=[opener])
    t.start()

How can I make each thread just send the request, forget about the response, and move straight on to the next request?

The faster the better.

Thank you.
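
One way to literally "send and forget" is to skip urllib2 entirely and write the raw HTTP request to a socket yourself, closing the connection without ever reading the reply. The following is a minimal sketch, not from the original post: the host, form fields, and `fire_and_forget` helper are illustrative placeholders taken from the question, and each new connection still pays the cost of a fresh TLS handshake.

    import socket
    import ssl
    import urllib
    from threading import Thread

    HOST = "example.com"  # placeholder target from the question
    FORM = urllib.urlencode({"udsan": "fdsf", "width": "1200"})

    # A raw HTTP/1.1 POST, built once and reused by every thread.
    REQUEST = ("POST / HTTP/1.1\r\n"
               "Host: %s\r\n"
               "Content-Type: application/x-www-form-urlencoded\r\n"
               "Content-Length: %d\r\n"
               "Connection: close\r\n"
               "\r\n"
               "%s") % (HOST, len(FORM), FORM)

    def fire_and_forget():
        while True:
            # Connect, perform the TLS handshake, write the request bytes,
            # then drop the connection without reading the server's reply.
            s = ssl.wrap_socket(socket.create_connection((HOST, 443)))
            s.sendall(REQUEST)
            s.close()

    for i in range(40):
        Thread(target=fire_and_forget).start()

Because the response is never read, the server's reply is simply discarded by the closed socket; whether the server fully processes a request whose connection closes immediately depends on the server.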

user1618465
  • how do you know that it waits for the response? Unrelated: each thread should use its own `opener`. Here's several code [examples](http://stackoverflow.com/a/4850200/4279) on [how](http://stackoverflow.com/a/20722204/4279) to [do](http://stackoverflow.com/a/4868866/4279) concurrent [requests](http://stackoverflow.com/a/31795242/4279) in [Python](https://gist.github.com/zed/0a8860f4f9a824561b51). Consider using an http client that allows to send several http request over the same tcp connection (aiohttp, requests). Though with some server you can make more requests with 1 req. per connection. – jfs Aug 06 '15 at 18:44
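
To illustrate the comment's two suggestions (a client per thread, and reusing one TCP connection for many requests), here is a sketch using the `requests` library with one `Session` per thread; the URL and form data are the placeholders from the question, and `worker` is a hypothetical helper name.

    import requests
    from threading import Thread

    FORMDATA = {"udsan": "fdsf", "width": "1200", "height": "1920",
                "param": "32", "rememberUn": "on"}

    def worker():
        # One Session (and therefore one connection pool) per thread,
        # so threads never share a client object.
        session = requests.Session()
        while True:
            # keep-alive: the same TCP connection is reused across requests.
            session.post("https://example.com/", data=FORMDATA)

    for i in range(40):
        Thread(target=worker).start()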

0 Answers