I want to do some performance testing on one of our web servers, to see how the server handles a lot of persistent connections. Unfortunately, I'm not terribly familiar with HTTP and web testing. Here's the Python code I've got for this so far:
import http.client
import argparse
import threading


def make_http_connection():
    # 'options' is the parsed-args namespace defined at module level below
    conn = http.client.HTTPConnection(options.server, timeout=30)
    conn.connect()


if __name__ == '__main__':
    parser = argparse.ArgumentParser()
    parser.add_argument("num", type=int,
                        help="Number of connections to make (integer)")
    parser.add_argument("server", type=str,
                        help="Server and port to connect to. Do not prepend 'http://' for this")
    options = parser.parse_args()

    for n in range(options.num):
        connThread = threading.Thread(target=make_http_connection, args=())
        connThread.daemon = True
        connThread.start()

    while True:
        try:
            pass
        except KeyboardInterrupt:
            break
My main question is this: How do I keep these connections alive? I've set a long timeout, but that's a very crude method and I'm not even sure it affects the connection. Would simply requesting a byte or two every once in a while do it?
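To make the "request a byte or two every once in a while" idea concrete, here's a sketch of what I have in mind. The HEAD request, the interval, and the fixed round count are all my own guesses, not anything I know the server needs; a real test would loop until told to stop:

```python
import http.client
import time


def keep_alive(conn, interval=10.0, rounds=3):
    """Periodically send a minimal HEAD request so the server sees traffic
    on the connection. interval and rounds are arbitrary placeholder values
    for this sketch."""
    statuses = []
    for _ in range(rounds):
        conn.request("HEAD", "/")
        resp = conn.getresponse()
        resp.read()  # drain the response so the connection can be reused
        statuses.append(resp.status)
        time.sleep(interval)
    return statuses
```

I picked HEAD because it should keep the response as small as possible, but I don't know whether this is actually the right way to hold a connection open.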
(Also, on an unrelated note: is there a better way to wait for a keyboard interrupt than the ugly while True: block at the end of my code?)
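For reference, the least-bad alternative I've come up with so far is sleeping in a loop instead of busy-spinning (the function name and poll interval here are my own invention):

```python
import time


def wait_for_interrupt(poll=1.0):
    """Block until Ctrl-C. time.sleep() yields the CPU while waiting,
    unlike a bare `while True: pass`, which spins a core at 100%."""
    try:
        while True:
            time.sleep(poll)
    except KeyboardInterrupt:
        return True
```

Putting the whole loop inside one try also seems to avoid the case in my current version where the interrupt could land between iterations, outside the try block, but I'm not sure this is the idiomatic approach.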