I am writing a Python TCP/IP server that accepts messages from one client. Once I receive a message, I parse it, generate a response based on its message type, and send that response back to the client.
My application works well until multiple messages come in very quickly. The server seems to block on I/O while it is in the middle of processing a message. For example, while I am processing the 3rd message, the 4th message is never accepted, since I am still busy with the 3rd one, so the 4th message gets lost.
I was thinking about handling this issue with threading. Thread1 can sit there and only accept the incoming messages from the client and put them onto a queue, while Thread2 reads each request message from that queue and generates a response to send to the client. After it sends the response, Thread2 pops the message off Thread1's queue.
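Roughly, this is what I'm picturing (just a sketch: handle_request is a stand-in for my real parsing/response code, and the port is made up):

```python
import queue
import socket
import threading

request_queue = queue.Queue()

def handle_request(data):
    # Stand-in for my real code, which parses the message type and builds the reply.
    return data

def receiver(conn):
    # Thread 1: does nothing but read from the socket and queue the raw messages.
    while True:
        data = conn.recv(4096)
        if not data:
            break
        request_queue.put(data)

def responder(conn):
    # Thread 2: takes a message off the queue, builds the response, sends it back.
    while True:
        data = request_queue.get()
        conn.sendall(handle_request(data))
        request_queue.task_done()

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("0.0.0.0", 5000))  # made-up address/port
server.listen(1)
conn, addr = server.accept()

threading.Thread(target=receiver, args=(conn,)).start()
threading.Thread(target=responder, args=(conn,), daemon=True).start()
```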
Is this the right way to go about this issue?
EDIT: After some more testing, I realized that I am always receiving the client's messages; however, the client times out after 5-10 seconds if it does not receive a response. So I need to speed up the message processing: the server actually receives every message and does not lose any, as I thought before. The problem is that the client is impatient. Any tips on how to speed up performance? A process per request that runs in parallel so the work is divided equally (rough sketch below)? The only issue I see with that is that it is rare for a message to come in while I'm still processing another one.
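To make the "process per request" idea concrete, this is roughly what I'm considering (again just a sketch, not my actual code: process_message stands in for my parsing/response logic, and the pool size and port are arbitrary):

```python
import socket
from concurrent.futures import ProcessPoolExecutor

def process_message(data):
    # Stand-in for my real parsing/response logic.
    return data

def serve(conn):
    # Hand each incoming message to a worker process so slow processing
    # never blocks the recv() loop; reply as soon as each worker finishes.
    with ProcessPoolExecutor(max_workers=4) as pool:
        while True:
            data = conn.recv(4096)
            if not data:
                break
            future = pool.submit(process_message, data)
            future.add_done_callback(lambda f, c=conn: c.sendall(f.result()))

if __name__ == "__main__":
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.bind(("0.0.0.0", 5000))  # made-up address/port
    server.listen(1)
    conn, addr = server.accept()
    serve(conn)
```

One thing I'm unsure about with this: if two messages finish at almost the same time, the done-callbacks could write to the socket concurrently, so the sends might need to be serialized.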