
I am using the TcpListener Class example https://msdn.microsoft.com/es-es/library/system.net.sockets.tcplistener(v=vs.110).aspx in order to process TCP requests.

But it seems that this TCP listener will accept multiple requests at the same time; each request must then be processed by a couple of web services, and the result must be returned to the TCP client.

I am thinking of doing the following:

  1. Get a stream object for reading and writing, `NetworkStream stream = client.GetStream();`, and save it in a special container class.

  2. Put this class into a special Queue helper class like the one in "C#: Triggering an Event when an object is added to a Queue".

  3. When the queue changes, fire the implemented event to process the next queue item asynchronously using a Task.

  4. Within the Task, communicate with the web services and send the response to the TCP client.

Please let me know whether this architecture is viable and able to handle multiple concurrent requests to the TCP listener.
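The four steps above could be sketched roughly as follows. This is only an illustration of the idea, not the helper class from the linked question: it uses a `BlockingCollection<TcpClient>` in place of a hand-rolled event-firing queue, and `CallWebServicesAsync` is a hypothetical placeholder for the real web-service calls (requires C# 7.1+ for `async Main`):

```csharp
using System;
using System.Collections.Concurrent;
using System.Net;
using System.Net.Sockets;
using System.Threading.Tasks;

class TcpQueueServer
{
    // Step 2: the "special container" — here a thread-safe queue of clients.
    static readonly BlockingCollection<TcpClient> queue = new BlockingCollection<TcpClient>();

    static async Task Main()
    {
        // Step 3: a consumer that reacts whenever something is enqueued.
        Task consumer = Task.Run(ProcessQueueAsync);

        var listener = new TcpListener(IPAddress.Any, 13000); // arbitrary port
        listener.Start();
        while (true)
        {
            TcpClient client = await listener.AcceptTcpClientAsync();
            queue.Add(client); // enqueue for later processing
        }
    }

    static async Task ProcessQueueAsync()
    {
        // Steps 1, 3, 4: get the stream, call the web services, reply.
        foreach (TcpClient client in queue.GetConsumingEnumerable())
        {
            using (client)
            using (NetworkStream stream = client.GetStream())
            {
                byte[] response = await CallWebServicesAsync(stream); // hypothetical helper
                await stream.WriteAsync(response, 0, response.Length);
            }
        }
    }

    static Task<byte[]> CallWebServicesAsync(NetworkStream request)
    {
        // Placeholder: the real implementation would call the web services here.
        return Task.FromResult(Array.Empty<byte>());
    }
}
```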

NoWar

  • And why queue them? Then all subsequent requests will wait until all previous ones are completed. – Evk Oct 05 '16 at 13:41
  • @Evk Well... I am not sure about the queue, and I am thinking that I need some kind of BUFFER to hold `NetworkStream stream = client.GetStream();` quickly... Do you think it is enough to create a `Task` and keep all the logic there? – NoWar Oct 05 '16 at 13:46
  • 1
    Well it depends on the load I think. At some cases I think queue is good solution, especially in cases when what you do with each item is CPU bound (some heavy computations). But here things are IO bound (you make web requests), so you should be able to handle a lot of them without queue. So if load on this service will not be very high - I think yes, processing stuff in parallel (with regular Tasks) should be fine. – Evk Oct 05 '16 at 13:48
  • 1
    you could use a raw socked on that port. But i wouldnt recommend that – Sebastian L Oct 05 '16 at 13:48
  • 1
    And if you go queue route still - at least make many threads (say 16) consume from it and process requests, not a single thread. – Evk Oct 05 '16 at 13:49

2 Answers


I'd recommend NetMQ. Have a look at https://github.com/zeromq/netmq
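For a sense of what that looks like, here is a minimal request/reply sketch. Socket types and method names are from my understanding of the NetMQ 4.x API and should be checked against the project's documentation; the port is arbitrary:

```csharp
using NetMQ;
using NetMQ.Sockets;

class NetMqEcho
{
    static void Main()
    {
        // "@" prefix means bind; clients connect with ">tcp://host:5555".
        using (var server = new ResponseSocket("@tcp://*:5555"))
        {
            while (true)
            {
                string request = server.ReceiveFrameString();
                // ... call the web services here, then reply to the client ...
                server.SendFrame("processed: " + request);
            }
        }
    }
}
```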

David Soler


Using a queue is definitely a viable idea, but consider what purpose it serves. It limits how many requests you can process in parallel. You may need such a limit in several cases, most commonly when processing each request performs CPU-bound work (heavy computations). In that case your ability to process many requests in parallel is limited, and the queue approach makes sense.

In your case, request processing performs IO-bound work (waiting for web requests to complete). This does not consume many server resources, and you can process a lot of such requests in parallel, so most likely no queue is needed in your case.
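The queue-free, IO-bound version might look like the sketch below: each accepted client gets its own Task, and awaiting the web request releases the thread instead of blocking it. The URL and port are placeholders (requires C# 7.1+ for `async Main`):

```csharp
using System.Net;
using System.Net.Http;
using System.Net.Sockets;
using System.Text;
using System.Threading.Tasks;

class ParallelTcpServer
{
    static readonly HttpClient http = new HttpClient();

    static async Task Main()
    {
        var listener = new TcpListener(IPAddress.Any, 13000); // arbitrary port
        listener.Start();
        while (true)
        {
            TcpClient client = await listener.AcceptTcpClientAsync();
            // Each client is served concurrently; no queue, no per-client thread.
            _ = Task.Run(() => HandleClientAsync(client));
        }
    }

    static async Task HandleClientAsync(TcpClient client)
    {
        using (client)
        using (NetworkStream stream = client.GetStream())
        {
            // IO-bound: the thread is free while the web request is in flight.
            string result = await http.GetStringAsync("http://example.com/"); // placeholder URL
            byte[] payload = Encoding.UTF8.GetBytes(result);
            await stream.WriteAsync(payload, 0, payload.Length);
        }
    }
}
```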

Even if you do use a queue, it is very rarely useful to process just one item at a time. Instead, process the queue with X threads (where X again depends on whether the work is CPU- or IO-bound; for CPU-bound work X = number of cores might be fine, for IO-bound work you need more). If you use too few threads to process your queue, your clients will wait longer for basically nothing, and may even fail with a timeout.
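As a sketch of that multi-consumer variant, X workers can drain one shared `BlockingCollection`; `HandleAsync` is a hypothetical stand-in for the real per-request processing:

```csharp
using System.Collections.Concurrent;
using System.Linq;
using System.Net.Sockets;
using System.Threading.Tasks;

class QueueWorkers
{
    static readonly BlockingCollection<TcpClient> queue = new BlockingCollection<TcpClient>();

    static Task[] StartWorkers(int x)
    {
        // X consumers pull from the same queue; pick X higher for IO-bound work.
        return Enumerable.Range(0, x)
            .Select(_ => Task.Run(async () =>
            {
                foreach (TcpClient client in queue.GetConsumingEnumerable())
                {
                    using (client)
                    {
                        await HandleAsync(client); // hypothetical per-request handler
                    }
                }
            }))
            .ToArray();
    }

    static Task HandleAsync(TcpClient client) => Task.CompletedTask;
}
```

With, say, `StartWorkers(16)` as suggested in the comments, the queue still bounds concurrency but no longer serializes every request behind a single consumer.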

Evk