29

I intend to use Netty in an upcoming project. This project will act as both client and server. In particular, it will establish and maintain many connections to various servers while at the same time serving its own clients.

Now, the documentation for NioServerSocketChannelFactory specifies the threading model for the server side of things fairly well - each bound listen port will require a dedicated boss thread for the lifetime of the process, while connected clients will be handled in a non-blocking fashion on worker threads. Specifically, one worker thread is able to handle multiple connected clients.
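For reference, the server-side setup I have in mind looks roughly like the sketch below (Netty 3 API; the pipeline contents and port numbers are just placeholders):

import java.net.InetSocketAddress;
import java.util.concurrent.Executors;

import org.jboss.netty.bootstrap.ServerBootstrap;
import org.jboss.netty.channel.ChannelPipeline;
import org.jboss.netty.channel.ChannelPipelineFactory;
import org.jboss.netty.channel.Channels;
import org.jboss.netty.channel.socket.nio.NioServerSocketChannelFactory;

public class ServerThreadingSketch
{
    public static void main(String[] args)
    {
        ServerBootstrap bootstrap = new ServerBootstrap(
                new NioServerSocketChannelFactory(
                        Executors.newCachedThreadPool(),    // boss threads: one per bound port
                        Executors.newCachedThreadPool()));  // worker threads: shared by all clients
        bootstrap.setPipelineFactory(new ChannelPipelineFactory()
        {
            public ChannelPipeline getPipeline()
            {
                return Channels.pipeline(/* handlers go here */);
            }
        });
        // Each bind() ties up one boss thread for as long as the port stays bound.
        bootstrap.bind(new InetSocketAddress(8080));
        bootstrap.bind(new InetSocketAddress(8081));
    }
}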

However, the documentation for NioClientSocketChannelFactory is less specific. It also seems to utilize both boss and worker threads, and the documentation states:

One NioClientSocketChannelFactory has one boss thread. It makes a connection attempt on request. Once a connection attempt succeeds, the boss thread passes the connected Channel to one of the worker threads that the NioClientSocketChannelFactory manages.

Worker threads seem to function in the same way as in the server case.

My question is, does this mean that there will be one dedicated boss thread for each connection from my program to an external server? How will this scale if I establish hundreds or thousands of such connections?

As a side note, are there any adverse side effects of re-using a single Executor (a cached thread pool) as both the bossExecutor and workerExecutor of a ChannelFactory? What about also re-using it between different client and/or server ChannelFactory instances? This is somewhat discussed here, but I do not find those answers specific enough. Could anyone elaborate on this?
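To make that concrete, the kind of sharing I am asking about is something like this (just a sketch to illustrate the question, not something I have verified to be safe):

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

import org.jboss.netty.channel.socket.nio.NioClientSocketChannelFactory;
import org.jboss.netty.channel.socket.nio.NioServerSocketChannelFactory;

public class SharedExecutorSketch
{
    public static void main(String[] args)
    {
        // A single cached thread pool used as both boss and worker executor,
        // and shared between a client and a server ChannelFactory.
        ExecutorService shared = Executors.newCachedThreadPool();

        NioServerSocketChannelFactory serverFactory =
                new NioServerSocketChannelFactory(shared, shared);
        NioClientSocketChannelFactory clientFactory =
                new NioClientSocketChannelFactory(shared, shared);
    }
}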

Jiddo
  • Since NioClientSocketChannelFactory and OioClientSocketChannelFactory are easily interchangeable, you may just pick either of them right now. Once you are ready to do some performance testing, you can switch to the other and see whether it gives better or worse performance. For a very simple case I have done this here: https://gist.github.com/1120694 Note: they are interchangeable, but behave a little differently if used improperly - I have a comment about it in the gist mentioned above. – Ivan Sopov Oct 26 '11 at 08:17
  • @IvanSopov I never really considered using the _Oio_ versions of the ChannelFactories, since I know that they use a dedicated thread per connection and I do not feel comfortable having the thread count be directly proportional to the connection count. My worry was that _NioClientSocketChannelFactory_ would also do this (for clients, not for servers), but this has now been disproven. – Jiddo Oct 26 '11 at 23:08

3 Answers

14

This is not a real answer to your question regarding how the Netty client thread model works. But you can use the same NioClientSocketChannelFactory to create a single ClientBootstrap with multiple ChannelPipelineFactorys, and in turn make a large number of connections. Take a look at the example below.

// Imports for the snippet; MyHandler, AnotherHandler and PipelineFactory are user-defined classes.
import java.net.InetSocketAddress;
import java.util.concurrent.Executors;

import org.jboss.netty.bootstrap.ClientBootstrap;
import org.jboss.netty.channel.ChannelFactory;
import org.jboss.netty.channel.ChannelFuture;
import org.jboss.netty.channel.ChannelFutureListener;
import org.jboss.netty.channel.socket.nio.NioClientSocketChannelFactory;

public static void main(String[] args)
{
    String host = "localhost";
    int port = 8090;
    // One factory, and therefore one shared set of boss/worker threads, for all connections.
    ChannelFactory factory = new NioClientSocketChannelFactory(
            Executors.newCachedThreadPool(), Executors.newCachedThreadPool());
    MyHandler handler1 = new MyHandler();
    PipelineFactory factory1 = new PipelineFactory(handler1);
    AnotherHandler handler2 = new AnotherHandler();
    PipelineFactory factory2 = new PipelineFactory(handler2);
    ClientBootstrap bootstrap = new ClientBootstrap(factory);
    // At the client side the option is "tcpNoDelay"; at the server it is "child.tcpNoDelay".
    bootstrap.setOption("tcpNoDelay", true);
    bootstrap.setOption("keepAlive", true);
    for (int i = 1; i <= 50; i++) {
        // Alternate between the two pipeline factories, just to show that a single
        // bootstrap can produce connections with different pipelines.
        if (i % 2 == 0) {
            bootstrap.setPipelineFactory(factory1);
        } else {
            bootstrap.setPipelineFactory(factory2);
        }

        ChannelFuture future = bootstrap.connect(new InetSocketAddress(host, port));

        future.addListener(new ChannelFutureListener()
        {
            @Override
            public void operationComplete(ChannelFuture future) throws Exception
            {
                if (future.isSuccess())
                {
                    // Requires a StringEncoder (or similar) in the pipeline.
                    future.getChannel().write("SUCCESS");
                }
            }
        });
    }
}

The example also shows how different pipeline factories can be set for different connections, so depending on the connection you make you can tweak the encoders/decoders in the channel pipeline.
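For completeness: MyHandler, AnotherHandler and PipelineFactory in the snippet above are your own classes. A minimal PipelineFactory might look like the sketch below (the StringEncoder is an assumption on my part, needed so that write("SUCCESS") can be encoded):

import org.jboss.netty.channel.ChannelHandler;
import org.jboss.netty.channel.ChannelPipeline;
import org.jboss.netty.channel.ChannelPipelineFactory;
import org.jboss.netty.channel.Channels;
import org.jboss.netty.handler.codec.string.StringEncoder;

public class PipelineFactory implements ChannelPipelineFactory
{
    private final ChannelHandler handler;

    public PipelineFactory(ChannelHandler handler)
    {
        this.handler = handler;
    }

    public ChannelPipeline getPipeline()
    {
        // A StringEncoder so that channel.write("SUCCESS") can be turned into bytes,
        // followed by the connection-specific business handler.
        return Channels.pipeline(new StringEncoder(), handler);
    }
}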

Abe
  • Yes, that is a good point. I knew that you could do this for the server version already, but I did not consider it for the client version. However, will it not still consume a dedicated thread from the _bossExecutor_ for each ClientBootstrap-based connection I create from it, just like it does with the ServerBootstrap instances I bind on the server side? Or did I misunderstand how the NioClientSocketChannelFactory works? – Jiddo Oct 26 '11 at 17:39
  • You can actually do this one instance of client bootstrap also. See the updated code. – Abe Oct 26 '11 at 18:00
  • Thank you for confirming that for me. I would still be very much interested in understanding a bit more about what the boss threads do in the client connection case, but at least now my fears about there being a one-to-one relationship have been discarded. – Jiddo Oct 26 '11 at 23:02
  • The Netty community Nabble forum is quite active; you could ask it there. Trustin normally answers these questions himself. http://www.jboss.org/netty/community – Abe Oct 27 '11 at 03:54
1

I am not sure your question has been answered. Here's my answer: there is a single boss thread that simultaneously manages all the pending CONNECTs in your app. It uses NIO to process all of the current connects in a single (boss) thread, and then hands each successfully connected channel off to one of the workers.
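So no matter how many outgoing connections you open, the boss thread count stays at one. If you also want to bound the number of worker threads, the constructor that takes a workerCount does that (a sketch; I am assuming the Netty 3 three-argument constructor here):

import java.util.concurrent.Executors;

import org.jboss.netty.channel.socket.nio.NioClientSocketChannelFactory;

public class ClientThreadingSketch
{
    public static void main(String[] args)
    {
        // One boss thread handles every pending connect attempt; at most three
        // worker threads then service all of the connected channels.
        NioClientSocketChannelFactory factory = new NioClientSocketChannelFactory(
                Executors.newCachedThreadPool(),   // boss executor
                Executors.newCachedThreadPool(),   // worker executor
                3);                                // workerCount
    }
}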

jpayne
0

Your question mainly concerns performance. A single boss thread scales very well on the client side.

Oh, and Nabble has been closed. You can still browse the archive there.

Dominic Cerisano