
I have an Erlang application that spawns several (possibly thousands of) processes. Each process makes an HTTP request to a remote service and receives a JSON response.

Once a process receives the JSON, it will either store it in Redis or send it over a RabbitMQ queue where some consumer will deal with it (I am still not sure which).

Since both Redis and RabbitMQ require opening a connection, I was wondering whether it is better to open and close a connection in each process, or to have some kind of server keep the connection open and have each process call it.
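To make the setup concrete, here is a minimal sketch of the spawning side, assuming OTP's built-in `httpc` client (`inets` must be started) and a hypothetical `storage:store/1` function on a long-lived server; the module and function names are illustrative, not part of any real codebase:

```erlang
-module(fetcher).
-export([spawn_fetchers/1, fetch/1]).

%% Spawn one process per URL, as described in the question.
spawn_fetchers(Urls) ->
    [spawn(?MODULE, fetch, [Url]) || Url <- Urls].

fetch(Url) ->
    case httpc:request(get, {Url, []}, [], []) of
        {ok, {{_, 200, _}, _Headers, Body}} ->
            %% Hand the JSON body off to a shared storage server
            %% instead of opening a Redis/RabbitMQ connection here.
            storage:store(Body);
        {error, Reason} ->
            error_logger:error_msg("request failed: ~p~n", [Reason])
    end.
```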

user601836
  • Does this answer your question? http://stackoverflow.com/questions/10407760/is-there-a-performance-difference-between-pooling-connections-or-channels-in-rab – Bengt Feb 10 '14 at 12:22
  • So I should have only one process keeping the connection, and all the other processes should send through it? – user601836 Feb 10 '14 at 12:58
  • I used one connection, with each process opening its own channel within that connection. However, I had fewer than 100 processes running simultaneously. – Bengt Feb 10 '14 at 14:05

2 Answers


I would go with separate servers, e.g. redis_storage and rabbitmq_storage. Both would be simple servers that manage their own connections and expose functions like store/1. Following the single-responsibility principle, each server has one duty. You can implement the Redis storage first and test everything out; if you later change your mind, implement the RabbitMQ storage and the only modification needed is the module name. Or you can use them both.
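The redis_storage server described above could be sketched as a gen_server that opens one connection at startup and serializes writes through it. This is an illustrative sketch assuming the eredis client (https://github.com/wooga/eredis); the key name "results" and everything beyond store/1 are my own choices:

```erlang
-module(redis_storage).
-behaviour(gen_server).
-export([start_link/0, store/1]).
-export([init/1, handle_call/3, handle_cast/2]).

start_link() ->
    gen_server:start_link({local, ?MODULE}, ?MODULE, [], []).

%% Worker processes call this; the single Redis connection
%% lives inside this server process.
store(Json) ->
    gen_server:cast(?MODULE, {store, Json}).

init([]) ->
    %% One connection, opened once for the server's lifetime.
    {ok, Conn} = eredis:start_link(),
    {ok, Conn}.

handle_cast({store, Json}, Conn) ->
    {ok, _} = eredis:q(Conn, ["LPUSH", "results", Json]),
    {noreply, Conn}.

handle_call(_Req, _From, Conn) ->
    {reply, ok, Conn}.
```

A rabbitmq_storage server would look the same from the callers' point of view: only the module name in the store/1 call site changes, which is the point of the answer.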

ten0s

You should definitely use a separate connection pool for Redis/RabbitMQ.

Have you heard of poolboy? https://github.com/devinus/poolboy

That's the tool for the job.
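With poolboy, each caller checks a worker out of the pool, uses its connection, and returns it. A minimal sketch, assuming a pool registered as redis_pool whose workers wrap a Redis connection and export a store/2 function (the pool and worker names here are illustrative):

```erlang
%% Check a worker out of the pool, run the function, and
%% return the worker to the pool automatically.
store(Json) ->
    poolboy:transaction(redis_pool, fun(Worker) ->
        redis_worker:store(Worker, Json)
    end).
```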

loucash