
I have to design and implement a service delivery platform. My current design includes various services built on different technologies: some are Erlang-based concurrent map-reduce functions, and some are simple bash scripts that aggregate text files.

I have heard about XML-RPC, Protocol Buffers, MessagePack, SOAP and AMQP. Currently I use JSON, but loading and dumping large JSON files is somewhat time- and memory-consuming. Is there a newer or more robust way to bridge these technologies over HTTP, with wide programming-language support and good documentation?

I should also mention that I believe complexity is far more corrosive than latency or other connection-related issues, so any JSON replacement must not add complexity to the design.
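If the bottleneck is serializing large JSON payloads, a binary format such as MessagePack can be a near drop-in replacement, with client libraries in most languages (including Erlang). A minimal sketch using the third-party `msgpack` Python package; the payload here is made up for illustration:

```python
import json
import msgpack  # pip install msgpack

# Hypothetical service payload, purely illustrative.
payload = {"user_id": 42, "scopes": ["read", "write"], "active": True}

# packb/unpackb mirror json.dumps/json.loads, but emit compact binary.
packed = msgpack.packb(payload)
restored = msgpack.unpackb(packed)

assert restored == payload
# The binary form is typically smaller than the equivalent JSON text.
print(len(packed), len(json.dumps(payload).encode("utf-8")))
```

The API shape stays close to the `json` module, so switching formats need not add design complexity.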

Paul Roub
Farshid Ashouri
  • if your JSON file will be large, you are guaranteed to suffer the penalty from HTTP latency anyway. Are the service calls going to change dynamically and very frequently? – Anzel Oct 25 '15 at 23:12
  • Yes. For example, the Authorization service will be called 2 million times a day. Not all of the JSON files are large, but some of them are. – Farshid Ashouri Oct 25 '15 at 23:16
  • if your stack is bound to HTTP infrastructure, my gut feeling is that you'd better go with a database (NoSQL) as a communication platform, with eventual consistency, caching and async service calls to minimize the bottleneck. Of course I'm not familiar with what exactly your micro-services do, so this is just an option for you to consider. – Anzel Oct 25 '15 at 23:22
  • Thanks @Anzel. The main problem is the communication protocol. I am happy with JSON, but out of curiosity I am looking for new techniques. – Farshid Ashouri Oct 25 '15 at 23:24
  • 1
    Not a problem, FYI, "one" communication protocol won't be enough if your micro-services will each rely on not only 1 another service (or partially), ie. you will have to deal with queuing, race condition and I/O etc... which IMHO it's more about a communication strategy than protocol. Anyhow, good luck and all the best ;) – Anzel Oct 25 '15 at 23:28
  • I am solving the race condition problem with a workflow engine (again as a service). Every process that takes more than 500 ms will be sent to the workflow engine, its state will be saved, and the original service will be notified via callbacks. But you are right: my question is about communication strategy rather than protocol. – Farshid Ashouri Oct 26 '15 at 00:05

1 Answer


If you don't need to persist your data, you could also take a look at Redis and its pub/sub features. It is mature and really simple to configure and use, with great documentation and a big community.

Here's a list of available client libraries (5 Erlang libraries, for example): http://redis.io/clients

hampusohlsson