4

I am new to the world of microservices, and I have tried to learn about them and how they could be applied to my needs. I need to design a cloud platform that is easily maintainable and scalable, with the following parts (as far as I can see them):

  • Rails API + PostgreSQL (microservice 1)
  • Frontend framework (microservice 2)
  • Some Python script (microservice 3)
  • Some other Python script (microservice 4)

Inspired by this question & answer, each microservice is a separate Heroku app. What about security between them when they talk to each other, and what about response time?

Also, since the service is meant to grow, it will become expensive sooner or later. How can I optimize cost in this situation? I just discovered CaptainDuckDuck, but I'm wary of its relatively small user base, since it's quite new and not as popular as other PaaS options. Is the only solution to go with something like DigitalOcean or AWS EC2 and manage ourselves the work that Heroku does?

Besides, doing microservices like this is not really a microservice design, since the services are not all hosted on the same machine, am I right? Would a more microservice-friendly approach be to use Heroku Private Spaces (even if that doesn't solve the cost issue)?

For information, I already have this design up and running. So it's not a matter of "will it work?", but rather "is it the right way?".

Thanks for your feedback.

ZazOufUmI

1 Answer

2

In theory, you could indeed deploy each of your microservices as a completely separate Heroku app, as you suggested.

However, depending on your requirements, a MUCH simpler, possibly better and almost definitely cheaper approach may be to deploy them as separate microservices in one polyglot Heroku app, using Heroku dynos.

For example, you could run your Rails API as the web dyno of your single app. In that case you might want it to also serve your front-end framework.

You should consider using Heroku Postgres as a managed DBaaS. It will be a breeze to connect your Rails web dyno to your Heroku Postgres instance.
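
As a sketch of what that connection looks like, assuming the standard Heroku Postgres add-on: Heroku injects the connection string as the DATABASE_URL config var, which Rails picks up automatically; the database.yml below just makes that explicit (the pool setting is an arbitrary default, not something specific to your app).

    # config/database.yml (production section only) -- minimal sketch
    production:
      adapter: postgresql
      url: <%= ENV["DATABASE_URL"] %>
      pool: <%= ENV.fetch("RAILS_MAX_THREADS", 5) %>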

You might then want to define each of your Python scripts as a separate process type in your Procfile; you need to do so if they have to be "always on". Alternatively, depending on your requirements, you might consider running your Python scripts on one-off dynos. Either way, your Python scripts will run on separate dynos.
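
A minimal Procfile sketch for that layout might look like the following; the script paths and the Puma config are placeholders, so substitute whatever your Rails app and Python scripts actually use:

    web: bundle exec puma -C config/puma.rb
    worker1: python script_one.py
    worker2: python script_two.py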

Note that each of your process types can be separately scaled.
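
For example, with the process types sketched above, scaling is done per process type via the Heroku CLI (the dyno counts here are arbitrary):

    heroku ps:scale web=2 worker1=1 worker2=0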

One thing you need to consider is how your microservices interact (if you need them to do so). There are many ways to approach this, but note that only your web dyno instances can receive HTTP/S traffic. See here for some ideas on this.
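
One common pattern, which one of the commenters below also hints at, is to avoid direct HTTP calls between services and instead hand work to the worker dynos through a message broker. A minimal Python sketch, assuming a Redis add-on that exposes a REDIS_URL config var and the redis-py client; the queue name and payload shape are made up for illustration:

    # Worker dyno sketch: consume jobs that the web dyno pushes onto a Redis list.
    import json
    import os

    import redis

    r = redis.Redis.from_url(os.environ["REDIS_URL"])

    while True:
        # BLPOP blocks until a job appears on the "jobs" list (the name is arbitrary);
        # the web dyno would enqueue with RPUSH on the same key.
        _queue, raw = r.blpop("jobs")
        job = json.loads(raw)
        print(f"processing job {job.get('id')}")
        # ... do the actual work here ...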

Yoni Rabinovitch
  • Thanks for your detailed feedback. *I've edited my question because the design I described is already up and running.* So your advice would be to embed all my microservices (except for the frontend) inside only one dyno: the web dyno would be Rails, and two worker dynos would be my two Python scripts. I cut costs and avoid external communication between services then. But in that case, there is only one git repo that I have to push and rebuild every time?! – ZazOufUmI Aug 21 '18 at 07:06
  • My advice might be to embed all your microservices inside one Heroku app (definitely NOT inside one Heroku dyno). As for git repos: The simplest method is to indeed manage a single Heroku app as a single git repo. However, if that is problematic, you could consider managing each of your microservices in its own git subtree (see https://www.atlassian.com/blog/git/alternatives-to-git-submodule-git-subtree). In that case you could have a parent git repo that would simply import all your subtrees, and you would push the parent repo to your Heroku app. – Yoni Rabinovitch Aug 21 '18 at 07:15
  • Yes, sorry, I meant one Heroku app, not one dyno. Thanks, I will consider your approach and try it. – ZazOufUmI Aug 21 '18 at 07:42
  • Microservices shouldn't interact period. One API should not call into another API. If two APIs are performing similar functions, wrap that functionality as a shared package that either of the APIs can install and leverage to produce a consistent operation. Also, any out-of-bound processes should be scheduled via a messaging construct. – Jakub Keller Apr 12 '22 at 19:08
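
Following up on the git subtree approach mentioned in the comments above, a minimal sketch of how the parent repository could pull in each microservice (repository URLs, prefixes, and branch names are placeholders):

    # one-time import of each microservice into the parent repo
    git subtree add --prefix=api https://example.com/you/rails-api.git master --squash
    git subtree add --prefix=worker1 https://example.com/you/script-one.git master --squash

    # pull in upstream changes later
    git subtree pull --prefix=api https://example.com/you/rails-api.git master --squash

    # then deploy the parent repo as the single Heroku app
    git push heroku master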