I have built a chat application that, for now, stores the complete chat history of all users.
I am using Django as the backend and Postgres as the database. I am nearing 100k daily active users, which works out to around 1 million messages per day.
So I am wondering how to scale the Postgres data horizontally. I have heard that sharding is not simple in SQL databases and that they have limits to scaling. For example, I have heard that Google's Bigtable can scale to hundreds of petabytes, while Postgres is hard to scale to that level. Is that true? If not, how should I go about scaling at the moment? Also, how do I deal with message history, which will eventually get too big to handle?
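To make the sharding part of my question concrete, here is the kind of application-level routing I have in mind: a minimal sketch, assuming messages are grouped by a conversation id. All of the names here (`NUM_SHARDS`, `shard_for_conversation`, the DSN list) are my own illustrative inventions, not anything Django or Postgres provides out of the box.

```python
# Sketch of application-level shard routing, under the assumption that
# every message belongs to a conversation and all messages of one
# conversation should live on the same shard.
import hashlib

NUM_SHARDS = 4  # logical shards; could be remapped onto more servers later

# Hypothetical connection strings, one per shard.
SHARD_DSNS = [f"postgres://db{i}.internal/chat" for i in range(NUM_SHARDS)]

def shard_for_conversation(conversation_id: str) -> int:
    """Stable hash so every message in a conversation lands on the same shard."""
    digest = hashlib.sha256(conversation_id.encode("utf-8")).digest()
    return int.from_bytes(digest[:8], "big") % NUM_SHARDS

# Because the mapping is deterministic, reading one chat's history never
# has to fan out across shards.
assert shard_for_conversation("room-42") == shard_for_conversation("room-42")
```

Is this roughly the approach people take with Postgres, or is it more common to lean on something like declarative partitioning or an extension rather than hand-rolled routing?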
Another question: should I shift to another database to handle scaling, like MongoDB or Cassandra or something else? I fear that I will eventually have to scale to billions of messages per month, and if I have to migrate, better now than later. I don't want to overthink or over-analyze it; I just want some perspective on how to go about this.
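For scale, here is my rough back-of-envelope estimate of the data volume involved (the 500-byte average row size is my own assumption covering text, ids, timestamps, and index overhead, not a measured number):

```python
# Back-of-envelope: how much data does 1M messages/day actually produce?
MSGS_PER_DAY = 1_000_000
AVG_ROW_BYTES = 500  # assumed average row size, including index overhead

bytes_per_year = MSGS_PER_DAY * AVG_ROW_BYTES * 365
print(f"{bytes_per_year / 1e9:.0f} GB/year")  # roughly 180 GB/year
```

So at the current rate I'd be adding on the order of 180 GB a year, which is what makes me unsure whether this is already "scale Postgres horizontally" territory or still comfortably single-node territory.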