I have a collection of 10 messages sorted by the number of likes each message has. Periodically I update that collection, replacing some of the old messages with new ones that got more likes in the meantime, so that the collection again contains 10 messages and is sorted by number of likes.
I have an API to insert or remove a message from the collection relative to an existing member message: insert(message_id, relative_to, above_or_below) and remove(message_id).
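In rough Python signatures, just so the sketch further down has something concrete to refer to (the real calls are HTTP; the "above"/"below" values are placeholders):

```python
def insert(message_id, relative_to, above_or_below):
    """Insert message_id into the published collection, directly above or
    below the existing member message relative_to ("above" or "below")."""
    raise NotImplementedError  # placeholder for the real HTTP call

def remove(message_id):
    """Remove message_id from the published collection."""
    raise NotImplementedError  # placeholder for the real HTTP call
```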
I want to minimize the number of API calls by optimizing where I insert new messages, so that the collection is sorted and 10 items long at the end (during the process, length and order are irrelevant; only the final state matters).
I know I can calculate the new collection and then replace only the messages that don't match their new position, but I believe this can be optimized further and that algorithms for it already exist.
Edit:
Note the word "periodically": messages do not come one by one. Over a time interval I collect new messages, sort them, and build a new collection, which I then publish on the site via the API. So I do have two collections: one is a simple array in memory, the other is on the site.
The idea is to reuse the already-inserted messages that should be kept, and their order, in the updated collection to save HTTP API calls. I believe there are existing algorithms I could reuse to transform the existing collection into the already-known resulting collection with a minimal number of insert and remove operations.
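To make it concrete, this is roughly what I have in mind (an untested Python sketch based on longest-common-subsequence diffing, which I suspect is the relevant idea; names and the "above"/"below" strings are placeholders): messages on the common subsequence of the current and target lists are already in the right relative order and cost zero API calls, everything else is removed, and each missing message is inserted next to a neighbour that is already in place.

```python
def minimal_update(current, target):
    """Plan remove/insert operations that turn the published list `current`
    into `target` (both lists of unique message ids). Returns (removes,
    inserts); run all removes first, then the inserts in order. Each insert
    is (message_id, relative_to, "above"/"below")."""
    n, m = len(current), len(target)

    # lcs[i][j] = length of the longest common subsequence of current[i:] and target[j:]
    lcs = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(n - 1, -1, -1):
        for j in range(m - 1, -1, -1):
            if current[i] == target[j]:
                lcs[i][j] = 1 + lcs[i + 1][j + 1]
            else:
                lcs[i][j] = max(lcs[i + 1][j], lcs[i][j + 1])

    # Messages on the LCS keep their relative order in both lists: keep them.
    keep = set()
    i = j = 0
    while i < n and j < m:
        if current[i] == target[j]:
            keep.add(current[i])
            i += 1
            j += 1
        elif lcs[i + 1][j] >= lcs[i][j + 1]:
            i += 1
        else:
            j += 1

    if not keep:
        # Degenerate case: nothing survives, so there is no existing message
        # to anchor the first insert to; would need special handling.
        raise ValueError("current and target share no messages")

    removes = [mid for mid in current if mid not in keep]

    # After the removes, the kept messages sit in their final relative order.
    # Walk the target top to bottom; every missing message goes directly below
    # its predecessor, which is guaranteed to be in place already. The topmost
    # message, if missing, goes above the first kept message (the current top).
    first_kept = next(mid for mid in target if mid in keep)
    inserts = []
    for j, mid in enumerate(target):
        if mid in keep:
            continue
        if j == 0:
            inserts.append((mid, first_kept, "above"))
        else:
            inserts.append((mid, target[j - 1], "below"))
    return removes, inserts
```

The removes would run first so that only correctly ordered messages remain, then the inserts from top to bottom so every relative_to anchor already exists; that way the call count is (messages dropped) + (messages added), with the kept messages never touched.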