
I'd like to implement a fast, smooth search. There aren't many searchable items: ~100 max. Each item holds about as much data as a Facebook event would. They will all show up on initial load (maybe with infinite scroll). The data won't change frequently. There will be no more than 100 concurrent users.

What's the best caching strategy for search results, given above conditions?

What's the most scalable strategy?

Stack

  • Frontend: Nuxt (VueJS) + InstantSearch (no Algolia!)
  • Backend: Spring Boot
  • Dockerized

Possible solutions

  1. Extra caching service on the backend (e.g. Redis, Memcached) and make the UI hit the server on each search operation. This would basically mean spamming the backend on every keystroke.
  2. Load all items into client-side state (e.g. a Vuex store) and search there directly. This will increase the app's memory footprint and may turn messy over time. (A sketch of this option follows the list.)
  3. A combination of the two?
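
For reference, a minimal sketch of option 2 in TypeScript. The GET /api/events endpoint and the EventItem shape are assumptions, and in the real app the array would live in a Vuex store rather than a module variable:

```typescript
// Option 2 sketch: fetch all ~100 items once, then search entirely on the client.
// The /api/events endpoint and EventItem fields are hypothetical placeholders.
interface EventItem {
  id: number;
  title: string;
  description: string;
  location: string;
}

let allEvents: EventItem[] = [];

// Called once on initial load; in the real app this would populate a Vuex store.
export async function loadEvents(): Promise<void> {
  const res = await fetch('/api/events');
  allEvents = await res.json();
}

// Pure client-side search: no request is made per keystroke.
export function searchEvents(query: string): EventItem[] {
  const q = query.trim().toLowerCase();
  if (!q) return allEvents;
  return allEvents.filter(e =>
    [e.title, e.description, e.location].some(f => f.toLowerCase().includes(q))
  );
}
```
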
Georgian

1 Answer


A cache layer definitely doesn't hurt. The number of users shouldn't be an issue; even the smallest EC2 instance on AWS could handle that easily.

You could try adding a tiny bit of delay (a debounce) to the text box so that not every keystroke fires a search, maybe with a leeway of ~50 ms. You'll have to try it and see how it feels when typing in the search bar.
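
A minimal debounce sketch in TypeScript (framework-agnostic; the 50 ms default is the leeway mentioned above and should be tuned by feel):

```typescript
// Delays calling `fn` until `delayMs` have passed without a new call,
// so only the last keystroke in a burst actually triggers a search.
function debounce<T extends (...args: any[]) => void>(fn: T, delayMs = 50) {
  let timer: ReturnType<typeof setTimeout> | undefined;
  return (...args: Parameters<T>) => {
    if (timer !== undefined) clearTimeout(timer);
    timer = setTimeout(() => fn(...args), delayMs);
  };
}

// Usage: wire the debounced function to the search input's input event,
// e.g. in a Vue template: @input="debouncedSearch($event.target.value)".
const runSearch = (query: string) => {
  console.log('searching for', query); // call the backend or local filter here
};
const debouncedSearch = debounce(runSearch, 50);
```
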

For 100 items Vuex can be quite fast too, as long as you don't load static assets like images directly into Vuex. ~100 items of JSON data isn't a lot, but it won't scale as well if your app suddenly has 10,000 items.

Best scenario in my opinion:

  • Redis cache, because a lot of the requests will be very similar or even identical. You'd just need to find a sweet spot for how long the cache stays valid (see the TTL sketch after this list).
  • Load balance your backend and frontend, i.e. create more instances of your Docker image on demand to handle potential spikes in requests when CPU usage goes above a certain threshold.
  • If your backend does more than just search, outsource that search to a dedicated instance so it doesn't interfere with the "regular requests"
  • Very important: create indices in your database for faster search results; avoid full scans wherever you can!
  • Maybe think about going serverless if your app doesn't have traffic 24/7
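
To make the "sweet spot" concrete, here is a stack-agnostic sketch of a TTL'd query cache, written in TypeScript for brevity; in this stack the equivalent would sit in the Spring Boot service in front of Redis, and the 60-second TTL is just a starting value to tune:

```typescript
// Wraps any async search function so that identical queries within `ttlMs`
// are served from memory instead of hitting the database again.
type Fetcher<T> = (query: string) => Promise<T>;

function withTtlCache<T>(fetcher: Fetcher<T>, ttlMs: number): Fetcher<T> {
  const cache = new Map<string, { value: T; expiresAt: number }>();
  return async (query: string) => {
    const hit = cache.get(query);
    if (hit && hit.expiresAt > Date.now()) return hit.value; // cache hit
    const value = await fetcher(query);                      // cache miss
    cache.set(query, { value, expiresAt: Date.now() + ttlMs });
    return value;
  };
}

// Usage: wrap the real search with a 60 s TTL and adjust from there.
// const cachedSearch = withTtlCache(searchDatabase, 60_000);
```
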

Edit: have the API, cache and database close to each other so communication between the instances doesn't have to travel far.

discolor
  •     Brilliant! Any thoughts on another layer of cache in UI local storage? Or is it over-engineering, as someone else put it? – Georgian Apr 21 '20 at 11:16
  • I feel like managing multiple layers of caches can be quite tough? Never done it, to be honest. Maybe I just personally dislike client caches and go more with the mentality of "never trust the client, only your server", but oh well. Maybe someone else has some interesting points on client caching. Way out of my comfort zone to have an opinion :) – discolor Apr 21 '20 at 13:02