
I have a website that uses a pretty slow external API (~0.9 seconds per request). The results of this API request are rendered to the page.

I already do a kind of caching of my own: I store the results in a DB, and subsequent requests for the same resource are served from the DB rather than hitting the API again. If the data in the DB is too old (>10 minutes), I refresh it with a new API request.
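For reference, my current approach looks roughly like this (a simplified sketch; the model and field names are made up):

```python
# Sketch of the existing DB-backed caching (hypothetical model/fields).
from datetime import timedelta

import requests
from django.utils import timezone

from .models import ApiResult  # hypothetical model holding cached API results

MAX_AGE = timedelta(minutes=10)

def get_result(resource_id):
    cached = ApiResult.objects.filter(resource_id=resource_id).first()
    if cached and timezone.now() - cached.fetched_at < MAX_AGE:
        return cached.data  # fresh enough, serve straight from the DB

    # Stale or missing: block on the slow external API (~0.9 s).
    data = requests.get(f"https://api.example.com/resource/{resource_id}").json()
    ApiResult.objects.update_or_create(
        resource_id=resource_id,
        defaults={"data": data, "fetched_at": timezone.now()},
    )
    return data
```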

Users will typically check the website only occasionally during the day, so they will almost always hit the 10-minute limit and get a loading time of >1 s. This feels very unresponsive.

I then searched for ways to get around the loading time and found this. I think this could be the right direction, but I am still not confident about how to tackle the task. Can anybody point me in the right direction as to how to implement this?

Should I use the low-level cache API?

Could I use the default cache? Or should I implement my own version?
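From the documentation, the low-level cache API would let me do something like this (a sketch assuming the default backend configured in `CACHES`; the key scheme and the fetch helper are made up):

```python
# Sketch using Django's low-level cache API instead of my own DB table.
from django.core.cache import cache  # the default backend from CACHES

def get_result_cached(resource_id):
    key = f"api-result:{resource_id}"  # made-up key scheme
    data = cache.get(key)
    if data is None:
        # Cache miss: make the slow ~0.9 s call (hypothetical helper).
        data = fetch_from_external_api(resource_id)
        cache.set(key, data, timeout=600)  # keep for 10 minutes
    return data
```

That would replace my manual age check with the backend's own expiry, but as far as I can tell it still leaves the first visitor after expiry waiting on the API.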

Do you think the solution provided in the first link is a good idea at all?

stulleman
  • Are you sure this is not the *N+1* problem: if you render *related* objects, this results in one query per related object if you do not use `.select_related` or `.prefetch_related`. – Willem Van Onsem Oct 09 '18 at 13:31
  • Yes I am sure, I optimized my queries with the Django Debug Toolbar. My bottleneck is the external API. – stulleman Oct 09 '18 at 13:36
  • If the data loads within 3-5s, I wouldn't bother going through the ajax progress bar you mention (also because you can probably not estimate how much progress there is if you're just waiting on a 3rd party response). What you should do is ensure the web page itself loads without the data from the external API. You fetch that data with a separate AJAX request, showing a loading indicator (spinner) until that request returns (a sketch of this follows these comments). That way the user can still browse the page and click to do something else even if the results aren't there. If it takes 1-2s this should be perfectly satisfying. – dirkgroten Oct 09 '18 at 14:31
  • Or, if the external data isn't very large to fetch, why not run a separate task that refreshes it every 10 min? (assuming that's not against the policy of the 3rd party) – dirkgroten Oct 09 '18 at 14:33
  • Unfortunately the 3rd party API has a rate limit and is IP-bound, so only my webserver can make those requests. I thought of exposing my own API and then using AJAX to load the data on the fly. Refreshing every 10 min is not an option either, as there is way too much data. Would you go through the struggle of implementing your own API to load the data asynchronously with AJAX? – stulleman Oct 09 '18 at 16:27
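
A minimal sketch of what dirkgroten suggests above: render the page immediately without the external data, and expose a small JSON endpoint that the page fetches afterwards (the view name, module, and helper are made up; `get_result` is the DB-backed helper sketched in the question):

```python
# views.py — hypothetical JSON endpoint the page fetches via AJAX after loading.
from django.http import JsonResponse

from .services import get_result  # hypothetical module holding the caching helper

def resource_data(request, resource_id):
    # Only this request blocks on the (possibly stale) external API;
    # the initial page load no longer does.
    return JsonResponse({"data": get_result(resource_id)})
```

The template would show a spinner and swap in the data once the fetch to this endpoint returns, so the page itself never blocks on the external API.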

0 Answers