How can I cache requests in FastAPI?

For example, there are two functions and a PostgreSQL database:

@app.get("/")
def home(request: Request):
  return templates.TemplateResponse("index.html", {"request": request})


@app.post("/api/getData")
async def getData(request: Request, databody = Body()):
  data = databody ['data']
  
  with connection.cursor() as cursor:
       cursor.execute(
              f'INSER INTO database (ip, useragent, datetime) VALUES ('request.headers['host']', 'request.headers['user-agent']', '{datetime.now()}')
       )
   return {'req': request}

The response is then processed by JavaScript and displayed on the HTML page.

    Generally, HTTP caching should live outside of the web application for scalability - i.e. do the caching in nginx or Varnish or something else. Let your application send `Cache-Control` headers that the reverse proxy/cache server respects. That way those requests will never even hit your API, meaning that FastAPI will never see them except when necessary (usually when a POST request happens). – MatsLindh Nov 08 '22 at 12:54
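
As an illustration of that suggestion, here is a minimal sketch of an endpoint that sets such a header (the route, payload, and max-age value are made up for the example; FastAPI lets you declare a Response parameter and set headers on it):

from fastapi import FastAPI, Response

app = FastAPI()


@app.get("/api/items")
async def list_items(response: Response):
    # A reverse proxy (nginx, Varnish, a CDN) that respects this header can
    # serve the cached response for up to 60 seconds without touching FastAPI.
    response.headers["Cache-Control"] = "public, max-age=60"
    return {"items": []}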

1 Answer

You can try fastapi-cache:

from fastapi import FastAPI

from fastapi_cache import FastAPICache
from fastapi_cache.backends.redis import RedisBackend
from fastapi_cache.decorator import cache

from redis import asyncio as aioredis

app = FastAPI()


# The decorator also works on plain coroutines, not only on endpoints
@cache()
async def get_cache():
    return 1


@app.get("/")
@cache(expire=60)  # cache this response in Redis for 60 seconds
async def index():
    return dict(hello="world")


@app.on_event("startup")
async def startup():
    # Initialize the cache backend once, when the application starts
    redis = aioredis.from_url("redis://localhost", encoding="utf8", decode_responses=True)
    FastAPICache.init(RedisBackend(redis), prefix="fastapi-cache")

– panda912
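
For reference, this is the long2ice fastapi-cache project, distributed on PyPI as fastapi-cache2. If you don't want to run Redis, it also ships an in-memory backend; a minimal sketch, assuming the InMemoryBackend import path matches the version you install (the cache is process-local, so it is only suitable for development or a single worker):

from fastapi import FastAPI

from fastapi_cache import FastAPICache
from fastapi_cache.backends.inmemory import InMemoryBackend

app = FastAPI()


@app.on_event("startup")
async def startup():
    # Process-local cache: no Redis needed, but not shared across workers
    FastAPICache.init(InMemoryBackend())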