I have a Python Dash/Flask app with working memoization on non-callback functions. However, when I introduce a background_callback_manager, the memoization fails between calls: if a callback calls a memoized function twice, the second call successfully uses the cached data, but when the callback is triggered again, the memoized function has to be re-cached.

My hypothesis is that some session variable is getting hashed into the cache key, but I'm not familiar with the backends and the docs are sparse.
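To sanity-check that hypothesis independently of Dash, here is a minimal sketch in plain Python (no Dash or Flask-Caching; `memoize_with_extra` and the lambdas are hypothetical stand-ins for the `cache_by` mechanism). It shows that if a per-call value is mixed into the cache key, every lookup misses, which matches the symptom above:

```python
import hashlib
import itertools
import pickle

cache = {}

def memoize_with_extra(extra_key_fn):
    """Memoize fn, mixing extra_key_fn() into the cache key (mimics cache_by)."""
    def decorator(fn):
        def wrapper(*args):
            # If extra_key_fn() returns a different value on each call,
            # the key never matches a previous entry and we always recompute.
            raw = pickle.dumps((fn.__name__, args, extra_key_fn()))
            key = hashlib.sha256(raw).hexdigest()
            if key not in cache:
                cache[key] = fn(*args)
            return cache[key]
        return wrapper
    return decorator

calls = []

@memoize_with_extra(lambda: "stable")  # constant extra key -> cache hits
def slow_square(x):
    calls.append(x)
    return x * x

slow_square(3)
slow_square(3)
assert calls == [3]  # second call served from cache

counter = itertools.count()
calls2 = []

@memoize_with_extra(lambda: next(counter))  # per-call value -> always misses
def slow_cube(x):
    calls2.append(x)
    return x ** 3

slow_cube(2)
slow_cube(2)
assert calls2 == [2, 2]  # recomputed each time: the key never matches
```

If the real backend behaves like the second case, something session- or worker-scoped is being folded into the key between callback invocations.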

The reason I am using both methods of caching is that I wanted to use the progress argument of dash.callback(..., progress=[], ...).

My setup code for the caching systems is below.

import os
from uuid import uuid4

import dash
from dash import CeleryManager, DiskcacheManager
from flask_caching import Cache

launch_uid = uuid4()

EXPIRE = 600 

if "REDIS_URL" in os.environ:
    # Use Redis & Celery if REDIS_URL set as an env variable
    from celery import Celery

    celery_app = Celery(__name__, broker=os.environ['REDIS_URL'], backend=os.environ['REDIS_URL'])
    background_callback_manager = CeleryManager(
        celery_app, cache_by=[lambda: launch_uid], expire=EXPIRE
    )

else:
    # Diskcache for non-production apps when developing locally
    import diskcache

    cache = diskcache.Cache("./cache")
    background_callback_manager = DiskcacheManager(
        cache, cache_by=[lambda: launch_uid], expire=EXPIRE
    )

# Initialize dash app
app = dash.Dash(
    __name__,
    background_callback_manager=background_callback_manager
)

# data caching (https://dash.plotly.com/sharing-data-between-callbacks)
flask_cache = Cache(
    app.server,
    config={
        "CACHE_TYPE": "RedisCache",
        # Note that filesystem cache doesn't work on systems with ephemeral
        # filesystems like Heroku.
        "CACHE_TYPE": "FileSystemCache",
        "CACHE_DIR": "cache-directory",
        # should be equal to maximum number of users on the app at a single time
        # higher numbers will store more data in the filesystem / redis cache
        "CACHE_THRESHOLD": 200,
        "CACHE_DEFAULT_TIMEOUT": 0,
    },
)