Imagine the following scenario:
from flask import Flask
from flask_cache import Cache  # flask.ext.* is gone in modern Flask; import the package directly

app = Flask(__name__)
cache = Cache(app, config={'CACHE_TYPE': 'simple'})

@cache.memoize()  # memoize must be called with parentheses
def my_cached_func():
    x = load_some_big_csv()
    return x

@app.route('/')
def index():
    maybe_cached = my_cached_func()
    return maybe_cached
with SimpleCache as a backend.
A) If my app runs with 16 worker processes (one per core) and a user's request lands on worker 1, will the cache only be filled in that one process, or will the cached value be visible from all workers? Since each process has its own memory space (the usual Python multiprocessing model), it will only be cached in that one process, right?
So if 16 users hit my endpoint, one per worker, I'll end up with 16 copies of the same cached value in RAM.
B)
What happens if 10 users hit the endpoint on the same worker process? Will Flask hand each request a reference to the cached value (no additional memory allocated for the 10 requests), or a copy of the value itself (memory consumption 10x during the 10 sequential but nearly parallel requests)?
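For B), a quick pure-Python check (independent of Flask, using a hypothetical dict-based memoizer) shows that returning a cached object hands out a reference, not a copy:

```python
# Returning a cached value in Python yields a reference to the same object;
# no new copy of the underlying data is allocated per call.
_cache = {}

def memoized():
    if "data" not in _cache:
        _cache["data"] = list(range(1_000_000))  # allocated exactly once
    return _cache["data"]

a = memoized()
b = memoized()
print(a is b)  # True: both names point at the one cached object
```

Whether extra memory is then used per request depends on what the framework does with that object afterwards (e.g. serializing it into a response body), not on the cache lookup itself.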