Questions tagged [lru]

LRU is a family of caching algorithms, where LRU stands for least recently used.

As the name implies, an LRU cache discards the least recently used items first when its contents exceed a certain threshold. In this algorithm, writes are relatively expensive, because the usage information must be updated every time an item is accessed.

More information at Wikipedia.
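The eviction behavior described above can be sketched in a few lines of Python using `collections.OrderedDict`, which keeps keys in access order when combined with `move_to_end` (a minimal sketch, not production code; the class name and API are illustrative):

```python
from collections import OrderedDict

class LRUCache:
    """Minimal LRU cache: evicts the least recently used key once capacity is exceeded."""

    def __init__(self, capacity):
        self.capacity = capacity
        self._data = OrderedDict()

    def get(self, key):
        if key not in self._data:
            return None
        # Accessing a key makes it the most recently used.
        self._data.move_to_end(key)
        return self._data[key]

    def put(self, key, value):
        if key in self._data:
            self._data.move_to_end(key)
        self._data[key] = value
        if len(self._data) > self.capacity:
            # popitem(last=False) removes the oldest (least recently used) entry.
            self._data.popitem(last=False)
```

Both `get` and `put` are amortized O(1), which is why the hash-map-plus-ordering combination (here `OrderedDict`, in Java `LinkedHashMap`) recurs throughout the questions below.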

304 questions
175
votes
21 answers

How would you implement an LRU cache in Java?

Please don't say EHCache or OSCache, etc. Assume for purposes of this question that I want to implement my own using just the SDK (learning by doing). Given that the cache will be used in a multithreaded environment, which data structures would you…
Hank Gay
  • 70,339
  • 36
  • 160
  • 222
112
votes
3 answers

How does Lru_cache (from functools) Work?

Especially when using recursive code there are massive improvements with lru_cache. I do understand that a cache is a space that stores data that has to be served fast and saves the computer from recomputing. How does the Python lru_cache from…
innicoder
  • 2,612
  • 3
  • 14
  • 29
89
votes
9 answers

Python functools lru_cache with instance methods: release object

How can I use functools.lru_cache inside classes without leaking memory? In the following minimal example the foo instance won't be released although going out of scope and having no referrer (other than the lru_cache). from functools import…
televator
  • 1,133
  • 1
  • 9
  • 8
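One commonly suggested pattern for the leak described in this question is to create the cached wrapper per instance in `__init__`, so the cache is stored on the object itself and is garbage-collected along with it (a sketch under the assumption that the method's arguments are hashable; the class and method names are illustrative):

```python
from functools import lru_cache

class Foo:
    def __init__(self):
        # Wrap the method per instance: the cache lives on self and is
        # released together with the instance, so no module-level
        # lru_cache reference keeps the object alive.
        self.expensive = lru_cache(maxsize=128)(self._expensive)

    def _expensive(self, x):
        return x * x
```

A side effect of this pattern is that each instance gets an independent cache, which is often what the per-instance questions below are after anyway.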
88
votes
14 answers

LRU cache design

Least Recently Used (LRU) Cache is to discard the least recently used items first How do you design and implement such a cache class? The design requirements are as follows: 1) find the item as fast as we can 2) Once a cache misses and a cache is…
user297850
  • 7,705
  • 17
  • 54
  • 76
86
votes
4 answers

What is the difference between LRU and LFU

What is the difference between LRU and LFU cache implementations? I know that LRU can be implemented using LinkedHashMap. But how to implement LFU cache?
Javadroider
  • 2,280
  • 1
  • 22
  • 45
78
votes
2 answers

Make @lru_cache ignore some of the function arguments

How can I make @functools.lru_cache decorator ignore some of the function arguments with regard to caching key? For example, I have a function that looks like this: def find_object(db_handle, query): # (omitted code) return result If I…
WGH
  • 3,222
  • 2
  • 29
  • 42
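One way to keep an argument like `db_handle` out of the cache key is to close over it instead of passing it, so `lru_cache` only ever sees the remaining arguments (a sketch; `FakeDB` and its `lookup` method are hypothetical stand-ins for a real database handle):

```python
from functools import lru_cache

class FakeDB:
    """Hypothetical stand-in for a real database handle."""
    def __init__(self):
        self.calls = 0

    def lookup(self, query):
        self.calls += 1
        return query.upper()

def make_finder(db_handle):
    # db_handle is captured by the closure, so only `query`
    # contributes to the lru_cache key.
    @lru_cache(maxsize=128)
    def find_object(query):
        return db_handle.lookup(query)
    return find_object
```

This also sidesteps the requirement that every cached argument be hashable, which a raw connection handle may not be.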
76
votes
5 answers

Easy, simple to use LRU cache in java

I know it's simple to implement, but I want to reuse something that already exist. Problem I want to solve is that I load configuration (from XML so I want to cache them) for different pages, roles, ... so the combination of inputs can grow quite…
Juraj
  • 1,951
  • 2
  • 21
  • 30
64
votes
8 answers

How to limit the size of a dictionary?

I'd like to work with a dict in python, but limit the number of key/value pairs to X. In other words, if the dict is currently storing X key/value pairs and I perform an insertion, I would like one of the existing pairs to be dropped. It would be…
anthony
  • 40,424
  • 5
  • 55
  • 128
50
votes
3 answers

Python LRU Cache Decorator Per Instance

Using the LRU Cache decorator found here: http://code.activestate.com/recipes/578078-py26-and-py30-backport-of-python-33s-lru-cache/ from lru_cache import lru_cache class Test: @lru_cache(maxsize=16) def cached_method(self, x): …
crzysdrs
  • 503
  • 1
  • 4
  • 4
38
votes
10 answers

LRU cache implementation in Javascript

Java has LinkedHashMap which gets you 99% there to an LRU cache. Is there a Javascript implementation of an LRU cache, preferably from a reputable source, that is: understandable efficient (amortized O(1) get/put/delete) ? I've been searching on…
Jason S
  • 184,598
  • 164
  • 608
  • 970
38
votes
13 answers

Is there any LRU implementation of IDictionary?

I would like to implement a simple in-memory LRU cache system and I was thinking about a solution based on an IDictionary implementation which could handle a hashed LRU mechanism. Coming from Java, I have experience with LinkedHashMap, which works…
Antonello
  • 1,326
  • 2
  • 16
  • 23
35
votes
3 answers

Use functools' @lru_cache without specifying maxsize parameter

The documentation for lru_cache gives the function definition: @functools.lru_cache(maxsize=128, typed=False) This says to me that maxsize is optional. However, it doesn't like being called without an argument: Python 3.6.3 (default, Oct 24 2017,…
Tom Hale
  • 40,825
  • 36
  • 187
  • 242
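The confusion in this question stems from `lru_cache` being a decorator *factory*: `maxsize` is optional, but before Python 3.8 the call parentheses are not. A minimal illustration:

```python
from functools import lru_cache

# The parentheses matter on Python < 3.8: lru_cache must be *called*
# to produce the actual decorator. maxsize then defaults to 128.
@lru_cache()
def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)

# Since Python 3.8 the bare form @lru_cache (no parentheses) is also accepted.
```

On the Python 3.6 interpreter shown in the question, only the called form works.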
35
votes
6 answers

Use LinkedHashMap to implement LRU cache

I was trying to implement a LRU cache using LinkedHashMap. In the documentation of LinkedHashMap (http://docs.oracle.com/javase/7/docs/api/java/util/LinkedHashMap.html), it says: Note that insertion order is not affected if a key is re-inserted…
Lei Chen
  • 699
  • 1
  • 5
  • 13
29
votes
6 answers

LRU implementation in production code

I have some C++ code where I need to implement cache replacement using LRU technique. So far I know two methods to implement LRU cache replacement: Using timeStamp for each time the cached data is accessed and finally comparing the timeStamps at…
sud03r
  • 19,109
  • 16
  • 77
  • 96
18
votes
2 answers

Get the last time a given Redis key was accessed

I'd like to view the time of most recent access for a specific key on my redis server. I know that this information is stored for each key because it is used in redis's LRU algorithm for eliminating old keys. Is there an easy way to see this…
Peter Berg
  • 6,006
  • 8
  • 37
  • 51