We have an API (ASP.NET Web API 2, to be specific) which needs to traverse a very large collection of entities in some algorithmic fashion in order to compute the desired result.
This collection is the same for every query performed. It forms the data "source" for every call.
Properties of the collection:
- Ordered
- Large (hundreds of thousands of objects)
- Read-only
- Must be independently traversable by concurrent threads from the application pool
- The items in the collection are objects consisting of only a few value types.
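Given that each item holds only a few value types, one option is a small immutable struct stored in a plain array: the data stays compact and cache-friendly, and because access is read-only, each request thread can walk the array with its own local cursor without any locking. A minimal sketch (the `Item` type and its fields are hypothetical, standing in for whatever the real entities contain):

```csharp
// Hypothetical item type: a few value-type fields, immutable after construction.
public readonly struct Item
{
    public readonly int Key;
    public readonly double Score;

    public Item(int key, double score)
    {
        Key = key;
        Score = score;
    }
}

public static class Traversal
{
    // Each concurrent request calls this with its own stack-local index,
    // so threads traverse the shared array independently; no locks are
    // needed because the array is never mutated after it is built.
    public static double SumScores(Item[] items)
    {
        double total = 0;
        for (int i = 0; i < items.Length; i++)
        {
            total += items[i].Score;
        }
        return total;
    }
}
```

An array of structs also avoids per-item heap allocations and GC pressure, which matters at hundreds of thousands of items.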
Reading this collection from disk or a database on every request to the API introduces a lot of overhead, so we persist it statically, almost as a singleton. This seems to work: static objects are shared between all requests to the application and have the same lifetime as the application domain, which gives us near-instant responses from the API.
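The static-singleton approach described above can be made explicit and thread-safe at first access with `Lazy<T>`, which guarantees the expensive load runs exactly once even if several requests arrive before the cache is warm. A sketch under assumed names (`EntityCache`, `Entity`, and `LoadFromStore` are placeholders, not part of the original code):

```csharp
using System;
using System.Collections.Generic;
using System.Collections.ObjectModel;

// Hypothetical entity: a few value-type properties, per the description.
public class Entity
{
    public int Id { get; set; }
    public double Value { get; set; }
}

public static class EntityCache
{
    // Lazy<T> with the default thread-safety mode ensures LoadFromStore
    // executes once, no matter how many requests race on first access.
    // ReadOnlyCollection enforces the read-only contract afterwards.
    private static readonly Lazy<ReadOnlyCollection<Entity>> _entities =
        new Lazy<ReadOnlyCollection<Entity>>(LoadFromStore);

    public static ReadOnlyCollection<Entity> Entities
    {
        get { return _entities.Value; }
    }

    private static ReadOnlyCollection<Entity> LoadFromStore()
    {
        // Placeholder: read the ordered collection once from disk/database.
        var items = new List<Entity>();
        return items.AsReadOnly();
    }
}
```

Note that, like any static state in ASP.NET, this cache is lost on application-pool recycles and is rebuilt on the next cold request, so the first call after a recycle pays the full load cost.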
Is there perhaps a better pattern, practice, or framework for this kind of problem?