MemoryCache provides a Default instance, and additional named caches can be created as needed.
It seems like there might be advantages to isolating the cached results of different processes in different instances. For example, results of queries against an index could be cached in an "IndexQueryResult" cache and results of database queries in a "DatabaseQueryResult" cache. That's a rather contrived example, but it illustrates the principle.
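Here's a rough sketch of what I have in mind (the cache names, keys, and expiration policy are just placeholders I made up for illustration):

```csharp
using System;
using System.Runtime.Caching;

class CacheIsolationSketch
{
    // Two named caches alongside MemoryCache.Default, one per result type.
    static readonly MemoryCache IndexQueryCache = new MemoryCache("IndexQueryResult");
    static readonly MemoryCache DatabaseQueryCache = new MemoryCache("DatabaseQueryResult");

    static void Main()
    {
        var policy = new CacheItemPolicy
        {
            SlidingExpiration = TimeSpan.FromMinutes(10)
        };

        // Each result type goes into its own instance, so the key spaces
        // (and, presumably, the eviction behaviour) are kept separate.
        IndexQueryCache.Set("query:foo", new[] { "doc1", "doc2" }, policy);
        DatabaseQueryCache.Set("SELECT * FROM Orders", new object(), policy);

        // Reads go against the matching instance; MemoryCache.Default is untouched.
        var hit = IndexQueryCache.Get("query:foo");
        Console.WriteLine(hit != null ? "index cache hit" : "index cache miss");
    }
}
```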
Does memory pressure that causes evictions in one cache affect the other caches at all? Are there any differences in the way .NET manages multiple caches compared to how it manages a single one?
Am I wasting my time considering the idea of multiple caches, or is there real value in doing so?