In this post, I describe how to use MemoryCache and track the lifetime of cached entities in .NET applications.
Intro
Caching is one of the best-known ways to improve application performance and reduce the load on a data provider. In .NET, the Microsoft.Extensions.Caching packages support different storage backends: Redis, SQL Server, Cosmos, and memory. The last one is the simplest: it just stores data in RAM. You can use it by installing Microsoft.Extensions.Caching.Memory from NuGet.
Using MemoryCache
Internally, MemoryCache keeps data in a ConcurrentDictionary, so set/get operations are thread-safe. The interaction with MemoryCache is described by the IMemoryCache interface, which has several methods to set, get, and remove a value. The usual algorithm for working with a cache is:
1. Try to get a value from the cache.
2. If you didn’t get the value from the cache, request it from the data source and save the received value in the cache.
Simple example:
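A minimal sketch of this pattern (the key and the loader delegate here are hypothetical placeholders, not part of any library API):

```csharp
using System;
using Microsoft.Extensions.Caching.Memory;

// A cache with a bounded size; every entry must then declare its Size.
var cache = new MemoryCache(new MemoryCacheOptions
{
    SizeLimit = 1024
});

string GetOrUpdate(string key, Func<string> loadFromSource)
{
    // 1. Try to get the value from the cache.
    if (cache.TryGetValue(key, out string cached))
        return cached;

    // 2. On a miss, ask the data source and store the result.
    var value = loadFromSource();
    cache.Set(key, value, new MemoryCacheEntryOptions
    {
        Size = 1, // counts against SizeLimit
        AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(5),
        SlidingExpiration = TimeSpan.FromMinutes(1)
    });
    return value;
}

Console.WriteLine(GetOrUpdate("answer", () => "42")); // miss: loads "42"
Console.WriteLine(GetOrUpdate("answer", () => "??")); // hit: still "42"
```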
We should be careful with memory usage and prevent uncontrolled allocation. Therefore, in the above code we limit the cache size and the lifetime of entities via MemoryCacheEntryOptions.
Tracking entities
One important part of caching is updating the data in the cache store. In the above code, we check whether the entity exists when receiving a value. If it doesn’t, we get the value from the data provider and then save it in the cache. For this purpose, the IMemoryCache extensions also provide the GetOrCreate<TItem>(object key, Func<ICacheEntry, TItem> factory) method.
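GetOrCreate combines the two steps above; a short sketch (the key and the returned value are illustrative):

```csharp
using System;
using Microsoft.Extensions.Caching.Memory;

var cache = new MemoryCache(new MemoryCacheOptions());

// GetOrCreate runs the factory only on a cache miss;
// the ICacheEntry argument lets us configure expiration inline.
var value = cache.GetOrCreate("user:1", entry =>
{
    entry.SlidingExpiration = TimeSpan.FromMinutes(1);
    return "Alice"; // in real code: a call to the data provider
});

Console.WriteLine(value); // "Alice"
```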
I should also mention how the expiration mechanism works in MemoryCache. When we try to get a value from the cache, it fetches the value from the inner dictionary and checks whether it has expired. If it has, the entry is marked as expired and then removed; if not, the cache simply returns the value. In both cases, the cache starts an invalidation process for all values in the inner dictionary on a separate thread. You can configure the period of this check via the ExpirationScanFrequency field of MemoryCacheOptions (one minute by default). MemoryCache also has compacting logic. It removes entities in the following order:
1. All expired items.
2. Items by priority. Lowest priority items are removed first.
3. Least recently used objects.
4. Items with the earliest absolute expiration.
5. Items with the earliest sliding expiration.
Pinned items with the NeverRemove priority are never removed. For these reasons, we have to track values in the cache and update them. The simplest way is to check the value with the TryGetValue method when reading it from the cache.
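The scan frequency, compaction behavior, and per-entry priorities can be configured as below; the keys and values are made up for illustration:

```csharp
using System;
using Microsoft.Extensions.Caching.Memory;

var cache = new MemoryCache(new MemoryCacheOptions
{
    SizeLimit = 100,
    // Background scan period for expired entries (default: 1 minute).
    ExpirationScanFrequency = TimeSpan.FromSeconds(30),
    // Fraction of entries to drop when SizeLimit is exceeded.
    CompactionPercentage = 0.25
});

// A pinned entry: never removed by compaction.
cache.Set("config", "important", new MemoryCacheEntryOptions
{
    Size = 1,
    Priority = CacheItemPriority.NeverRemove
});

// A low-priority entry: an early candidate for compaction.
cache.Set("report", "rebuildable", new MemoryCacheEntryOptions
{
    Size = 1,
    Priority = CacheItemPriority.Low,
    AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(5)
});

Console.WriteLine(cache.TryGetValue("config", out string _)); // True
```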
Issue: updating a value in MemoryCache
As we can see in the above code, it is possible to call GetOrUpdate from multiple threads, and the cache value will then be updated multiple times — a race condition. There are different ways to avoid it:
1. Using synchronization primitives. For example, you can use SemaphoreSlim:
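A sketch of the semaphore-guarded variant, with a double-check after acquiring the lock (the method and factory names are my own, not library API):

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Caching.Memory;

var cache = new MemoryCache(new MemoryCacheOptions());
var gate = new SemaphoreSlim(1, 1); // one writer at a time

async Task<string> GetOrUpdateAsync(string key, Func<Task<string>> factory)
{
    if (cache.TryGetValue(key, out string cached))
        return cached;

    await gate.WaitAsync();
    try
    {
        // Double-check: another thread may have filled the cache
        // while we were waiting on the semaphore.
        if (cache.TryGetValue(key, out cached))
            return cached;

        var value = await factory();
        cache.Set(key, value, TimeSpan.FromMinutes(5));
        return value;
    }
    finally
    {
        gate.Release();
    }
}

var results = await Task.WhenAll(
    GetOrUpdateAsync("k", () => Task.FromResult("v1")),
    GetOrUpdateAsync("k", () => Task.FromResult("v2")));

// Both callers observe the same value: only one factory call wins.
Console.WriteLine(results[0] == results[1]); // True
```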
We can also use Interlocked methods, as shown here, or a lock-per-key approach. Either way, this get-or-update logic breaks the single responsibility principle: one method both gets a value and sometimes sets a new one.
2. Using the Lazy<T> approach. Put a Lazy<T> instance instead of T into the cache. This way is described on StackOverflow, and there is also a LazyCache implementation. The lazy approach is better than the previous one: it keeps the single responsibility principle and evaluates the cached instance only when it is actually needed.
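A sketch of the lazy variant; the counter exists only to demonstrate that the expensive work runs once:

```csharp
using System;
using Microsoft.Extensions.Caching.Memory;

var cache = new MemoryCache(new MemoryCacheOptions());
int factoryCalls = 0;

string GetOrAdd(string key)
{
    // We cache a Lazy<string> rather than the string itself, so the
    // expensive computation runs at most once per cached wrapper,
    // and only when .Value is first requested.
    var lazy = cache.GetOrCreate(key, entry =>
    {
        entry.SlidingExpiration = TimeSpan.FromMinutes(1);
        return new Lazy<string>(() =>
        {
            factoryCalls++;
            return "expensive"; // simulate a slow data-provider call
        });
    });
    return lazy.Value;
}

Console.WriteLine(GetOrAdd("k")); // "expensive"
Console.WriteLine(GetOrAdd("k")); // "expensive" (cached)
Console.WriteLine(factoryCalls);  // 1
```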
3. Using callbacks. The official documentation mentions eviction callbacks. Let’s see what they are. Each entry in MemoryCache has a collection of callbacks of the PostEvictionCallbackRegistration type, which are invoked on a separate thread after the value is evicted from the cache. You can add your delegate with the RegisterPostEvictionCallback method on the MemoryCacheEntryOptions argument of the Set method.
The eviction reason is set to EvictionReason.Removed after explicit removal, EvictionReason.Expired after a lifetime check, EvictionReason.Replaced when the cache entry was updated, and EvictionReason.Capacity when the entry was removed due to capacity overflow.
A simple example:
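A sketch of registering a post-eviction callback; the event is only there to let the demo wait for the background thread:

```csharp
using System;
using System.Threading;
using Microsoft.Extensions.Caching.Memory;

var cache = new MemoryCache(new MemoryCacheOptions());
var evicted = new ManualResetEventSlim();

var options = new MemoryCacheEntryOptions()
    .RegisterPostEvictionCallback((key, value, reason, state) =>
    {
        // Runs on a separate thread after the entry leaves the cache;
        // a good place to schedule a background refresh.
        Console.WriteLine($"Evicted {key}: {reason}");
        evicted.Set();
    });

cache.Set("k", "v1", options);
cache.Set("k", "v2", options); // overwrites v1 -> reason: Replaced
cache.Remove("k");             // removes v2   -> reason: Removed

evicted.Wait(TimeSpan.FromSeconds(5));
```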
This allows starting a background update of evicted values and splitting the get/set logic — the single responsibility principle. But you still need to keep the race condition in mind. ICacheEntry also allows tracking changes via IChangeToken. I omit this topic, since it takes a similar approach and I don’t pretend to offer a comprehensive tutorial (it would definitely be a long read). You can read here about using change tokens.
Summary
In this post, I touched on some important issues of using a memory cache:
- You have to limit the size and the lifetime of entities. This reduces memory usage.
- Cache invalidation is triggered when you attempt to get a value. A background thread checks lifetimes and cleans expired entities out of the cache.
- MemoryCache is thread-safe, but it doesn’t prevent a race condition around the Set method, so I described several approaches to solve this problem. Each has pros and cons; choose the best one for your case. I also saw an interesting async approach based on TaskCompletionSource. You can learn more from the list below.
P.S. Read the source code. Thanks to open source, it is a good way to learn more than the documentation can teach you.
Resources: