Is There Any LRU Implementation of IDictionary?

Is there any LRU implementation of IDictionary?

There is nothing in the base class libraries that does this.

On the free side, maybe something like C5's HashedLinkedList would work.

If you're willing to pay, maybe check out this C# toolkit. It contains an implementation.

Sorted Dictionary sorted on value in C# (LRU cache)

What you should do is keep two dictionaries, one sorted by time and one by keys.

Remember that dictionaries are only holding references to your actual objects, so which dictionary you use to update the object doesn't matter.

To update an object, create a function that updates both dictionaries:

var oldObj = keyedObject[key];
timedObjects.Remove(oldObj.LastUpdateTime);
timedObjects.Add(myUpdatedObject.LastUpdateTime,myUpdatedObject);
keyedObject[key] = myUpdatedObject;

Now you are tracking the same object by both time and key.

I am keeping only one reference to an object in timedObjects. This helps while removing.

You can keep trimming your timedObjects dictionary as required.

Of course, while trimming you must bear in mind that the other dictionary, keyedObject, holds a reference to the same object. Merely calling Remove on timedObjects is not enough.

Your remove code will have to be like this:

removeObject = timedObjects[timeToRemove];
timedObjects.Remove(timeToRemove);
keyedObject.Remove(removeObject.key);

timeToRemove will mostly come from a loop in which you decide which objects to remove.
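The same two-dictionary bookkeeping can be sketched in Python. This is illustrative only: the class name TwoDictCache and its members are invented for the example, and a logical counter stands in for LastUpdateTime to keep the sketch deterministic.

```python
class TwoDictCache:
    """Two-dictionary bookkeeping: one dict keyed by lookup key, one keyed
    by last-update "time", both holding references to the same entry."""

    def __init__(self):
        self.keyed = {}   # key -> entry
        self.timed = {}   # update time -> entry (one reference per entry)
        self._clock = 0   # logical clock; real code might use a timestamp

    def put(self, key, value):
        self._clock += 1
        old = self.keyed.get(key)
        if old is not None:
            del self.timed[old['time']]   # drop the stale time index first
        entry = {'key': key, 'value': value, 'time': self._clock}
        self.keyed[key] = entry
        self.timed[self._clock] = entry

    def get(self, key):
        return self.keyed[key]['value']

    def trim(self, max_size):
        # Evict the oldest entries; remove from BOTH dicts, as noted above.
        while len(self.keyed) > max_size:
            oldest = min(self.timed)
            entry = self.timed.pop(oldest)
            del self.keyed[entry['key']]

cache = TwoDictCache()
for k, v in [('a', 1), ('b', 2), ('a', 10), ('c', 3)]:
    cache.put(k, v)
cache.trim(2)
print(sorted(cache.keyed))   # ['a', 'c']  ('b' was least recently updated)
```

Note that trim removes each evicted entry from both dictionaries, which is exactly the pitfall the answer warns about.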

Best choice for growing, fixed capacity generic sequence

It really depends on what you specifically mean by "oldest entry".

If you are looking for a FIFO structure, then it is possible to extend the ConcurrentQueue<T> to eject the oldest item (first item entered). (Copied from this answer).

public class FixedSizedQueue<T> : ConcurrentQueue<T>
{
    private readonly object syncObject = new object();

    public int Size { get; private set; }

    public FixedSizedQueue(int size)
    {
        Size = size;
    }

    public new void Enqueue(T obj)
    {
        base.Enqueue(obj);
        lock (syncObject)
        {
            while (base.Count > Size)
            {
                T outObj;
                base.TryDequeue(out outObj);
            }
        }
    }
}
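As an aside, the same bounded-FIFO behavior exists out of the box in Python's standard library, which can be handy for comparing semantics: a collections.deque created with maxlen silently discards items from the opposite end once the bound is reached.

```python
from collections import deque

q = deque(maxlen=3)          # bounded FIFO: oldest items fall off automatically
for item in [1, 2, 3, 4, 5]:
    q.append(item)
print(list(q))               # [3, 4, 5]
```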

If you are looking for a cache that keeps track of when the last item was accessed, and ejects the one that was accessed least recently (sort of like the sliding expiration functionality in System.Runtime.Caching), you could use a Least Recently Used (LRU) cache.

There is a high performance thread-safe .NET implementation of one called the LurchTable in the CSharpTest.Net.Collections project, which is available on NuGet.

Introducing the LurchTable as a C# version of LinkedHashMap

For other options, see

  • Is it there any LRU implementation of IDictionary?
  • LRUCache.NET
  • A High Performance Multi-Threaded LRU Cache
  • LRU Cache with C#
  • LRUPractice.cs

LRU cache with a singly linked list

You are correct: a singly linked list can be used instead of a doubly linked list, as can be seen here:

The standard way is a hashmap pointing into a doubly linked list to make delete easy. To do it with a singly linked list without using an O(n) search, have the hashmap point to the preceding node in the linked list (the predecessor of the one you care about, or null if the element is at the front).

Retrieve list node:
node = hashmap(key) ? hashmap(key)->next : list.head

Delete (eliding the edge cases where the node is at the head or tail of the list):
successornode = hashmap(key)->next->next
hashmap(successornode->key) = hashmap(key)
hashmap(key)->next = successornode
hashmap.delete(key)

Why is the doubly linked list so common in LRU solutions, then? It is easier to understand and use.

If optimization is an issue, the slightly less simple singly-linked-list solution is definitely worth the trade-off.
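The scheme above can be made concrete with a short sketch. This is an illustrative Python implementation of the stated design (the hash map points at each key's predecessor node, or None when the node is at the head; the head is the least recently used end, the tail the most recently used); names like LRUCache and _unlink are invented here, and the head/tail edge cases elided in the pseudocode are handled explicitly.

```python
class Node:
    __slots__ = ('key', 'value', 'next')
    def __init__(self, key, value):
        self.key, self.value, self.next = key, value, None

class LRUCache:
    """Singly-linked-list LRU: pred maps key -> predecessor node
    (None when the node is at the head), so deletion is O(1)."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.pred = {}      # key -> predecessor node (or None at head)
        self.head = None    # least recently used end
        self.tail = None    # most recently used end

    def _unlink(self, key):
        # Remove key's node from the list via its predecessor.
        prev = self.pred.pop(key)
        node = prev.next if prev is not None else self.head
        succ = node.next
        if prev is None:
            self.head = succ
        else:
            prev.next = succ
        if succ is not None:
            self.pred[succ.key] = prev   # successor's predecessor changes
        else:
            self.tail = prev
        node.next = None
        return node

    def _append(self, node):
        # Link node at the tail (most recently used position).
        self.pred[node.key] = self.tail
        if self.tail is not None:
            self.tail.next = node
        else:
            self.head = node
        self.tail = node

    def get(self, key):
        if key not in self.pred:
            return None
        self._append(self._unlink(key))   # move to MRU position
        return self.tail.value

    def put(self, key, value):
        if key in self.pred:
            node = self._unlink(key)
            node.value = value
        else:
            if len(self.pred) >= self.capacity:
                self._unlink(self.head.key)   # evict least recently used
            node = Node(key, value)
        self._append(node)

cache = LRUCache(2)
cache.put('a', 1)
cache.put('b', 2)
cache.get('a')          # touch 'a'; 'b' is now least recently used
cache.put('c', 3)       # evicts 'b'
print(cache.get('b'))   # None
print(cache.get('a'))   # 1
```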

Using @functools.lru_cache with dictionary arguments

Instead of writing a custom hashable dictionary, use this and avoid reinventing the wheel: frozendict is an immutable, hashable dictionary.

https://pypi.org/project/frozendict/

Code:

import functools
from frozendict import frozendict

def freezeargs(func):
    """Transform mutable dictionary arguments into immutable frozendicts.
    Useful for compatibility with lru_cache."""
    @functools.wraps(func)
    def wrapped(*args, **kwargs):
        args = tuple(frozendict(arg) if isinstance(arg, dict) else arg for arg in args)
        kwargs = {k: frozendict(v) if isinstance(v, dict) else v for k, v in kwargs.items()}
        return func(*args, **kwargs)
    return wrapped

and then

@freezeargs
@functools.lru_cache
def func(...):
    pass

Code adapted from @fast_cen's answer.

Note: this does not work on recursive data structures; for example, an argument might contain a list, which is unhashable. You are invited to make the wrapping recursive, so that it goes deep into the data structure and converts every dict to a frozendict and every list to a tuple.

(I know the OP no longer wants a solution, but I came here looking for the same thing, so I'm leaving this for future generations.)
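Following the note's suggestion, here is one way to sketch the recursive variant using only the standard library (no frozendict): dicts and lists are deep-frozen into hashable tagged tuples for the cache key and thawed back before the wrapped function runs. The names deep_freeze, deep_thaw, and freeze_deep_args are invented for this sketch; the tagged-tuple encoding would collide with real tuples that happen to start with the tag, and only positional arguments are handled, for brevity.

```python
import functools

def deep_freeze(obj):
    # Recursively turn dicts into sorted item tuples and lists into tuples,
    # so arbitrarily nested structures become hashable.
    if isinstance(obj, dict):
        return ('__dict__', tuple(sorted((k, deep_freeze(v)) for k, v in obj.items())))
    if isinstance(obj, list):
        return ('__list__', tuple(deep_freeze(v) for v in obj))
    return obj

def deep_thaw(obj):
    # Inverse of deep_freeze: rebuild the original dicts and lists.
    if isinstance(obj, tuple) and len(obj) == 2 and obj[0] == '__dict__':
        return {k: deep_thaw(v) for k, v in obj[1]}
    if isinstance(obj, tuple) and len(obj) == 2 and obj[0] == '__list__':
        return [deep_thaw(v) for v in obj[1]]
    return obj

def freeze_deep_args(func):
    # Cache on the frozen (hashable) form, thaw before calling the function.
    @functools.lru_cache(maxsize=None)
    def cached(*frozen_args):
        return func(*(deep_thaw(a) for a in frozen_args))

    @functools.wraps(func)
    def wrapped(*args):                     # positional args only, for brevity
        return cached(*(deep_freeze(a) for a in args))
    wrapped.cache_info = cached.cache_info  # expose stats, lru_cache-style
    return wrapped

calls = []

@freeze_deep_args
def total(config):
    calls.append(1)                         # count actual executions
    return sum(config['weights'])

print(total({'weights': [1, 2, 3]}))        # 6, computed
print(total({'weights': [1, 2, 3]}))        # 6, served from the cache
print(len(calls))                           # 1
```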


