Jan 28, 2014 at 4:43 PM
Another opportunity for making the cache more useful is to define a limit on the storage size, so the cache can automatically ensure it doesn't try to use more than the available space and start throwing errors. See, for example, the implementation of MemoryCache.CacheMemoryLimit.

I think this one is actually more important than parallelizing the file system accesses. I'm getting files in and out in a few milliseconds. When I create the cache in a location with a defined size, the app stops working as the number of items in the cache grows.

Jan 29, 2014 at 3:38 PM
That makes sense. I've been toying around with a similar idea regarding polling for expired data as FC leaves a lot of orphaned records. Unfortunately, I'm tied up in another project so I don't know how soon I'll be able to get to your suggestions. If you need the changes right away, feel free to download the source and submit a patch :)
Feb 24, 2014 at 1:06 AM
With a memory cache, you can instantly figure out how much memory your cache is consuming. With files stored on disk, you'd have to do a complete traversal of the entire cache to figure out its size. As such, I'm a little worried about a slowdown when the cache calculates or recalculates its size. Do you have any suggestions?
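To illustrate the concern, here is a minimal Python sketch (the library itself is .NET; the function name and layout here are assumptions for illustration) of what a full-traversal size calculation looks like. Its cost grows linearly with the number of cached files:

```python
import os

def cache_size_bytes(root: str) -> int:
    """Walk the cache directory and sum file sizes (O(n) in item count)."""
    total = 0
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            total += os.path.getsize(os.path.join(dirpath, name))
    return total
```

Running this on every size check would re-scan the whole directory tree each time, which is exactly the slowdown described above.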
Feb 24, 2014 at 1:49 AM
Keep a list of item sizes as they come in, cache the list, and only scan under extreme circumstances. I'm working on such an approach, but I have several other projects going on.


Feb 24, 2014 at 1:57 AM
Okay, I've got something implemented and will push soon. Here's how it works:
  • On object creation, check current size of the cache (optional)
  • On item write, subtract the item's old size from the running total and add its new size.
  • If current cache size > maximum specified cache size, raise MaxCacheSizeReached event, which can be handled by user code
I've also added a Flush() method that will remove all cache entries based on their last access date. It's a little more explicit than what MemoryCache does, but you have more control over how much information is deleted.
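A last-access-based flush could look roughly like this Python sketch (a hypothetical illustration only; the real Flush() is C#, and the flat file layout and parameter name here are assumptions):

```python
import os
import time

def flush(root: str, older_than_secs: float):
    """Remove cache files not accessed within the last `older_than_secs`."""
    cutoff = time.time() - older_than_secs
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if os.path.getatime(path) < cutoff:
                os.remove(path)
```

Because the caller picks the cutoff, this gives finer control over how much data is purged than an all-or-nothing clear.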