Comment by locofocos
Can you pitch me on why I would want to use this, instead of Rails.cache.fetch (which supports TTL) powered by redis (with the "allkeys-lru" config option)?
Good question. I built this gem because I needed a few things that Rails.cache (and Redis) didn't quite cover:
- Local and zero-dependency. It caches per object in memory, so there's no Redis setup, no serialization, and no network latency.
- Isolated and self-managed. Caches aren't global; each object/method manages its own LRU + TTL lifecycle and can be cleared with instance helpers.
- Easy to use. You just declare the method, set the TTL and max size, and you're done. No key names, no block wrapping, no external config (see the sketch below).
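To make the idea concrete, here is a toy, self-contained sketch of per-object TTL memoization. It is only an illustration of the concept described above, not the gem's actual code: the `TinyMemo` module, the `memoize_with_ttl` macro, and the `WeatherClient` class are all hypothetical, and LRU eviction (the max-size part) is omitted for brevity.

```ruby
# Toy illustration only; the real gem's API and internals may differ.
# Each instance keeps its own { value, expires_at } entries, so there is
# no shared key namespace and nothing to configure externally.
module TinyMemo
  def memoize_with_ttl(method_name, ttl:)
    original = instance_method(method_name)
    define_method(method_name) do |*args|
      @__tiny_memo ||= {}
      entry = @__tiny_memo[[method_name, args]]
      if entry.nil? || Time.now > entry[:expires_at]
        entry = { value: original.bind(self).call(*args),
                  expires_at: Time.now + ttl }
        @__tiny_memo[[method_name, args]] = entry
      end
      entry[:value]
    end
  end
end

class WeatherClient
  extend TinyMemo

  def forecast(city)
    sleep 1                       # stands in for a slow upstream call
    "sunny in #{city}"
  end
  memoize_with_ttl :forecast, ttl: 300   # cached per instance for 5 minutes
end

client = WeatherClient.new
client.forecast("Lisbon")   # slow: computes and stores on this instance
client.forecast("Lisbon")   # fast: served from the instance's own cache
```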
For what it's worth, ActiveSupport::Cache::Store is a really flexible API with minimal contractual obligations (read_entry, write_entry, and delete_entry are the entire set of required methods), but it still lets you layer specific functionality (e.g. TTL) on top via an optional options param. You could get the best of both worlds by adhering to that contract; then people could swap in, e.g., the Redis cache store if they wanted a network-shared store.
EDIT: see https://github.com/rails/rails/blob/main/activesupport/lib/a...
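For concreteness, a minimal sketch of that contract might look like the following. The `MemoizedStore` name and the bare Hash backing are placeholders (a real adapter would delegate to the gem's own LRU/TTL structures), and the keyword-argument signatures shown match recent Rails versions:

```ruby
require "active_support"
require "active_support/cache"

# Placeholder store name; only the three private hooks are required.
class MemoizedStore < ActiveSupport::Cache::Store
  def initialize(options = nil)
    super(options)
    @data = {}
  end

  private

  # Entries are ActiveSupport::Cache::Entry objects, so TTL handling
  # (:expires_in) comes from the shared Store logic for free.
  def read_entry(key, **options)
    @data[key]
  end

  def write_entry(key, entry, **options)
    @data[key] = entry
    true
  end

  def delete_entry(key, **options)
    @data.delete(key) ? true : false
  end
end

# Usage: the familiar Rails.cache-style calls work against the custom store.
# store = MemoizedStore.new
# store.write("answer", 42, expires_in: 60)
# store.fetch("answer") { 42 }
```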
That's actually a really good idea! I'll definitely consider this in a future update. Thanks!
I'm not OP, nor have I read through all the code, but this gem has no external dependencies and runs in a single process (as does ActiveSupport::Cache::MemoryStore). That could be a reason why you should, or why you should not, use this gem, depending on your use case.