What is the Cachelot Library
If your application needs an LRU cache that works at the speed of light, that is exactly what the Cachelot library is.
The library works within a fixed, pre-allocated block of memory: you specify the memory size, and the LRU cache is ready.
Metadata overhead is small, allowing up to 98% memory utilization.
Beyond memory management, Cachelot ensures smooth responsiveness, with no latency "gaps" for either read or write operations.
Cachelot can work as a consistent cache, returning an error when it runs out of memory, or it can evict old items to free space for new ones.
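The fixed-memory, evict-or-error behavior above can be illustrated with a small sketch built on standard containers. Everything here (the class name, the simplified byte accounting of `key.size() + value.size()`, the `evict_on_full` flag) is a hypothetical illustration of the idea, not Cachelot's actual interface.

```cpp
// A minimal sketch of a fixed-memory LRU cache with evict-or-error semantics.
// The names and the simplified byte accounting are illustrative only; this is
// NOT Cachelot's real API.
#include <cstddef>
#include <list>
#include <optional>
#include <string>
#include <unordered_map>

class FixedMemoryLruCache {
public:
    // memory_limit: byte budget for keys + values (real metadata is ignored here).
    // evict_on_full: true  -> evict least-recently-used items to make room;
    //                false -> "consistent" mode: reject the write instead.
    FixedMemoryLruCache(std::size_t memory_limit, bool evict_on_full)
        : limit_(memory_limit), evict_(evict_on_full) {}

    bool set(const std::string& key, const std::string& value) {
        const std::size_t cost = key.size() + value.size();
        if (cost > limit_) return false;                   // can never fit
        auto it = index_.find(key);
        const std::size_t old_cost =
            (it != index_.end()) ? it->second->key.size() + it->second->value.size() : 0;
        if (!evict_ && used_ - old_cost + cost > limit_) {
            return false;                                  // consistent mode: "out of memory"
        }
        if (it != index_.end()) {                          // overwrite: drop the old entry first
            used_ -= old_cost;
            lru_.erase(it->second);
            index_.erase(it);
        }
        while (used_ + cost > limit_) {                    // eviction mode: free space from the cold end
            const Entry& oldest = lru_.back();
            used_ -= oldest.key.size() + oldest.value.size();
            index_.erase(oldest.key);
            lru_.pop_back();
        }
        lru_.push_front(Entry{key, value});                // most recently used lives at the front
        index_[key] = lru_.begin();
        used_ += cost;
        return true;
    }

    std::optional<std::string> get(const std::string& key) {
        auto it = index_.find(key);
        if (it == index_.end()) return std::nullopt;       // miss
        lru_.splice(lru_.begin(), lru_, it->second);       // touch: move the entry to the front
        return it->second->value;
    }

private:
    struct Entry { std::string key, value; };
    std::size_t limit_;
    bool evict_;
    std::size_t used_ = 0;
    std::list<Entry> lru_;
    std::unordered_map<std::string, std::list<Entry>::iterator> index_;
};
```

Constructed with, say, a 64 MB budget and `evict_on_full = true`, such a cache evicts under memory pressure; with `false`, it reports an error instead, mirroring the two modes described above.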
The code is highly optimized C++. You can use cachelot on platforms where resources are limited, such as IoT or handheld devices, as well as on servers with tons of RAM.
All this allows you to store and access three million items per second (depending on the CPU cache size). Maybe 3M ops/sec doesn't sound like a large number, but it means only ~333 nanoseconds are spent on a single operation, while a single RAM reference costs ~100 nanoseconds. That is only about three RAM references per request; can you compete with that?
There are benchmarks in the repository; we encourage you to try them for yourself.
It is possible to create bindings and use cachelot from your programming language of choice: Python, Go, Java, etc.
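Since Cachelot exposes a C++ interface, the usual route to other languages is a thin C ABI shim that the target language's FFI (Python ctypes/cffi, Go cgo, Java JNI, and so on) can load from a shared library. The wrapper below is a hypothetical sketch around the illustrative class from the earlier example; Cachelot does not ship these functions.

```cpp
// Hypothetical C ABI shim (not part of Cachelot) around the FixedMemoryLruCache
// sketch above. Compiled into a shared library, these unmangled symbols can be
// called from Python, Go, Java, etc.
#include <cstddef>
#include <cstring>

extern "C" {

typedef void* cachelot_handle;  // opaque handle passed across the language boundary

cachelot_handle cachelot_open(std::size_t memory_limit, int evict_on_full) {
    return new FixedMemoryLruCache(memory_limit, evict_on_full != 0);
}

int cachelot_set(cachelot_handle h, const char* key, const char* value) {
    return static_cast<FixedMemoryLruCache*>(h)->set(key, value) ? 0 : -1;
}

// Copies the value into buf (NUL-terminated, at most buf_len - 1 bytes).
// Returns 0 on a hit, -1 on a miss.
int cachelot_get(cachelot_handle h, const char* key, char* buf, std::size_t buf_len) {
    if (buf_len == 0) return -1;
    auto hit = static_cast<FixedMemoryLruCache*>(h)->get(key);
    if (!hit) return -1;
    std::strncpy(buf, hit->c_str(), buf_len - 1);
    buf[buf_len - 1] = '\0';
    return 0;
}

void cachelot_close(cachelot_handle h) {
    delete static_cast<FixedMemoryLruCache*>(h);
}

}  // extern "C"
```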