# sieve

A simple yet efficient cache, based on the SIEVE algorithm (see the original introduction).

*(algorithm animation)*
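
For readers unfamiliar with SIEVE, the idea is: entries sit in a FIFO list with one visited bit each, a hit only sets the bit, and a moving hand evicts the first unvisited entry it finds, clearing bits along the way. The following is an illustrative sketch only, not this package's actual implementation; all names in it are assumptions.

```ts
// Minimal SIEVE sketch (illustrative; not the internals of @zf/sieve).
interface Entry<V> {
  key: string
  value: V
  visited: boolean
  prev: Entry<V> | null // toward newer entries
  next: Entry<V> | null // toward older entries
}

class SieveSketch<V> {
  private map = new Map<string, Entry<V>>()
  private head: Entry<V> | null = null // newest entry
  private tail: Entry<V> | null = null // oldest entry
  private hand: Entry<V> | null = null // eviction pointer

  constructor(private capacity: number) {}

  get(key: string): V | undefined {
    const e = this.map.get(key)
    if (!e) return undefined
    e.visited = true // a hit only sets the visited bit; nothing moves
    return e.value
  }

  set(key: string, value: V): void {
    const existing = this.map.get(key)
    if (existing) {
      existing.value = value
      existing.visited = true
      return
    }
    if (this.map.size >= this.capacity) this.evict()
    const e: Entry<V> = { key, value, visited: false, prev: null, next: this.head }
    if (this.head) this.head.prev = e
    this.head = e
    if (!this.tail) this.tail = e
    this.map.set(key, e)
  }

  private evict(): void {
    // The hand walks from the oldest entry toward the newest, clearing
    // visited bits, and evicts the first entry whose bit is already clear.
    let e = this.hand ?? this.tail
    while (e && e.visited) {
      e.visited = false
      e = e.prev ?? this.tail // wrap around after passing the head
    }
    if (!e) return
    this.hand = e.prev // next eviction resumes here (or at the tail if null)
    if (e.prev) e.prev.next = e.next
    else this.head = e.next
    if (e.next) e.next.prev = e.prev
    else this.tail = e.prev
    this.map.delete(e.key)
  }
}
```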

## Usage

For Node.js, install via npm: `npm install @zf/sieve`

```ts
import { SieveCache, LRUCache, LFUCache } from '@zf/sieve'

const cache = new SieveCache<string>(3 /* capacity */)
cache.set('key', 'value')
cache.get('key')
```
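
The `LRUCache` and `LFUCache` exports are presumably drop-in alternatives; assuming they take the same capacity argument and expose the same `get`/`set` methods as `SieveCache`, usage would look like:

```ts
// Assumption: LRUCache and LFUCache share the SieveCache constructor and API.
const lru = new LRUCache<string>(3)
const lfu = new LFUCache<string>(3)

lru.set('key', 'value')
lfu.set('key', 'value')
lru.get('key')     // 'value'
lfu.get('missing') // presumably undefined on a miss
```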

For Deno:

```ts
import { SieveCache, LRUCache, LFUCache } from "https://deno.land/x/sieve/mod.ts"
```

## Benchmark

Benchmark: reading 1 million normally distributed items through a cache with a capacity of 100, compared with the LRU package, shows that SIEVE is more performant while the cache hit/miss ratio is about the same:

*(chart)*

It turned out that the LRU package's implementation is not very efficient, so I wrote my own LRU, and it is actually better than SIEVE:

*(chart)*

Anyway, the cache hit/miss ratio is of much greater importance, and it is determined by the data distribution.

Added LFU, which has the best hit rate for normally distributed data:

*(chart)*
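
A hit-ratio comparison of the kind described above can be sketched as follows. This is illustrative only: it assumes the three classes share the capacity constructor and `get`/`set` API shown in Usage, and that `get` returns `undefined` on a miss.

```ts
import { SieveCache, LRUCache, LFUCache } from '@zf/sieve'

// Roughly normally distributed integer keys via the Box–Muller transform.
function normalKey(mean = 0, stdDev = 250): string {
  const u = 1 - Math.random() // avoid log(0)
  const v = Math.random()
  const z = Math.sqrt(-2 * Math.log(u)) * Math.cos(2 * Math.PI * v)
  return String(Math.round(mean + z * stdDev))
}

function hitRatio(
  cache: { get(k: string): unknown; set(k: string, v: string): void },
  n = 1_000_000,
): number {
  let hits = 0
  for (let i = 0; i < n; i++) {
    const key = normalKey()
    if (cache.get(key) !== undefined) hits++ // assumed: undefined signals a miss
    else cache.set(key, key)
  }
  return hits / n
}

console.log('sieve', hitRatio(new SieveCache<string>(100)))
console.log('lru  ', hitRatio(new LRUCache<string>(100)))
console.log('lfu  ', hitRatio(new LFUCache<string>(100)))
```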

## Dev

```sh
deno test
deno run --allow-all build_npm.ts 1.0.0
```