hcn.utils.cache¶
Introduction¶
The cache decorator is designed to speed up expensive computations by storing results on disk. It is content-aware: if you change the function's source code or its input arguments, the cache automatically invalidates and recomputes.
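To illustrate the idea (this is a minimal sketch, not the hcn implementation: `mini_cache` and `_CACHE_DIR` are hypothetical names, and the function's compiled bytecode is used as a stand-in for its source code), a content-aware disk cache can be built by hashing the function body together with its pickled arguments:

```python
import functools
import hashlib
import os
import pickle
import tempfile

# Fresh directory per run; the real decorator uses CACHE_DIR instead.
_CACHE_DIR = tempfile.mkdtemp(prefix="mini_cache_")

def mini_cache(func):
    """Content-aware disk cache sketch: the key covers the function's
    compiled body (a proxy for its source) and its pickled arguments."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        payload = pickle.dumps(
            (func.__code__.co_code, args, sorted(kwargs.items()))
        )
        key = hashlib.sha256(payload).hexdigest()
        path = os.path.join(_CACHE_DIR, key + ".pkl")
        if os.path.exists(path):  # cache hit: read the stored result
            with open(path, "rb") as fh:
                return pickle.load(fh)
        result = func(*args, **kwargs)  # cache miss: compute and store
        with open(path, "wb") as fh:
            pickle.dump(result, fh)
        return result
    return wrapper
```

Because the key depends on the function body, editing the body produces a new key, so stale results are never returned for changed code.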
Usage¶
Simply wrap your function with @cache.
The first execution will be slow, and subsequent calls with the same arguments will be near-instant.
WARNING: DO NOT ABUSE !!!¶
The cache is stored on disk at the location specified by CACHE_DIR (~/.hcn.utils.cache.cache_dir by default); by default, entries are deleted after 2 days of inactivity.
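An age-based cleanup like the one described above could be sketched as follows (an illustration only: `purge_stale` is a hypothetical helper, and "inactivity" is approximated here by file modification time, which is not necessarily the policy hcn applies):

```python
import os
import time

def purge_stale(cache_dir, max_age_days=2):
    """Remove cache files untouched for more than max_age_days (sketch)."""
    cutoff = time.time() - max_age_days * 86_400  # seconds per day
    for name in os.listdir(cache_dir):
        path = os.path.join(cache_dir, name)
        # st_mtime approximates last activity; atime is unreliable
        # on many filesystems.
        if os.path.isfile(path) and os.stat(path).st_mtime < cutoff:
            os.remove(path)
```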
Since the cache is saved on disk, it persists across kernel or computer restarts. The flip side is that a persisted entry can be stale: if the function depends on anything outside its source code and arguments (global state, data files, library versions), the cache may return an outdated result.
Additionally, because results must be serialized to and read back from disk, caching can be slower than simply recomputing when the input data or function output is very large.
Caching is therefore most effective for time-consuming computations whose inputs and outputs are relatively small.
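As a rough check of this tradeoff (a sketch under the assumption that in-memory pickling approximates the disk round-trip; `roundtrip` is a hypothetical helper, not part of hcn), you can compare the serialization cost of a result against the cost of recomputing it:

```python
import pickle
import time

def roundtrip(value):
    """Time one pickle dump + load: an in-memory proxy for the
    disk round-trip that a cache hit would perform."""
    start = time.perf_counter()
    blob = pickle.dumps(value)
    pickle.loads(blob)
    return time.perf_counter() - start, len(blob)

# Caching a large result only pays off when recomputing it
# costs more than reading it back.
seconds, size = roundtrip(list(range(1_000_000)))
print(f"round-trip: {seconds:.4f}s for {size / 1e6:.1f} MB")
```

If the round-trip time is comparable to the computation time, caching buys little.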
Examples¶
from hcn.preamble import *
from hcn.utils.cache import cache, clean_all_cache
import time
@cache
def heavy_task(n):
    time.sleep(2)  # Simulate an expensive computation
    return np.random.randn(n)  # np is provided by hcn.preamble
start = time.time()
res1 = heavy_task(500)
print(f"First run time: {time.time() - start:.4f}s")
First run time: 2.0065s
start = time.time()
res2 = heavy_task(500)
print(f"Second run time: {time.time() - start:.4f}s")
cache hit: reading result from cache file
Second run time: 0.0008s
clean_all_cache()
done