A detailed guide to using Python's functools.lru_cache for efficient function caching. Covers basic usage, cache management, custom cache control, and additional insights for optimal utilization in various development scenarios. Ideal for Python developers looking to enhance performance with caching techniques.


Python lru_cache Usage Guide

Overview

The lru_cache decorator in Python's functools module implements a Least Recently Used (LRU) caching strategy. It optimizes performance by memoizing the results of expensive function calls and returning the cached result when the same arguments occur again.

Basic Usage

from functools import lru_cache

@lru_cache(maxsize=128)  # caches the 128 most recently used calls
def some_expensive_function(arg):
    result = ...  # expensive computation
    return result

Note: If maxsize is not specified, it defaults to 128. Setting maxsize to None makes the cache unbounded, so entries are never evicted.
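As a concrete illustration (a standard demonstration, not from the original text), memoization turns the naive exponential-time recursive Fibonacci into a linear-time function, because each value is computed only once:

```python
from functools import lru_cache

@lru_cache(maxsize=None)  # unbounded cache: every computed value is kept
def fibonacci(n):
    """Naive recursion, made fast by caching each intermediate result."""
    if n < 2:
        return n
    return fibonacci(n - 1) + fibonacci(n - 2)

print(fibonacci(100))  # returns 354224848179261915075 almost instantly
```

Without the decorator, fibonacci(100) would take astronomically long; with it, each of the 101 subproblems is evaluated exactly once.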

Using the typed Parameter

The typed parameter, when set to True, enables caching of function calls with distinct argument types separately. By default, typed is set to False, meaning that arguments of different types but equal value are treated as the same for caching purposes.

Example with typed=False (Default)

@lru_cache(maxsize=128, typed=False)
def multiply_by_two(value):
    return value * 2

# An integer and a float with the same value are treated as the same call:
multiply_by_two(5)    # Cache miss, function is executed
multiply_by_two(5.0)  # Cache hit, returns cached result from the first call

Example with typed=True

@lru_cache(maxsize=128, typed=True)
def multiply_by_two(value):
    return value * 2

# An integer and a float with the same value are treated as distinct calls:
multiply_by_two(5)    # Cache miss, function is executed
multiply_by_two(5.0)  # Cache miss, function is executed again

Clearing the Cache

Reset the cache with:

some_expensive_function.cache_clear()

Custom Cache Control

To bypass the cache for individual calls at runtime, wrap a cached helper and call its __wrapped__ attribute (the original, undecorated function) when caching is not wanted. Calling cache_clear() here would instead discard every cached entry, which is rarely what a per-call flag should do:

@lru_cache(maxsize=None)
def _internal_function(arg):
    result = ...  # expensive computation
    return result

def some_expensive_function(arg, use_cache=True):
    if not use_cache:
        # __wrapped__ bypasses the cache without clearing it
        return _internal_function.__wrapped__(arg)
    return _internal_function(arg)

Methods

cache_info()

Provides cache statistics (hits, misses, maxsize, currsize):

info = some_expensive_function.cache_info()
print(info) # Example Output: CacheInfo(hits=2, misses=1, maxsize=128, currsize=1)
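A small self-contained demonstration (the square function and call sequence are illustrative, not from the original text) shows how hits, misses, and LRU eviction are reflected in the statistics:

```python
from functools import lru_cache

@lru_cache(maxsize=2)
def square(x):
    return x * x

square(2)   # miss
square(2)   # hit
square(3)   # miss
square(4)   # miss; cache is full, so the least recently used entry (x=2) is evicted

info = square.cache_info()
print(info)  # CacheInfo(hits=1, misses=3, maxsize=2, currsize=2)
```

Note that currsize never exceeds maxsize: once the cache is full, each new entry evicts the least recently used one.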

cache_clear()

Clears the cache:

some_expensive_function.cache_clear()

cache_parameters()

(Added in Python 3.9) Returns the parameters the cache was created with:

params = some_expensive_function.cache_parameters()
print(params)   # Example Output: {'maxsize': 128, 'typed': False}

Additional Notes

  • Thread-Safety: lru_cache is thread-safe; concurrent calls from multiple threads update the cache safely.
  • Argument-Based Caching: Results are keyed on the function's arguments, so all arguments must be hashable; passing a mutable argument such as a list raises a TypeError.
  • Efficiency: Most effective for expensive functions that are called frequently with the same arguments.
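The hashability requirement from the notes above can be seen directly (a quick sketch; the total function is illustrative, not from the original text):

```python
from functools import lru_cache

@lru_cache(maxsize=128)
def total(values):
    return sum(values)

print(total((1, 2, 3)))  # tuples are hashable, so this caches fine: 6

try:
    total([1, 2, 3])     # lists are unhashable and cannot be cache keys
except TypeError as e:
    print(e)             # unhashable type: 'list'
```

When the natural argument is a list, convert it to a tuple before calling the cached function.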