» Python: Build a REST API with Flask » 2. Development » 2.9 Cache: Redis

Cache: Redis

Heavy queries in MySQL or large aggregations in MongoDB may take seconds or even minutes to finish. You definitely don't want to trigger these expensive operations on every user request.

Caching the results in memory is a great way to alleviate this issue. If your API server runs on a single machine or node, simply keeping these results in an in-memory HashMap or Dictionary solves the problem. If multiple machines or nodes run the API server, they cannot share in-process memory, so a shared external cache like Redis is your best choice here.
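For the single-process case, a plain dictionary really is enough. A minimal sketch (not part of the book's code) with a simple expiry, to make the contrast concrete:

```python
import time
from typing import Dict, Optional, Tuple

# A naive per-process cache with expiry. It works only within one
# process; separate API nodes would each hold their own, possibly
# inconsistent, copy -- which is exactly what Redis avoids.
_cache: Dict[str, Tuple[str, float]] = {}


def save(key: str, value: str, ttl: float = 3600.0) -> None:
    _cache[key] = (value, time.monotonic() + ttl)


def load(key: str) -> Optional[str]:
    entry = _cache.get(key)
    if entry is None:
        return None
    value, expires_at = entry
    if time.monotonic() > expires_at:
        del _cache[key]  # expired: drop it and report a miss
        return None
    return value
```

Once a second node comes online, each process holds an independent copy of `_cache`, so the nodes can serve different answers for the same key; a shared store removes that problem.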

Try Redis

  1. Install Redis on your machine and start it.

  2. Add the redis dependency.

pip3 install redis

Update requirements.txt:

pip3 freeze > requirements.txt

  3. Update code.

Add infrastructure/cache/helper.py:

from abc import ABC, abstractmethod
from typing import Optional


class CacheHelper(ABC):
    @abstractmethod
    def save(self, key: str, value: str) -> None:
        pass

    @abstractmethod
    def load(self, key: str) -> Optional[str]:
        pass
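Because callers will depend only on the `CacheHelper` interface, you can swap in a fake for unit tests without a running Redis server. A hypothetical in-memory implementation (not part of the book's code; the interface is repeated here so the sketch is self-contained):

```python
from abc import ABC, abstractmethod
from typing import Dict, Optional


class CacheHelper(ABC):
    @abstractmethod
    def save(self, key: str, value: str) -> None:
        pass

    @abstractmethod
    def load(self, key: str) -> Optional[str]:
        pass


class InMemoryCache(CacheHelper):
    """Dict-backed stand-in for RedisCache, handy in unit tests."""

    def __init__(self):
        self._data: Dict[str, str] = {}

    def save(self, key: str, value: str) -> None:
        self._data[key] = value

    def load(self, key: str) -> Optional[str]:
        return self._data.get(key)  # None on a miss, like Redis GET
```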

Use redis in infrastructure/cache/redis.py:

from typing import Any, Optional

from redis import Redis

from .helper import CacheHelper
from ..config import CacheConfig

DEFAULT_TTL = 3600


class RedisCache(CacheHelper):
    def __init__(self, c: CacheConfig):
        self.client = Redis(
            host=c.host,
            port=c.port,
            password=c.password,
            db=c.db,
        )

    def save(self, key: str, value: str) -> None:
        self.client.set(key, value, ex=DEFAULT_TTL)

    def load(self, key: str) -> Optional[str]:
        value: Any = self.client.get(key)
        if value is None:
            return None
        return value.decode("utf-8")

Add related config items in infrastructure/config/config.py:

@@ -14,6 +14,14 @@ class DBConfig:
     mongo_db_name: str
 
 
+@dataclass
+class CacheConfig:
+    host: str
+    port: int
+    password: str
+    db: int
+
+
 @dataclass
 class ApplicationConfig:
     port: int
@@ -22,6 +30,7 @@ class ApplicationConfig:
 @dataclass
 class Config:
     app: ApplicationConfig
+    cache: CacheConfig
     db: DBConfig
 
 
@@ -30,5 +39,6 @@ def parseConfig(filename: str) -> Config:
         data = yaml.safe_load(f)
         return Config(
             ApplicationConfig(**data['app']),
+            CacheConfig(**data['cache']),
             DBConfig(**data['db'])
         )
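The `CacheConfig(**data['cache'])` line relies on dataclass keyword unpacking: the YAML keys must match the dataclass field names one-to-one. A self-contained sketch of the same mechanism, with a plain dict standing in for the parsed YAML:

```python
from dataclasses import dataclass


@dataclass
class CacheConfig:
    host: str
    port: int
    password: str
    db: int


# What yaml.safe_load would produce for the `cache:` section.
data = {"cache": {"host": "localhost", "port": 6379,
                  "password": "test_pass", "db": 0}}

# Each key becomes a keyword argument; a missing or extra key
# raises a TypeError, which surfaces config mistakes early.
cfg = CacheConfig(**data["cache"])
```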

Add the new values in config.yml:

@@ -9,3 +9,8 @@ db:
   database: "lr_book"
   mongo_uri: "mongodb://localhost:27017"
   mongo_db_name: "lr_book"
+cache:
+  host: "localhost"
+  port: 6379
+  password: "test_pass"
+  db: 0
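Committing a real password to config.yml is risky. One common option, not part of the book's code, is to let an environment variable override the file value before constructing the client:

```python
import os


# Hypothetical override: prefer REDIS_PASSWORD from the environment,
# falling back to the value parsed from config.yml.
def resolve_password(file_value: str) -> str:
    return os.environ.get("REDIS_PASSWORD", file_value)
```

In deployment you would set `REDIS_PASSWORD` in the process environment and keep only a placeholder in the committed file.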

Wire in the Redis connection in application/wire_helper.py:

@@ -1,21 +1,27 @@
 from books.domain.gateway import BookManager, ReviewManager
+from books.infrastructure.cache import RedisCache, CacheHelper
 from ..infrastructure.config import Config
 from ..infrastructure.database import MySQLPersistence, MongoPersistence
 
 
 class WireHelper:
-    def __init__(self, sqlPersistence: MySQLPersistence, noSQLPersistence: MongoPersistence):
+    def __init__(self, sqlPersistence: MySQLPersistence, noSQLPersistence: MongoPersistence, kvStore: RedisCache):
         self.sqlPersistence = sqlPersistence
         self.noSQLPersistence = noSQLPersistence
+        self.kvStore = kvStore
 
     @classmethod
     def new(cls, c: Config):
         db = MySQLPersistence(c.db)
         mdb = MongoPersistence(c.db.mongo_uri, c.db.mongo_db_name)
-        return cls(db, mdb)
+        kv = RedisCache(c.cache)
+        return cls(db, mdb, kv)
 
     def book_manager(self) -> BookManager:
         return self.sqlPersistence
 
     def review_manager(self) -> ReviewManager:
         return self.noSQLPersistence
+
+    def cache_helper(self) -> CacheHelper:
+        return self.kvStore

Suppose listing all books is a resource-intensive query in your database. In such cases, you may choose to store its query result in Redis for faster retrieval in the future.

Changes in application/executor/book_operator.py:

@@ -1,12 +1,19 @@
-from typing import List, Optional
+from dataclasses import asdict
+import json
+from typing import Any, Dict, List, Optional
+
+from books.infrastructure.cache.helper import CacheHelper
 from ...domain.model import Book
 from ...domain.gateway import BookManager
 
+BOOKS_KEY = "lr-books"
+
 
 class BookOperator():
 
-    def __init__(self, book_manager: BookManager):
+    def __init__(self, book_manager: BookManager, cache_helper: CacheHelper):
         self.book_manager = book_manager
+        self.cache_helper = cache_helper
 
     def create_book(self, b: Book) -> Book:
         id = self.book_manager.create_book(b)
@@ -17,7 +24,13 @@ class BookOperator():
         return self.book_manager.get_book(id)
 
     def get_books(self) -> List[Book]:
-        return self.book_manager.get_books()
+        v = self.cache_helper.load(BOOKS_KEY)
+        if v:
+            return json.loads(v)
+        books = self.book_manager.get_books()
+        self.cache_helper.save(
+            BOOKS_KEY, json.dumps([_convert(b) for b in books]))
+        return books
 
     def update_book(self, id: int, b: Book) -> Book:
         self.book_manager.update_book(id, b)
@@ -25,3 +38,10 @@ class BookOperator():
 
     def delete_book(self, id: int) -> None:
         return self.book_manager.delete_book(id)
+
+
+def _convert(b: Book) -> Dict[str, Any]:
+    new_b = asdict(b)
+    new_b['created_at'] = b.created_at.isoformat()
+    new_b['updated_at'] = b.updated_at.isoformat()
+    return new_b
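The `get_books` change follows the cache-aside pattern: check the cache, fall through to the database on a miss, and store the serialized result on the way out. Note that on a hit, `json.loads` returns plain dicts rather than `Book` objects; the JSON response layer serializes both the same way. A self-contained sketch of the pattern, with hypothetical names and a dict standing in for Redis:

```python
import json
from dataclasses import asdict, dataclass, field
from datetime import datetime
from typing import Any, Dict, List

_store: Dict[str, str] = {}  # stand-in for the Redis cache


@dataclass
class Book:
    id: int
    title: str
    created_at: datetime = field(default_factory=datetime.now)


def _convert(b: Book) -> Dict[str, Any]:
    # datetime is not JSON-serializable, so stringify it first,
    # exactly as the chapter's _convert helper does.
    d = asdict(b)
    d["created_at"] = b.created_at.isoformat()
    return d


def get_books(fetch) -> List[Any]:
    cached = _store.get("lr-books")
    if cached:  # hit: return the deserialized dicts
        return json.loads(cached)
    books = fetch()  # miss: run the expensive database query
    _store["lr-books"] = json.dumps([_convert(b) for b in books])
    return books
```

One thing the diff leaves to the TTL: `create_book`, `update_book`, and `delete_book` do not invalidate `lr-books`, so the cached list can stay stale for up to `DEFAULT_TTL` seconds after a write.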

Make a small adjustment in adapter/router.py:

@@ -108,7 +108,11 @@ def health():
 
 def make_router(app: Flask, wire_helper: WireHelper):
     rest_handler = RestHandler(
-        app.logger, BookOperator(wire_helper.book_manager()), ReviewOperator(wire_helper.review_manager()))
+        app.logger,
+        BookOperator(
+            wire_helper.book_manager(),
+            wire_helper.cache_helper()),
+        ReviewOperator(wire_helper.review_manager()))
     app.add_url_rule('/', view_func=health)
     app.add_url_rule('/books', view_func=rest_handler.get_books)
     app.add_url_rule('/books/<int:id>', view_func=rest_handler.get_book)

Those are all the changes you need to incorporate Redis. Let's now try the endpoint powered by the new cache system.

Try with curl

List all books:

curl -X GET -w "Total time: %{time_total}s\n" http://localhost:5000/books

The result is the same as before, but performance improves significantly, as curl's timing output shows.

Total time: 0.012821s
Total time: 0.008976s
Total time: 0.008859s
Total time: 0.008658s

Use redis-cli to check the values in Redis:

redis-cli

Play with keys in the redis client shell:

127.0.0.1:6379> keys *
1) "lr-books"
127.0.0.1:6379> get lr-books
"[{\"id\":1,\"title\":\"Great Book II\",\"author\":\"Carl Smith\",\"published_at\":\"2022-01-01T08:00:00+08:00\",\"description\":\"Another sample book description\",\"isbn\":\"8334567890\",\"total_pages\":3880,\"created_at\":\"2024-02-25T16:29:31.353+08:00\",\"updated_at\":\"2024-02-25T16:29:31.353+08:00\"}]"
127.0.0.1:6379> del lr-books
(integer) 1

Awesome! Redis is at your service now! 💐
