
Docker Compose

Docker Compose is a tool for defining and running multi-container Docker applications. It allows you to use a YAML file to configure your application's services, networks, and volumes, and then spin up all the containers required to run your application with a single command.

Note:
The easiest and recommended way to get Docker Compose is to install Docker Desktop. Docker Desktop includes Docker Compose along with Docker Engine and Docker CLI, which are Compose prerequisites.

Install Compose if needed: https://docs.docker.com/compose/install/

Add compose/docker-compose.yml:

services:
  lr-event-books-web-py:
    build:
      context: ../
      dockerfile: service/web/Dockerfile
    ports:
      - 8000:8000
    volumes:
      - ./config-web.yml:/app/service/web/config.yml
    depends_on:
      mysql:
        condition: service_healthy
      kafka:
        condition: service_healthy
  lr-event-books-trend-py:
    build:
      context: ../
      dockerfile: service/trend/Dockerfile
    ports:
      - 8001:8001
    volumes:
      - ./config-trend.yml:/app/service/trend/config.yml
    depends_on:
      redis:
        condition: service_started
      kafka:
        condition: service_healthy
  lr-event-books-rec-py:
    build:
      context: ../
      dockerfile: service/recommendation/Dockerfile
    ports:
      - 8002:8002
    volumes:
      - ./config-rec.yml:/app/service/recommendation/config.yml
    depends_on:
      mongo:
        condition: service_started
      kafka:
        condition: service_healthy
  redis:
    image: docker.io/bitnami/redis:7.0
    environment:
      - REDIS_PASSWORD=${REDIS_PASSWORD}
    ports:
      - 6379:6379
  mysql:
    image: docker.io/bitnami/mysql:5.7.43
    environment:
      - MYSQL_DATABASE=lr_book
      - MYSQL_USER=test_user
      - MYSQL_PASSWORD=${MYSQL_PASSWORD}
      - MYSQL_ROOT_PASSWORD=${MYSQL_ROOT_PASSWORD}
    ports:
      - 3306:3306
    healthcheck:
      test:
        [
          "CMD",
          "mysqladmin",
          "ping",
          "-h",
          "localhost",
          "-u",
          "root",
          "-p$MYSQL_ROOT_PASSWORD",
        ]
      timeout: 20s
      retries: 10
    volumes:
      - ~/lr-mysql-data:/bitnami/mysql/data
  mongo:
    image: bitnami/mongodb:latest
    environment:
      - ALLOW_EMPTY_PASSWORD=yes
    ports:
      - 27017:27017
    volumes:
      - ~/lr-mongodb-data:/bitnami/mongodb
  kafka:
    image: bitnami/kafka:latest
    environment:
      - KAFKA_CFG_AUTO_CREATE_TOPICS_ENABLE=true
      - KAFKA_CFG_NODE_ID=0
      - KAFKA_CFG_PROCESS_ROLES=controller,broker
      - KAFKA_CFG_LISTENERS=PLAINTEXT://:9092,CONTROLLER://:9093
      - KAFKA_CFG_ADVERTISED_LISTENERS=PLAINTEXT://kafka:9092
      - KAFKA_CFG_LISTENER_SECURITY_PROTOCOL_MAP=CONTROLLER:PLAINTEXT,EXTERNAL:PLAINTEXT,PLAINTEXT:PLAINTEXT
      - KAFKA_CFG_CONTROLLER_QUORUM_VOTERS=0@kafka:9093
      - KAFKA_CFG_CONTROLLER_LISTENER_NAMES=CONTROLLER
    ports:
      - 9092:9092
    healthcheck:
      test:
        [
          "CMD-SHELL",
          "kafka-topics.sh --list --bootstrap-server localhost:9092",
        ]
      interval: 10s
      timeout: 10s
      retries: 3

It's not easy to find a healthcheck for MySQL and Kafka in Docker Compose that is both reliable and quiet. The checks above let the depends_on conditions hold the application services back until MySQL and Kafka are actually ready to accept connections.

Fix Topic Issues

You may see errors like this when the topic does not exist yet:

cimpl.KafkaException: KafkaError{code=UNKNOWN_TOPIC_OR_PART,val=3,str="Subscribed topic not available: lr-book-searches: Broker: Unknown topic or partition"}

To get rid of them, you may need to create the topic before subscribing to it.

Changes in service/infrastructure/mq/kafka_consumer.py:

@@ -1,7 +1,9 @@
 import sys
+import time
 from typing import List
 
-from confluent_kafka import Consumer, KafkaException
+from confluent_kafka import Consumer, KafkaException, KafkaError
+from confluent_kafka.admin import AdminClient, NewTopic
 
 from ...domain.gateway import TrendEventConsumer, ConsumeCallback
 
@@ -12,10 +14,20 @@ class KafkaConsumer(TrendEventConsumer):
             {'bootstrap.servers': ','.join(brokers),
              'group.id': group_id,
              'auto.offset.reset': 'smallest'})
+        self.brokers = brokers
         self.topic = topic
         self.running = False
 
+    def _try_to_create_topic(self, brokers: List[str], topic: str):
+        admin_client = AdminClient({'bootstrap.servers': ','.join(brokers)})
+        topic_metadata = admin_client.list_topics(timeout=10)
+        if topic_metadata.topics.get(topic) is None:
+            new_topic = NewTopic(topic, 1, 1)
+            admin_client.create_topics([new_topic])
+            time.sleep(1)  # Hack: wait for it
+
     def consume_events(self, callback: ConsumeCallback):
+        self._try_to_create_topic(self.brokers, self.topic)
         try:
             self.consumer.subscribe([self.topic])
             self.running = True
@@ -24,7 +36,7 @@ class KafkaConsumer(TrendEventConsumer):
                 if msg is None:
                     continue
                 if msg.error():
-                    if msg.error().code() == KafkaException._PARTITION_EOF:
+                    if msg.error().code() == KafkaError._PARTITION_EOF:
                         # End of partition
                         sys.stderr.write('%% {} [{}] reached end at offset {} - {}\n'.format(
                             msg.topic(), msg.partition(), msg.offset()))
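
The time.sleep(1) in _try_to_create_topic is admittedly a hack. A slightly more robust variant (just a sketch, not part of the code above; ensure_topic is a hypothetical helper) waits on the futures returned by create_topics(), which complete once the broker has processed the request:

from confluent_kafka.admin import AdminClient, NewTopic


def ensure_topic(brokers, topic, num_partitions=1, replication_factor=1):
    # Create the topic if it does not exist, then block until the broker
    # acknowledges it (or reports a failure) instead of sleeping blindly.
    admin_client = AdminClient({'bootstrap.servers': ','.join(brokers)})
    metadata = admin_client.list_topics(timeout=10)
    if topic in metadata.topics:
        return
    futures = admin_client.create_topics(
        [NewTopic(topic, num_partitions, replication_factor)])
    for future in futures.values():
        try:
            future.result(timeout=10)  # returns None on success, raises on failure
        except Exception as exc:
            print(f'topic creation failed: {exc}')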

Polish Code

Changes in service/web/infrastructure/config/config.py, dropping the now-unneeded port field from ApplicationConfig:

@@ -20,7 +20,6 @@ class MQConfig:
 
 @dataclass
 class ApplicationConfig:
-    port: int
     page_size: int
     templates_dir: str

Add Config Files for Docker Compose

Add compose/config-web.yml:

app:
  page_size: 5
  templates_dir: "service/web/adapter/templates/"
db:
  host: mysql
  port: 3306
  user: test_user
  password: test_pass
  database: lr_event_book
mq:
  brokers:
    - kafka:9092
  topic: lr-book-searches
remote:
  trend_url: "http://lr-event-books-trend-py:8001/trends"
  rec_url: "http://lr-event-books-rec-py:8002/recommendations?uid="

You need a dedicated version of each config file for Docker Compose because the hosts, paths, and URLs differ between hosting environments.

Add compose/config-trend.yml:

cache:
  host: redis
  port: 6379
  password: test_pass
  db: 0
mq:
  brokers:
    - kafka:9092
  topic: lr-book-searches
  group_id: trend-svr

Add compose/config-rec.yml:

app:
  page_size: 10
db:
  mongo_uri: "mongodb://mongo:27017"
  mongo_db_name: lr_event_rec
mq:
  brokers:
    - kafka:9092
  topic: lr-book-searches
  group_id: rec-svr

Add compose/.env:

REDIS_PASSWORD=test_pass
MYSQL_PASSWORD=test_pass
MYSQL_ROOT_PASSWORD=test_root_pass

Docker Compose reads this .env file automatically (it lives next to docker-compose.yml) and substitutes the ${...} variables in the compose file. Caution: .env files contain credentials, so they should be ignored via .gitignore.

Changes in .gitignore:

@@ -160,3 +160,4 @@ cython_debug/
 #.idea/
 
 lrFastAPIEnv/
+.env

Run it:

cd compose
docker compose up

You should see something like this:

[+] Running 7/7
 ✔ Container compose-redis-1                    Created      0.0s
 ✔ Container compose-mongo-1                    Recreated    0.2s
 ✔ Container compose-kafka-1                    Recreated    0.1s
 ✔ Container compose-mysql-1                    Recreated    0.2s
 ✔ Container compose-lr-event-books-trend-py-1  Created      0.1s
 ✔ Container compose-lr-event-books-rec-py-1    Created      0.1s
 ✔ Container compose-lr-event-books-web-py-1    Recreated    0.1s
Attaching to kafka-1, lr-event-books-rec-1, lr-event-books-trend-1, lr-event-books-web-1, mongo-1, mysql-1, redis-1
kafka-1          | kafka 07:58:07.37 INFO  ==> 
kafka-1          | kafka 07:58:07.37 INFO  ==> Welcome to the Bitnami kafka container
...

redis-1          | redis 13:24:52.38 
redis-1          | redis 13:24:52.39 Welcome to the Bitnami redis container
...

mongo-1          | mongodb 13:24:52.60 INFO  ==> 
mongo-1          | mongodb 13:24:52.60 INFO  ==> Welcome to the Bitnami mongodb container
mongo-1          | mongodb 13:24:52.61 INFO  ==> ** Starting MongoDB setup **
...
mysql-1          | mysql 13:24:52.61 
mysql-1          | mysql 13:24:52.62 Welcome to the Bitnami mysql container
mysql-1          | mysql 13:24:52.63 INFO  ==> ** Starting MySQL setup **
...

You no longer need to install or set up these databases and message queues by hand; Docker Compose takes care of them. However, you still need to make sure the MySQL database lr_event_book exists in your mysql container:

CREATE DATABASE lr_event_book CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci;
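
One way to run that statement is from the host over the mapped port 3306, for example with pymysql (a sketch; it assumes pymysql is installed locally and uses the root password from compose/.env):

import pymysql

# Connect to the mysql container through the port published by Docker Compose.
conn = pymysql.connect(host='127.0.0.1', port=3306,
                       user='root', password='test_root_pass')
with conn.cursor() as cur:
    cur.execute('CREATE DATABASE IF NOT EXISTS lr_event_book '
                'CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci')
conn.close()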

Add some test books if there aren't any yet:

curl -X POST -H "Content-Type: application/json" -d '{"title": "To Kill a Mockingbird", "author": "Harper Lee", "published_at": "1960-07-11", "description": "A novel set in the American South during the 1930s, dealing with themes of racial injustice and moral growth."}' http://localhost:8000/api/books
curl -X POST -H "Content-Type: application/json" -d '{"title": "1984", "author": "George Orwell", "published_at": "1949-06-08", "description": "A dystopian novel depicting a totalitarian regime, surveillance, and propaganda."}' http://localhost:8000/api/books
curl -X POST -H "Content-Type: application/json" -d '{"title": "Pride and Prejudice", "author": "Jane Austen", "published_at": "1813-01-28", "description": "A classic novel exploring the themes of love, reputation, and social class in Georgian England."}' http://localhost:8000/api/books
curl -X POST -H "Content-Type: application/json" -d '{"title": "The Catcher in the Rye", "author": "J.D. Salinger", "published_at": "1951-07-16", "description": "A novel narrated by a disaffected teenager, exploring themes of alienation and identity."}' http://localhost:8000/api/books
curl -X POST -H "Content-Type: application/json" -d '{"title": "The Lord of the Rings", "author": "J.R.R. Tolkien", "published_at": "1954-07-29", "description": "A high fantasy epic following the quest to destroy the One Ring and defeat the Dark Lord Sauron."}' http://localhost:8000/api/books
curl -X POST -H "Content-Type: application/json" -d '{"title": "Moby-Dick", "author": "Herman Melville", "published_at": "1851-10-18", "description": "A novel exploring themes of obsession, revenge, and the nature of good and evil."}' http://localhost:8000/api/books
curl -X POST -H "Content-Type: application/json" -d '{"title": "The Hobbit", "author": "J.R.R. Tolkien", "published_at": "1937-09-21", "description": "A fantasy novel set in Middle-earth, following the adventure of Bilbo Baggins and the quest for treasure."}' http://localhost:8000/api/books
curl -X POST -H "Content-Type: application/json" -d '{"title": "The Adventures of Huckleberry Finn", "author": "Mark Twain", "published_at": "1884-12-10", "description": "A novel depicting the journey of a young boy and an escaped slave along the Mississippi River."}' http://localhost:8000/api/books
curl -X POST -H "Content-Type: application/json" -d '{"title": "War and Peace", "author": "Leo Tolstoy", "published_at": "1869-01-01", "description": "A novel depicting the Napoleonic era in Russia, exploring themes of love, war, and historical determinism."}' http://localhost:8000/api/books
curl -X POST -H "Content-Type: application/json" -d '{"title": "Alice’s Adventures in Wonderland", "author": "Lewis Carroll", "published_at": "1865-11-26", "description": "A children’s novel featuring a young girl named Alice who falls into a fantastical world populated by peculiar creatures."}' http://localhost:8000/api/books
curl -X POST -H "Content-Type: application/json" -d '{"title": "The Odyssey", "author": "Homer", "published_at": "8th Century BC", "description": "An ancient Greek epic poem attributed to Homer, detailing the journey of Odysseus after the Trojan War."}' http://localhost:8000/api/books

Now, if you visit your page at http://localhost:8000/, you should be able to see all the features from these 3 microservices.

Search with terms like "love", "peace", and "Odyssey", then refresh the page and see what happens.
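
If you prefer checking the services from the command line, a quick smoke test against the mapped ports (a sketch assuming the requests package; the /trends and /recommendations?uid= paths come from compose/config-web.yml, and the uid value is only a placeholder) could look like this:

import requests

# Each service is reachable on the host port published in docker-compose.yml.
print(requests.get('http://localhost:8000/').status_code)    # web front end
print(requests.get('http://localhost:8001/trends').text)     # trend service
print(requests.get('http://localhost:8002/recommendations',
                   params={'uid': '1'}).text)                # recommendation service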

Your event-driven microservices work like a charm! 📢
