Hands-On Docker for Microservices with Python

Configuring your service

We can configure the service through environment variables to change its behavior. For containers, this is a fantastic alternative to configuration files, as it allows immutable containers that get their configuration injected. This is in line with the Twelve-Factor App (https://12factor.net/config) principles, allows a clean separation between code and configuration, and makes it easy to set up the different deployments the code might be used for.

One of the advantages that we'll look at later with the use of Kubernetes is creating new environments on demand, which can be tweaked for testing purposes or tailored for development or demos. Being able to quickly change all the configuration by injecting the proper environment makes this operation very easy and straightforward. It also allows you to enable or disable features, if properly configured, which makes it possible to turn features on at launch day with no code rollout.
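As a quick sketch of the feature-flag idea, a service can read a boolean flag from an environment variable (the ENABLE_NEW_FEATURE variable and helper function here are hypothetical, not part of the book's code):

```python
import os

def new_feature_enabled():
    """Read a hypothetical feature flag from the environment.

    Treats '1', 'true' and 'yes' (case-insensitive) as enabled;
    anything else, or an unset variable, as disabled.
    """
    value = os.environ.get('ENABLE_NEW_FEATURE', '')
    return value.lower() in ('1', 'true', 'yes')

os.environ['ENABLE_NEW_FEATURE'] = 'true'
print(new_feature_enabled())  # True
```

Flipping the flag for a deployment is then just a change to the injected environment, not a new image.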

In our case, the database to connect to is configurable, so we can choose between the SQLite backend and PostgreSQL.

Configuring the system is not limited to openly visible variables, though. Environment variables will be used later in the book to store secrets. Note that a secret still needs to be available inside the container.
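For example, a service can fail fast at startup if a required secret has not been injected, rather than discovering the problem on the first request that needs it (a minimal sketch; the SECRET_KEY name here stands in for whatever secret your service requires):

```python
import os

def get_required_secret(name):
    """Return the named environment variable, or raise immediately."""
    try:
        return os.environ[name]
    except KeyError:
        raise RuntimeError(f'Required secret {name} is not set') from None

os.environ['SECRET_KEY'] = 'dummy-value-for-demo'
print(get_required_secret('SECRET_KEY'))  # dummy-value-for-demo
```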

We will configure the tests to access our newly created database container. To do that, we first need the ability to choose between either SQLite or PostgreSQL through configuration. Check out the ./ThoughtsBackend/thoughts_backend/db.py file:

import os
from pathlib import Path
from flask_sqlalchemy import SQLAlchemy

DATABASE_ENGINE = os.environ.get('DATABASE_ENGINE', 'SQLITE')

if DATABASE_ENGINE == 'SQLITE':
    dir_path = Path(os.path.dirname(os.path.realpath(__file__)))
    path = dir_path / '..'

    # Database initialisation
    FILE_PATH = f'{path}/db.sqlite3'
    DB_URI = 'sqlite+pysqlite:///{file_path}'
    db_config = {
        'SQLALCHEMY_DATABASE_URI': DB_URI.format(file_path=FILE_PATH),
        'SQLALCHEMY_TRACK_MODIFICATIONS': False,
    }

elif DATABASE_ENGINE == 'POSTGRESQL':
    db_params = {
        'host': os.environ['POSTGRES_HOST'],
        'database': os.environ['POSTGRES_DB'],
        'user': os.environ['POSTGRES_USER'],
        'pwd': os.environ['POSTGRES_PASSWORD'],
        'port': os.environ['POSTGRES_PORT'],
    }
    DB_URI = 'postgresql://{user}:{pwd}@{host}:{port}/{database}'
    db_config = {
        'SQLALCHEMY_DATABASE_URI': DB_URI.format(**db_params),
        'SQLALCHEMY_TRACK_MODIFICATIONS': False,
    }

else:
    raise Exception('Incorrect DATABASE_ENGINE')

db = SQLAlchemy()

When the DATABASE_ENGINE environment variable is set to POSTGRESQL, the module configures the PostgreSQL connection. The other environment variables also need to be correct; that is, if the database engine is set to PostgreSQL, the POSTGRES_HOST variable needs to be set up.
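The URI-building logic can be exercised on its own, outside the container, to verify what a given set of environment variables produces (a quick sketch that reuses the same template as db.py, with the values from the environment file below):

```python
import os

# Simulate the injected environment
os.environ.update({
    'POSTGRES_HOST': 'db',
    'POSTGRES_DB': 'thoughts',
    'POSTGRES_USER': 'postgres',
    'POSTGRES_PASSWORD': 'somepassword',
    'POSTGRES_PORT': '5432',
})

DB_URI = 'postgresql://{user}:{pwd}@{host}:{port}/{database}'
uri = DB_URI.format(
    host=os.environ['POSTGRES_HOST'],
    database=os.environ['POSTGRES_DB'],
    user=os.environ['POSTGRES_USER'],
    pwd=os.environ['POSTGRES_PASSWORD'],
    port=os.environ['POSTGRES_PORT'],
)
print(uri)
# postgresql://postgres:somepassword@db:5432/thoughts
```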

Environment variables can be stored individually in the docker-compose.yaml file, but it's more convenient to store multiple ones in a file. Let's take a look at environment.env:

DATABASE_ENGINE=POSTGRESQL
POSTGRES_DB=thoughts
POSTGRES_USER=postgres
POSTGRES_PASSWORD=somepassword
POSTGRES_PORT=5432
POSTGRES_HOST=db

Note that the definition of the user, and so on, is in line with the arguments used to create the database Dockerfile for testing. POSTGRES_HOST is defined as db, which is the name of the service.

Inside the network that docker-compose creates, you can refer to services by their names; the internal DNS resolves each name to the proper container. This allows easy communication between services, as they can configure their access very easily by name. Note that this connection is only valid inside the network, for communication between containers.

Our testing service using the PostgreSQL container then gets defined as follows:

  test-postgresql:
    env_file: environment.env
    environment:
      - PYTHONDONTWRITEBYTECODE=1
    build:
      dockerfile: docker/app/Dockerfile
      context: .
    entrypoint: pytest
    depends_on:
      - db
    volumes:
      - ./ThoughtsBackend:/opt/code

This is very similar to the test-sqlite service, but it adds the environment configuration from environment.env and a dependency on db. This means that docker-compose will start the db service if it's not already running.

You can now run the tests against the PostgreSQL database:

$ docker-compose run test-postgresql
Starting ch3_db_1 ... done
============== test session starts ====================
platform linux -- Python 3.6.8, pytest-4.6.0, py-1.8.0, pluggy-0.12.0 -- /opt/venv/bin/python3
cachedir: .pytest_cache
rootdir: /opt/code, inifile: pytest.ini
plugins: flask-0.14.0
collected 17 items

tests/test_thoughts.py::test_create_me_thought PASSED [ 5%]
...
tests/test_token_validation.py::test_valid_token_header PASSED [100%]

===== 17 passed, 177 warnings in 2.14 seconds ===
$

This environment file will be useful for any service that needs to connect to the database, such as deploying the service locally.