Using Docker Compose To Deploy a Container

Lightweight, portable, and self-contained, containers give developers granular control over the development, testing, and deployment of software. Docker is one of the most popular platforms for containerizing your applications.

The basic Docker Engine allows you to wrap your application, with all its dependencies, into a construct called a container, which makes it easier to port to other environments.

But what if you want to run multiple containers together on the same application host? Fortunately, the Docker platform has a tool for that: Docker Compose.

In this article we’ll walk you through the process of deploying a container using Docker Compose.

What is Docker Compose?

Docker Compose is a tool for managing multi-container applications. It should be used in situations where you might want to configure and start multiple Docker containers on the same host, saving you the time and effort of having to start each container separately. Docker Compose simplifies this process by allowing you to configure all your application’s services within a single YAML file.
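
For example, a Compose file for a hypothetical two-service app (the service names here are illustrative, not from this tutorial's project) can be as short as this:

```yaml
# docker-compose.yml: one file describes every service in the app.
services:
  web:                  # your application container, built from a local Dockerfile
    build: .
  redis:                # a supporting service pulled from Docker Hub
    image: "redis:alpine"
```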

The benefits of Docker Compose

Instead of using one large monolithic Docker container, Docker Compose allows you to split your application into smaller containers that encapsulate different functions or technologies into more manageable chunks. Benefits of multi-container applications include:

  • Granular updates: Got one dependency that requires frequent updates while the rest of your stack is relatively stable? Containerize it and manage its version control without having to rebuild the entire container.
  • Simplified configuration management: Configure all your containers with a single YAML file. This makes it easy to pass code through an efficient continuous integration and deployment (CI/CD) pipeline.

Docker Compose vs. Docker Swarm vs. Kubernetes

What’s the difference between Docker Compose, Docker Swarm, and Kubernetes? It’s a common question newcomers to the container ecosystem are bound to ask, so a little disambiguation is necessary before we continue with this tutorial.

  • Docker Compose is a tool for managing multiple containers on a single application host.
  • Docker Swarm is a tool for managing multiple containers across multiple hosts.
  • Kubernetes is a full-fledged container orchestration tool; it has features that make it easier to manage larger clusters of containers as your applications scale.

A good rule of thumb for deciding which tool is right for your needs is to consider the scale of your containerized application. For smaller apps where you’ll be using a single host to manage all your containers, Docker Compose is sufficient. For larger applications that require multiple hosts, the decision is between Docker Swarm (ease-of-use) and Kubernetes (scalability).

Need an introduction to Docker? Refer to our article on Docker basics.

Prerequisites

Before we can get started deploying multi-container apps with Docker Compose, we need to make sure that you’ve fulfilled a few prerequisites:

  • Install Docker Engine
      • Download Docker Desktop and follow the specific instructions for Mac, Windows, or Linux.
      • Note: for the Windows installation, select the WSL 2 back-end over the Hyper-V back-end unless you know what you are doing.
  • Install Docker Compose
      • Docker Compose comes pre-installed with Docker Desktop.
      • To verify, open a terminal and run docker-compose --version.

Be sure to enable WSL integration in Docker Desktop's settings.

You should now be ready to get started with our tutorial.

Furthermore, this tutorial will assume you already have basic knowledge of back-end web development and are comfortable working with a terminal. While you don’t need to know Python, Flask, or Redis for this example (you just have to copy and paste), you should have a basic understanding of how to hook a web application up to a web server and view it in your browser.  

How to deploy a multi-container application with Docker Compose

Building a Docker image and deploying it might seem simple enough, but retyping all those terminal commands to manage multiple containers in a microservices app quickly gets tedious. Docker Compose gives you a quick way to automate many of the manual commands you would otherwise type into a terminal to deploy your application.

The general process for deploying a multi-container application looks something like this:  

  1. Build your application as a collection of independent microservices.
  2. Create a Dockerfile that will build an image for each microservice.
  3. Create a docker-compose.yml file.
  4. Type docker compose up into the terminal to deploy your multi-container application.

Of course, in practice, getting those individual Docker images to work together will vary in difficulty depending on the unique quirks of your multi-container app.

In the next section we’ll start with a simple example of how to deploy a multi-container Flask application with Docker Compose.

Multi-container Flask app example with Docker Compose and Redis

In order to demonstrate how to deploy multiple containers using Docker Compose, we will be using a simple Python web application built with the Flask framework using Redis as a database.

The Flask app will populate the Redis database with some starter data on initialization. It will then retrieve this list of products from Redis and use it to populate a products catalog.

Upon completion, you will have successfully deployed a multi-container Flask application with Docker Compose using Redis as a database. Functioning buttons and the services needed to make a working product catalog are beyond the scope of this tutorial project.  

1. Create a directory for your project

You can do so from the terminal like so:

--CODE language-markup line-numbers--
$ mkdir docker-flask-redis
$ cd docker-flask-redis

2. Set up your Flask app

Create a file called app.py in your project directory and paste this in:

--CODE language-markup line-numbers--
# A simple Flask app that reads product data from Redis
from flask import Flask, render_template
import redis
import json

app = Flask(__name__)

# Set up a Redis client to host our data. The hostname 'redis'
# matches the service name in docker-compose.yml.
redis_client = redis.Redis(host='redis', port=6379)

# json.dumps converts our test data from a Python dictionary of
# products into a JSON string to store in our Redis database.
redis_client.set('product', json.dumps([
    {'id': 1, 'name': 'Power Drill', 'barcode': '406780655784', 'price': 300},
    {'id': 2, 'name': 'Circular Saw', 'barcode': '522687161043', 'price': 200},
    {'id': 3, 'name': 'Desk', 'barcode': '757543429691', 'price': 150}
]))

# Retrieve the product JSON from Redis for use in our application.
items = json.loads(redis_client.get('product'))

# Set up routing as normal; render_template sends the product list
# to index.html.
@app.route('/')
def products_page():
    return render_template('index.html', items=items)

Create a templates folder with an index.html file and paste the following code into it:

--CODE language-markup line-numbers--
<!DOCTYPE html>
<html lang="en">
<head>
    <link href="https://cdn.jsdelivr.net/npm/bootstrap@5.0.2/dist/css/bootstrap.min.css" rel="stylesheet" integrity="sha384-EVSTQN3/azprG1Anm3QDgpJLIm9Nao0Yz1ztcQTwFspd3yD65VohhpuuCOmLASjC" crossorigin="anonymous">
</head>
<body>
<h1>Product Catalog</h1>
<table class="table table-hover">
    <thead>
        <tr>
            <th scope="col">ID</th>
            <th scope="col">Name</th>
            <th scope="col">Barcode</th>
            <th scope="col">Price</th>
            <th scope="col">Options</th>
        </tr>
    </thead>
    <tbody>
        <!-- Jinja syntax for iterating over items in the table -->
        {% for item in items %}
        <!-- The Python dictionary populates the rows here: -->
            <tr>
                <td>{{ item.id }}</td>
                <td>{{ item.name }}</td>
                <td>{{ item.barcode }}</td>
                <td>${{ item.price }}</td>
                <td>
                    <!-- Buttons included here for aesthetic purposes only: -->
                    <button class="btn btn-outline btn-info">More Info</button>
                    <button class="btn btn-outline btn-success">Purchase this Item</button>
                </td>
            </tr>
        {% endfor %}
    </tbody>
</table>
</body>
</html>

Create another file called requirements.txt in your project directory and paste this in:

--CODE language-markup line-numbers--
flask
redis

While the details of setting up a Python Flask App and connecting it to Redis are beyond the scope of this tutorial, I’ve included comments throughout the code to explain how it works.
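
The JSON round trip at the heart of app.py can also be tried in isolation. Below is a minimal sketch that swaps the Redis client for a plain Python dict (an assumption made so it runs without a Redis server), showing how json.dumps and json.loads move the product list in and out of a string-valued store:

```python
import json

# Stand-in for the Redis key-value store used in app.py.
fake_store = {}

products = [
    {'id': 1, 'name': 'Power Drill', 'barcode': '406780655784', 'price': 300},
    {'id': 2, 'name': 'Circular Saw', 'barcode': '522687161043', 'price': 200},
]

# json.dumps serializes the list to a string, just as redis_client.set stores one.
fake_store['product'] = json.dumps(products)

# json.loads parses the string back into Python objects, as in app.py.
items = json.loads(fake_store['product'])
print(items[0]['name'])  # Power Drill
```

The only difference in the real app is that the string lives in Redis instead of a local dict.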

3. Create a Dockerfile

If you were to run a Flask app manually, you would have to use your terminal to set a working directory, specify your environment variables, and provide other details needed to deploy your app.

The Dockerfile provides a shorthand that lets you list all those steps within a single document. Docker uses this information to build your web app image.

Create a Dockerfile to run your application:

--CODE language-markup line-numbers--
FROM python:3.7-alpine
WORKDIR /code
ENV FLASK_APP=app.py
ENV FLASK_RUN_HOST=0.0.0.0
COPY requirements.txt requirements.txt
RUN pip install -r requirements.txt
EXPOSE 5000
COPY . .
CMD ["flask", "run"]

Here’s what each of those lines in the Dockerfile are doing:

  • FROM python:3.7-alpine tells Docker to build an image starting from the Python 3.7 Alpine image as a base
  • WORKDIR /code sets the working directory within your container to /code
  • ENV FLASK_APP=app.py sets the environment variable that specifies the main file of your Flask application
  • ENV FLASK_RUN_HOST=0.0.0.0 sets the environment variable that binds the app to all network interfaces in the container
  • COPY requirements.txt requirements.txt copies requirements.txt from your project directory into the container
  • RUN pip install -r requirements.txt installs the Python app dependencies
  • EXPOSE 5000 adds metadata to the image indicating that the container listens on port 5000
  • COPY . . copies the current project directory into the working directory of the image
  • CMD ["flask", "run"] sets the default command for the container to flask run
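
The two ENV instructions simply set environment variables inside the container; the flask CLI reads them when the CMD runs. A rough sketch of that mechanism in plain Python (no Docker required; the lookup shown here only approximates what the flask CLI does internally):

```python
import os

# Simulate what the Dockerfile's ENV instructions do inside the container.
os.environ['FLASK_APP'] = 'app.py'
os.environ['FLASK_RUN_HOST'] = '0.0.0.0'

# When CMD ["flask", "run"] executes, the flask CLI consults these
# variables to decide which app to load and which host to bind to.
app_file = os.environ.get('FLASK_APP')
host = os.environ.get('FLASK_RUN_HOST')
print(app_file, host)  # app.py 0.0.0.0
```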

4. Define services in a Compose file

Set up a docker-compose.yml file like so:

--CODE language-markup line-numbers--
version: "3.9"
services:
  web:
    build: .
    ports:
      - "8000:5000"
    volumes:
      - .:/code
    environment:
      FLASK_ENV: development
  redis:
    image: "redis:alpine"

The Compose file tells Docker how our two services will work together. Each service runs in its own container, built from its own image.

The two services composed by our docker-compose.yml file are the:

  • web service, which reads the Dockerfile from our current project directory to build a Docker image. It maps port 8000 on the host machine to port 5000 in the container, where the Flask web server listens.
  • redis service, which pulls the public redis:alpine image from Docker Hub for use by our application.

Once we’ve specified how our two containers are going to be run together, we are ready to deploy our multi-container application.

5. Build and run the app with Compose

Typing docker compose up in the terminal while Docker Desktop is running in the background will deploy your Python application.

You’ll know it worked if the Compose process starts both services and keeps running in the terminal without exiting.

You can also open up another terminal and type docker compose ps to list all the containers running in Docker Engine. You should see one service for the Redis image and another for the Flask web app you just created.

Finally, you can open the Docker Desktop GUI to see which containers are running on your machine.

6. Navigate to your browser and check out your app

And now for the moment you’ve been waiting for, it’s time to see if our multi-container application behaves as expected.

If you navigate to http://localhost:8000/ in your browser, you should see the product catalog app served by your Flask container.

Congratulations!

You have successfully launched your first multi-container application with Docker Compose. If you’re already familiar with web development with Flask, you’re off to a great start for building your own microservices application. From this point, it’s a matter of refactoring and expanding your Python application to include the data models and separate product and user services needed to make the product catalog you just created truly interactive and functional.

Put your Docker web development skills to work

Once you get the hang of automating container deployment with Docker Compose, it won’t be long before you realize the true potential this tool has for streamlining your developer workflow.

Any time you find yourself typing the same commands into a terminal, consider browsing the Docker documentation for ways to automate those commands with a docker-compose.yml file.

And if you are already a freelance web developer familiar with building microservices apps, adding Docker Compose to your DevOps arsenal won’t take long. The ability to containerize microservices into multi-container apps is a highly coveted skill.  

Ready to put your Docker Compose skills to work? Apply to one of these freelance Docker jobs on Upwork today!

Author Spotlight

Yoshitaka Shiotsu
Technical Copywriter & SEO Consultant

Yoshitaka Shiotsu is a project engineer turned technical copywriter and SEO consultant who regularly contributes to the Upwork Resource Center. He specializes in helping tech companies, startups, and entrepreneurs set themselves up as voices of authority within their target industries.
