
Multi-Container AI System with Docker Compose and Best Practices




Course: Docker for AI Apps

Level: Intermediate to Advanced

Type: Individual

Duration: 7 to 10 days





Objective

This assignment tests your ability to design and operate a multi-container Docker system for an AI application. You will configure container-to-container networking using a user-defined bridge network, orchestrate a multi-service stack with Docker Compose, build and containerize a FastAPI AI REST API with session management and health checks, apply Docker best practices including multi-stage builds and non-root users, and publish your final image to Docker Hub.





Problem Statement

You are required to build a Dockerized AI REST API using FastAPI and the OpenAI API, backed by a Redis container for session-based conversation history. The services must communicate over a user-defined Docker bridge network, be orchestrated by a Docker Compose file, and the final API image must follow all production-level best practices covered in Chapter 10. The completed image must be pushed to Docker Hub.





Tasks




Task 1: Docker Networking (15 marks)


  • Create a user-defined bridge network called ai-network using docker network create.

  • Start a Redis container on ai-network with the name redis. Start a second container on ai-network as well, using an image that ships with curl (for example, curlimages/curl — note that python:3.11-slim does not include curl by default, so you would need to install it first).

  • From inside the second container, use curl to contact the Redis container by name (for example, curl redis:6379). curl cannot speak the Redis protocol, so expect a connection-level or protocol error — the point is that the hostname redis resolves at all, which demonstrates DNS-based service discovery on user-defined networks.

  • Run the same test using the default bridge network to demonstrate that container name DNS resolution does not work there. Include the output of both tests in your submission.

  • Use docker network inspect ai-network to show both containers listed as connected. Include the output.
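The Task 1 steps can be sketched as the following command sequence (a sketch, not required output — image choice and exact error messages will vary by host):

```bash
# Create the user-defined bridge network
docker network create ai-network

# Start Redis on the network with a fixed name
docker run -d --name redis --network ai-network redis:7-alpine

# Second container on the same network: the name "redis" resolves
docker run --rm --network ai-network curlimages/curl -v redis:6379
# curl cannot speak the Redis protocol, but the verbose output shows
# "redis" resolving to a container IP — that is the DNS proof.

# Repeat on the default bridge: name resolution fails
docker run --rm curlimages/curl -v redis:6379
# Expected: "Could not resolve host: redis"

# Show both containers attached to the network
docker network inspect ai-network
```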





Task 2: Docker Compose Stack (20 marks)


  • Write a docker-compose.yml file that defines two services: api (your FastAPI application, built from a local Dockerfile) and redis (using the official redis:7-alpine image).

  • Both services must be connected to a named network called ai-network defined in the Compose file. The api service must declare a dependency on redis using depends_on.

  • Pass the OpenAI API key to the api service using env_file: .env. Do not hardcode any secrets in the Compose file.

  • Configure a named volume for Redis data persistence (redis-data mounted to /data). Set the redis service restart policy to unless-stopped.

  • Use docker compose up -d to start the stack, then use docker compose ps and docker compose logs api to verify both services are running. Include the output.

  • Demonstrate docker compose down, then docker compose up -d again, and confirm Redis data persists across restarts.
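A minimal Compose file satisfying these requirements might look like the following (the 8000:8000 port mapping and the build context `.` are assumptions — adjust them to your Dockerfile; you may also pass `--appendonly yes` to redis-server for stronger durability):

```yaml
services:
  api:
    build: .
    ports:
      - "8000:8000"
    env_file: .env          # OPENAI_API_KEY lives here, never in this file
    depends_on:
      - redis
    networks:
      - ai-network

  redis:
    image: redis:7-alpine
    restart: unless-stopped
    volumes:
      - redis-data:/data    # named volume: data survives compose down/up
    networks:
      - ai-network

networks:
  ai-network:

volumes:
  redis-data:
```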




Task 3: Dockerize an AI REST API (30 marks)


  • Build a FastAPI application (main.py) with the following endpoints: POST /chat (accepts session_id and message, returns AI response), GET /health (returns service status), and GET /history/{session_id} (returns full conversation history for a session).

  • Use Redis to store conversation history per session_id. Each session key must expire after 3600 seconds. Use the REDIS_HOST environment variable to connect to Redis so the host can be changed without rebuilding the image.

  • Add a HEALTHCHECK instruction to your Dockerfile that calls GET /health every 30 seconds with a 10-second timeout and 3 retries.

  • Raise an appropriate HTTPException if the OpenAI API call fails. Return HTTP 503 with a clear error message.

  • Test all three endpoints using curl or a browser. Include the curl commands and responses in your submission. Demonstrate at least two separate sessions running concurrently with independent conversation histories.
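One way to structure the per-session history logic behind these endpoints is a pair of small helpers (a sketch: the function names `append_message`/`get_history` and the `session:` key prefix are my own; `client` stands for a real `redis.Redis` instance, which exposes the same `rpush`/`lrange`/`expire` methods):

```python
import json

SESSION_TTL = 3600  # seconds, per the task requirement


def append_message(client, session_id: str, role: str, content: str) -> None:
    """Append one message to a session's history and refresh its TTL."""
    key = f"session:{session_id}"
    client.rpush(key, json.dumps({"role": role, "content": content}))
    client.expire(key, SESSION_TTL)  # sliding expiry: key dies 3600s after last write


def get_history(client, session_id: str) -> list[dict]:
    """Return the full conversation history, oldest first.

    An expired or unknown session yields an empty list, so /chat can
    treat it as a fresh conversation instead of failing.
    """
    raw = client.lrange(f"session:{session_id}", 0, -1)
    return [json.loads(item) for item in raw]
```

In the real application, `client` would come from `redis.Redis(host=os.environ.get("REDIS_HOST", "redis"), decode_responses=True)`, satisfying the environment-variable requirement above.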




Task 4: Apply Docker Best Practices (20 marks)


  • Rewrite your Dockerfile as a multi-stage build. Use a builder stage to install dependencies and a final stage that copies only the installed packages and application code. The final image must not contain pip, build tools, or any intermediate build artifacts.

  • Create a non-root user in the Dockerfile (for example, appuser). Copy application files with the correct ownership using COPY --chown and switch to that user with USER before the CMD instruction.

  • Optimise layer caching by copying requirements.txt and running pip install before copying the application source code. Verify this by making a small change to main.py and rebuilding — only the final COPY and CMD layers should be invalidated.

  • Run docker scout quickview on your final image and document the vulnerability summary. If any critical or high CVEs are reported, explain what they relate to (you do not need to fix them, only identify them).

  • Compare the final image size against a single-stage equivalent. Record both sizes and explain what the multi-stage build removed.
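A multi-stage Dockerfile combining these requirements could be sketched as follows (port 8000 and the urllib-based health probe are assumptions — slim images lack curl, so the check reuses the Python interpreter already in the image):

```dockerfile
# --- builder stage: has pip and build tools, discarded from the final image ---
FROM python:3.11-slim AS builder
RUN python -m venv /opt/venv
ENV PATH="/opt/venv/bin:$PATH"
# Copy requirements.txt alone first so this layer stays cached until deps change
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# --- final stage: only the venv and the application source ---
FROM python:3.11-slim
RUN useradd --create-home appuser
COPY --from=builder /opt/venv /opt/venv
ENV PATH="/opt/venv/bin:$PATH"
WORKDIR /app
COPY --chown=appuser:appuser main.py .
USER appuser
HEALTHCHECK --interval=30s --timeout=10s --retries=3 \
  CMD python -c "import urllib.request; urllib.request.urlopen('http://localhost:8000/health')"
CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8000"]
```

Copying the whole virtual environment out of the builder stage is what leaves pip and the build tools behind, which is where the size difference comes from.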




Task 5: Push to Docker Hub and Document (15 marks)


  • Log in to Docker Hub using docker login. Tag your final image using the format yourdockerhubusername/ai-chat-api:1.0.0 following semantic versioning.

  • Push the image to Docker Hub using docker push. Confirm the push succeeded by pulling the image on a different terminal session (or after docker rmi) and running it.

  • Write a README.md that includes: a description of the application, prerequisites (Docker, Docker Compose, an OpenAI API key), step-by-step instructions to run the stack using docker compose up, the Docker Hub image URL, and the purpose of each environment variable.

  • Add a second tag, yourdockerhubusername/ai-chat-api:latest, to the same image and push it. Explain in 50 to 80 words why maintaining both a versioned tag and a latest tag is a production best practice.
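The tagging and push workflow above can be sketched as (substitute your own Docker Hub username; the final `docker run` also needs a reachable Redis, so in practice you would re-verify through `docker compose up`):

```bash
docker login
docker build -t yourdockerhubusername/ai-chat-api:1.0.0 .
docker tag yourdockerhubusername/ai-chat-api:1.0.0 yourdockerhubusername/ai-chat-api:latest
docker push yourdockerhubusername/ai-chat-api:1.0.0
docker push yourdockerhubusername/ai-chat-api:latest

# Verify the round trip: remove the local copy, then pull and run from the registry
docker rmi yourdockerhubusername/ai-chat-api:1.0.0
docker run --rm -p 8000:8000 --env-file .env yourdockerhubusername/ai-chat-api:1.0.0
```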





Evaluation Rubric

Criteria                               Marks

Docker Networking                      15

Docker Compose Stack                   20

Dockerized AI REST API                 30

Docker Best Practices                  20

Docker Hub Push and Documentation      15

Total                                  100





Deliverables


  • Project directory containing: main.py, Dockerfile, docker-compose.yml, requirements.txt, .dockerignore, .env.example, and README.md.

  • Terminal output or screenshots showing: network DNS test, docker compose up/down/up with volume persistence, all three API endpoints working, multi-stage build layer comparison, and docker scout output.

  • Docker Hub image URL (public repository) for yourdockerhubusername/ai-chat-api.

  • A completed README.md with setup instructions, environment variable descriptions, and the Docker Hub link.




Submission Guidelines


Submit your work via the course LMS (for example, Moodle or Google Classroom).


File Naming Convention: <YourName>_Docker_Assignment2.zip


Inside the ZIP: 


  • main.py

  • Dockerfile

  • docker-compose.yml

  • requirements.txt

  • .dockerignore

  • .env.example

  • README.md

  • screenshots/ (folder with terminal output images)


Deadline: 7 days from the date of release.




Late Submission Policy


  • Up to 24 hours late: 10% penalty applied to the final mark.

  • 24 to 48 hours late: 20% penalty applied to the final mark.

  • Beyond 48 hours: submission will not be accepted.




Important Instructions


  • Never include your real OpenAI API key in any submitted file. Use .env.example with a placeholder value.

  • The multi-stage Dockerfile must produce a meaningfully smaller image than a single-stage build. A size difference of less than 5 MB suggests the builder stage was not correctly separated.

  • The non-root user must be set before the CMD instruction. Running as root inside a container is a security anti-pattern and will result in a deduction in Task 4.

  • Your Docker Hub repository must be public so the submission can be verified by pulling the image.

  • Plagiarism of any kind will result in disqualification from the assignment.




Guidance and Tips


  • Use depends_on in your Compose file, but do not rely on it alone for readiness. Redis starts faster than FastAPI in most cases, but add a retry loop in your application code to handle the case where Redis is not yet ready when the API starts.

  • Test your HEALTHCHECK by running docker inspect --format '{{.State.Health.Status}}' <container>. The status should transition from starting to healthy within 90 seconds (three 30-second intervals).

  • When writing the multi-stage Dockerfile, install packages into a virtual environment in the builder stage so you can COPY the entire venv into the final stage cleanly.

  • Do not use COPY . . as the first instruction in your Dockerfile. Always copy requirements.txt first, run pip install, then copy the rest of the source code.

  • Think about what happens when a user sends a message with a session_id that has expired in Redis. Handle this case gracefully rather than raising an unhandled exception.
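The retry-loop tip above can be sketched as a small startup helper (a generic sketch: `connect` stands for whatever function creates and pings your Redis client; in the real app you would catch `redis.ConnectionError` rather than bare `Exception`):

```python
import time


def connect_with_retry(connect, attempts: int = 10, delay: float = 0.5):
    """Call connect() until it succeeds, sleeping `delay` seconds between tries.

    Re-raises the last error if every attempt fails, so startup aborts
    loudly instead of serving requests with no Redis behind them.
    """
    last_error = None
    for _ in range(attempts):
        try:
            return connect()
        except Exception as exc:  # narrow to redis.ConnectionError in main.py
            last_error = exc
            time.sleep(delay)
    raise last_error
```

In main.py this would wrap the initial `redis.Redis(...).ping()` call at application startup, absorbing the window where Compose has started the Redis container but the server is not yet accepting connections.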




Bonus (Optional — up to +10 Marks)


  • Add a DELETE /history/{session_id} endpoint that clears the conversation history for a session and returns a confirmation message.

  • Add an Nginx service to your Docker Compose stack as a reverse proxy in front of the FastAPI service, mirroring the architecture from Chapter 11.

  • Set up a GitHub Actions workflow that builds and pushes your Docker image to Docker Hub automatically on every push to the main branch.
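For the last bonus item, a minimal workflow might look like this (a sketch: the secret names DOCKERHUB_USERNAME and DOCKERHUB_TOKEN are my own choices and must be created in your repository settings; it pushes only the latest tag for brevity):

```yaml
name: build-and-push
on:
  push:
    branches: [main]
jobs:
  docker:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: docker/login-action@v3
        with:
          username: ${{ secrets.DOCKERHUB_USERNAME }}
          password: ${{ secrets.DOCKERHUB_TOKEN }}
      - uses: docker/build-push-action@v6
        with:
          context: .
          push: true
          tags: yourdockerhubusername/ai-chat-api:latest
```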





Instructor Note


This assignment is designed to simulate the kind of multi-service AI infrastructure work you will encounter in real projects. There is no single correct Compose file or API design. What matters is that your services communicate correctly, your secrets are handled safely, your image follows production best practices, and you can clearly explain every decision you made. A well-documented project with clear reasoning is always more valuable than a working project with no explanation.





Call to Action

Ready to transform your business with AI-powered intelligence that accelerates insights, enhances decision-making, and unlocks the full value of your data?


Codersarts is here to help you turn complex data workflows into efficient, scalable, and evidence-driven AI systems that empower teams to make smarter, faster, and more confident decisions.


Whether you’re a startup looking to build AI-driven products, an enterprise aiming to optimize operations through data science, or a research organization advancing innovation with intelligent data solutions, we bring the expertise and experience needed to design, develop, and deploy impactful AI systems that drive measurable business outcomes.




Get Started Today



Schedule an AI & Data Science Consultation:

Book a 30-minute discovery call with our AI strategists and data science experts to discuss your challenges, identify high-impact opportunities, and explore how intelligent AI solutions can transform your workflows and performance.




Request a Custom AI Demo:

Experience AI in action with a personalized demonstration built around your business use cases, datasets, operational environment, and decision workflows — showcasing practical value and real-world impact.









Transform your organization from data accumulation to intelligent decision enablement — accelerating insight generation, improving operational efficiency, and strengthening competitive advantage.


Partner with Codersarts to build scalable AI solutions including RAG systems, predictive analytics platforms, intelligent automation tools, recommendation engines, and custom machine learning models that empower your teams to deliver exceptional results.


Contact us today and take the first step toward next-generation AI and data science capabilities that grow with your business ambitions.



