Deploying a Monorepo Application via Docker Containers on an EC2 Server with CI/CD using GitHub Actions
Introduction
Deploying a monorepo application using Docker containers on an EC2 instance and setting up CI/CD with GitHub Actions can streamline your development and deployment workflow. In this blog, we'll walk through the entire process, from setting up an EC2 instance, containerizing the application, and deploying it using Docker, to automating the CI/CD pipeline with GitHub Actions.
What is a Monorepo?
A monorepo (monolithic repository) is a single repository that contains multiple services or applications within the same codebase. Unlike polyrepos, where each service has its own repository, a monorepo allows for better code sharing, simplified dependency management, and a unified CI/CD pipeline.
Benefits of a Monorepo
Easier Code Sharing: Shared libraries and utilities across multiple services can be managed more effectively.
Simplified Dependency Management: Ensures all services use compatible versions of dependencies.
Unified CI/CD Pipelines: One pipeline can manage multiple services, improving consistency.
Atomic Changes: Changes across multiple services can be committed and deployed together.
Step 1: Setting Up an EC2 Instance
1.1 Launch an EC2 Instance
Go to the AWS Console and navigate to the EC2 Dashboard.
Click on Launch Instance.
Choose an appropriate Amazon Machine Image (AMI) like Ubuntu 22.04 or Amazon Linux 2.
Select an instance type (e.g., t2.medium for moderate workloads).
Configure storage and networking as per your needs.
Under security groups, allow ports 22 (for SSH) and 80/443 (for HTTP/HTTPS access).
Generate and download a key pair for SSH access.
Click Launch.
1.2 Connect to the Instance
Once the instance is running, connect to it via SSH:
ssh -i your-key.pem ubuntu@your-ec2-public-ip
1.3 Install Docker and Docker Compose
Run the following commands to install Docker and Docker Compose:
sudo apt update && sudo apt install -y docker.io
sudo systemctl start docker
sudo systemctl enable docker
sudo usermod -aG docker $USER
Log out and back in (or run newgrp docker) so the group membership takes effect.
Then, install Docker Compose:
sudo curl -L "https://github.com/docker/compose/releases/latest/download/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose
sudo chmod +x /usr/local/bin/docker-compose
Verify the installation:
docker --version
docker-compose --version
Step 2: Creating a Simple Monorepo Application
2.1 Initialize a Monorepo
Let's create a simple monorepo structure with a backend (Node.js) and a frontend (React):
mkdir monorepo-app && cd monorepo-app
git init
mkdir backend frontend
echo "node_modules/" > .gitignore
Backend (Express App)
cd backend
npm init -y
npm install express
Create backend/server.js:
const express = require('express');
const app = express();
app.get('/', (req, res) => res.send('Backend Running'));
app.listen(5000, () => console.log('Server running on port 5000'));
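To quickly verify the backend locally (assuming you are still in the backend directory), run it and hit the endpoint from another terminal:

node server.js
curl http://localhost:5000   # should print "Backend Running"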
Frontend (React App)
cd ../frontend
npx create-react-app .
npm start
Scaffolding the React app directly into the frontend directory (rather than a nested my-app folder) keeps package.json next to the Dockerfile we add in Step 3, which is what docker-compose expects when building ./frontend.
Now, our monorepo has two services under a single repository.
Step 3: Containerizing the Monorepo Application
3.1 Define Dockerfiles for Each Service
Backend (Node.js Express App)
Create backend/Dockerfile:
FROM node:18-alpine
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
EXPOSE 5000
CMD ["node", "server.js"]
Frontend (React App)
Create frontend/Dockerfile:
FROM node:18-alpine
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
# Produce a production build (useful if you later serve the static files instead of the dev server)
RUN npm run build
EXPOSE 3000
# For simplicity, this guide serves the app with the CRA dev server on port 3000
CMD ["npm", "start"]
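The rest of this guide uses the dev-server image above, but for production a common alternative is a multi-stage build that serves the static bundle with Nginx. A minimal sketch:

# Stage 1: build the React app
FROM node:18-alpine AS build
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
RUN npm run build

# Stage 2: serve the static files with Nginx
FROM nginx:alpine
COPY --from=build /app/build /usr/share/nginx/html
EXPOSE 80

If you adopt this variant, remember to change the frontend port mapping in docker-compose.yml from "80:3000" to "80:80".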
3.2 Define Docker Compose File
Create a docker-compose.yml file at the root of your monorepo:
version: '3.8'
services:
  backend:
    build: ./backend
    ports:
      - "5000:5000"
    depends_on:
      - database
  frontend:
    build: ./frontend
    ports:
      - "80:3000"
    depends_on:
      - backend
  database:
    image: postgres:latest
    environment:
      POSTGRES_USER: user
      POSTGRES_PASSWORD: password
      POSTGRES_DB: mydb
3.3 Test Locally
Run the following command to build and start the containers:
docker-compose up --build
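Once the containers are up, you can sanity-check them (assuming the default port mappings from the compose file above):

docker-compose ps
curl http://localhost:5000   # backend should respond with "Backend Running"

The frontend should be reachable at http://localhost, since host port 80 maps to port 3000 in the container.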
Step 4: Setting Up CI/CD with GitHub Actions
4.1 Create a GitHub Actions Workflow
Inside your monorepo, create a .github/workflows/deploy.yml file:
name: Deploy to EC2

on:
  push:
    branches:
      - main

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout Repository
        uses: actions/checkout@v3

      - name: Set up SSH
        run: |
          echo "${{ secrets.SSH_PRIVATE_KEY }}" > key.pem
          chmod 600 key.pem

      - name: Deploy to EC2
        run: |
          ssh -i key.pem -o StrictHostKeyChecking=no ubuntu@${{ secrets.EC2_HOST }} << 'EOF'
          cd /home/ubuntu/app
          git pull origin main
          docker-compose down
          docker-compose up --build -d
          EOF
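Note that this workflow assumes the repository has already been cloned to /home/ubuntu/app on the instance and that the instance can pull from it (e.g., a public repo, or one configured with a deploy key). A one-time setup might look like this (replace the placeholder URL with your repository):

ssh -i your-key.pem ubuntu@your-ec2-public-ip
git clone https://github.com/<your-username>/monorepo-app.git /home/ubuntu/app
cd /home/ubuntu/app
docker-compose up --build -d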
4.2 Configure GitHub Secrets
Go to Settings > Secrets and Variables > Actions in your GitHub repo.
Add a new secret named SSH_PRIVATE_KEY and paste the contents of your private key (your-key.pem).
Add another secret named EC2_HOST with your EC2 public IP; the deploy step references it as ${{ secrets.EC2_HOST }}.
Step 5: Running the Deployment
Once everything is set up, pushing code to the main branch will trigger the GitHub Actions workflow, which will:
SSH into the EC2 instance.
Pull the latest changes.
Restart the Docker containers with the latest code.
You can check workflow runs in GitHub Actions > Workflows.
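If you also want to trigger a deployment manually from the Actions tab, you can optionally extend the on: block of the workflow with a workflow_dispatch trigger:

on:
  push:
    branches:
      - main
  workflow_dispatch:   # enables the "Run workflow" button in the Actions tab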
Conclusion
In this guide, we walked through deploying a monorepo-based application using Docker containers on an EC2 instance and automating the deployment process using GitHub Actions. This setup ensures that your services remain up-to-date and minimizes manual intervention.
For production, you may want to integrate a reverse proxy (e.g., Nginx) and use environment variables to manage secrets securely.
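For example, the hard-coded Postgres credentials in docker-compose.yml could be moved into a .env file next to it, which Docker Compose reads automatically (the values below are placeholders, and .env should be added to .gitignore):

POSTGRES_USER=user
POSTGRES_PASSWORD=change-me
POSTGRES_DB=mydb

The compose file then references them instead of hard-coding values:

  database:
    image: postgres:latest
    environment:
      POSTGRES_USER: ${POSTGRES_USER}
      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
      POSTGRES_DB: ${POSTGRES_DB}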
Now, go ahead and deploy your monorepo with confidence!