In today's fast-paced development landscape, seamless application deployment is critical to improving productivity and ensuring consistency across environments. For Python developers, AWS EC2 combined with Docker and GitHub automation offers an efficient and scalable way to deploy projects. This guide will walk you through deploying Python applications on AWS EC2, setting up Docker, and leveraging GitHub Actions for automated deployment.
Introduction to Deploying Python Projects on AWS EC2
Amazon EC2 is a popular choice for hosting applications due to its flexibility, scalability, and cost-effectiveness. It allows you to run applications on virtual servers in the cloud, making it an ideal choice for deploying Python-based projects. Docker can simplify the deployment process by containerizing the Python application, ensuring that it runs identically in any environment.
Additionally, GitHub Actions can automate the deployment process by triggering updates whenever code is pushed to a repository, reducing manual effort and minimizing errors.
Setting Up an EC2 Instance for Dockerized Applications
The first step before deploying your Python project is to set up an EC2 instance. Follow these steps to launch your instance:
- Log into AWS Console: Go to the EC2 dashboard and click “Launch Instance.”
- Select an AMI: Choose an Amazon Machine Image (AMI). The Amazon Linux 2 or Ubuntu AMI is a good choice for most use cases.
- Choose Instance Type: Depending on your application’s requirements, select an instance type. The t2.micro is suitable for small projects and is eligible for the free tier.
- Configure Security Groups: Set up security groups to allow access via SSH and any other ports you’ll need (e.g., port 80 for HTTP traffic).
- Launch Instance: Once all configurations are in place, launch the instance and connect via SSH.
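To connect, use the key pair you selected at launch. For example (the key path is a placeholder; the default username is ec2-user on Amazon Linux 2 and ubuntu on Ubuntu):
ssh -i /path/to/your-key.pem ec2-user@<your-ec2-public-ip>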
Connecting EC2 to GitHub for Automated Deployment
You’ll need to connect your EC2 instance to your GitHub repository to automate the deployment process. This can be done by generating SSH keys and adding the public key to GitHub:
- Generate SSH Keys:
Run the following command on your EC2 instance:
ssh-keygen -t rsa -b 4096 -C "your_email@example.com"
Copy the public key using:
cat ~/.ssh/id_rsa.pub
Add the public key to your GitHub account under Settings > SSH and GPG keys.
- Clone GitHub Repository:
On the EC2 instance, clone your Python project repository:
git clone git@github.com:yourusername/yourrepository.git
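To verify that GitHub accepts your key, you can run the following before or after cloning; it should respond with a short greeting that includes your GitHub username:
ssh -T git@github.com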
Installing Docker on EC2 for Containerization
With your EC2 instance and GitHub repository ready, the next step is to install Docker to containerize your Python application.
- Update the Package Manager:
sudo yum update -y
- Install Docker:
For Amazon Linux 2:
sudo amazon-linux-extras install docker
For Ubuntu:
sudo apt update
sudo apt install docker.io
- Start Docker and Enable on Boot:
sudo service docker start
sudo systemctl enable docker
- Add Your User to the Docker Group:
sudo usermod -aG docker ec2-user
Replace ec2-user with ubuntu if you chose the Ubuntu AMI, and log out and back in so the group change takes effect. Now, you're ready to use Docker to run your Python application inside a container.
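A quick way to confirm that Docker is installed and that your user can talk to the Docker daemon is to run the standard hello-world image:
docker run hello-world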
Cloning and Pulling GitHub Repositories to EC2
Once Docker is set up on your EC2 instance, the next step is to pull the latest changes from your GitHub repository. This can be done manually or automatically using GitHub Actions.
- Pull Repository Changes:
Navigate to your project folder:
cd yourrepository
Pull the latest changes:
git pull origin main
Running Docker Containers for Live Deployment
Now that the project is on your EC2 instance and Docker is installed, you can run your Python application inside a Docker container.
- Create a Dockerfile in your project’s root directory if you haven’t already:
FROM python:3.8-slim
WORKDIR /app
COPY . .
RUN pip install -r requirements.txt
CMD ["python", "app.py"]
- Build and Run the Docker Container:
Build the Docker image:
docker build -t my-python-app .
Run the container, mapping port 80 on the host to port 80 inside the container (a minimal app.py matching this setup is sketched after these steps):
docker run -d -p 80:80 my-python-app
Your Python application should now be running on your EC2 instance!
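For reference, the Dockerfile above expects an app.py in the project root, and the -p 80:80 mapping assumes the application listens on port 80 inside the container. Your real application will differ, but a minimal sketch using Flask (assuming Flask is listed in requirements.txt) could look like this:

# app.py - minimal example used only to illustrate the container setup
from flask import Flask

app = Flask(__name__)

@app.route("/")
def index():
    # Simple landing endpoint to confirm the deployment works
    return "Hello from EC2!"

if __name__ == "__main__":
    # Bind to all interfaces on port 80 so the -p 80:80 mapping above works
    app.run(host="0.0.0.0", port=80)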
Automating Deployment with GitHub Actions for Efficiency
Manual deployments can be cumbersome and error-prone. By using GitHub Actions, you can automate the process so that every time you push changes to the main branch, your EC2 instance automatically pulls the changes and redeploys the updated Docker container.
Create a .github/workflows/deploy.yml file in your GitHub repository:
name: Deploy to EC2

on:
  push:
    branches:
      - main

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout Code
        uses: actions/checkout@v2
      - name: Deploy to EC2
        uses: appleboy/ssh-action@v0.1.3
        with:
          host: ${{ secrets.EC2_HOST }}
          username: ${{ secrets.EC2_USER }}
          key: ${{ secrets.EC2_KEY }}
          script: |
            cd /path/to/project &&
            git pull origin main &&
            docker-compose down &&
            docker-compose up -d --build
- Add Secrets in GitHub:
- Go to Settings > Secrets in your GitHub repository and add the following secrets:
- EC2_HOST: Your EC2 public IP.
- EC2_USER: The username for the EC2 instance (e.g., ec2-user).
- EC2_KEY: The private SSH key to access your EC2 instance.
Once set up, every push to the main branch will trigger a deployment, ensuring that your EC2 instance is always running the latest version of your Python application.
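Note that the deployment script in this workflow runs docker-compose, which assumes a docker-compose.yml exists in the project directory on the EC2 instance. If you only have the Dockerfile from earlier, a minimal compose file along these lines (the service name web is illustrative) makes those commands work:

version: "3"
services:
  web:
    build: .
    ports:
      - "80:80"
    restart: unless-stopped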
Conclusion
By combining AWS EC2, Docker, and GitHub Actions, you can create a streamlined, automated deployment pipeline for your Python applications. This setup not only simplifies deployment but also ensures that your applications are consistently and reliably deployed with minimal manual intervention.
With the steps outlined in this guide, you can focus more on writing code and less on managing deployments, leveraging the power of cloud automation.