In this article, we’ll explore how AWS CloudFormation simplifies setting up and managing cloud infrastructure. Instead of manually creating resources like servers or databases, you can write down your requirements in a file, and CloudFormation does the heavy lifting for you. This approach, known as Infrastructure as Code (IaC), saves time, reduces errors, and ensures everything is consistent.
We’ll also look at how Docker and GitHub Actions fit into the process. Docker makes it easy to package and run your application, while GitHub Actions automates tasks like testing and deployment. Together with CloudFormation, these tools create a powerful workflow for building and deploying applications in the cloud.
This article was published as a part of the Data Science Blogathon.
In the world of cloud computing, managing infrastructure efficiently is crucial. This is where AWS CloudFormation comes into the picture: it makes it easier to set up and manage your cloud resources, letting you define everything you need (servers, storage, and networking) in a single file.
AWS CloudFormation is a service that helps you define and manage your cloud resources using templates written in YAML or JSON. Think of it as creating a blueprint for your infrastructure. Once you hand over this blueprint, CloudFormation takes care of setting everything up, step by step, exactly as you described.
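For example, a minimal, illustrative template might look like the sketch below; the bucket and its name are placeholders to show the blueprint idea, not part of this project:

```yaml
# A tiny CloudFormation "blueprint": one S3 bucket is the entire stack
AWSTemplateFormatVersion: '2010-09-09'
Description: A minimal example stack

Resources:
  ExampleBucket:
    Type: AWS::S3::Bucket
    Properties:
      BucketName: my-example-bucket-12345   # illustrative; bucket names must be globally unique
```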
Infrastructure as Code (IaC) is like turning your cloud into something you can build, rebuild, and even improve with just a few lines of code. No more manual clicking around, no more guesswork: just consistent, reliable deployments that save you time and reduce errors.
Streamlining Code Documentation with AI: The Document Generation Project
To get started with CloudFormation, we need a sample project to deploy on AWS.
I have already created a project using LangChain and OpenAI GPT-4. Let's discuss that project first, and then look at how it is deployed on AWS using CloudFormation.
GitHub code link: https://github.com/Harshitha-GH/CloudFormation
In the world of software development, documentation plays a major role in ensuring codebases are comprehensible and maintainable. However, creating detailed documentation is often time-consuming and tedious. But we are techies; we want automation in everything. So, as the project to deploy on AWS using CloudFormation, I built an automation project with AI (LangChain and OpenAI GPT-4): the Document Generation Project, an innovative solution that uses AI to automate the documentation process for Python code.
Here's a breakdown of how we built this tool and the impact it aims to create. To build this project, we follow a few steps.
Before starting a new project, we have to create a Python virtual environment to install all required packages. This helps us keep the necessary packages isolated and manageable.
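For example, on Linux or macOS, the environment can be set up like this (assuming a requirements.txt that lists packages such as Flask, LangChain, and OpenAI):

```bash
# Create and activate an isolated environment for the project
python -m venv venv
source venv/bin/activate      # on Windows: venv\Scripts\activate

# Install the project's dependencies
pip install -r requirements.txt
```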
I wrote a function to parse the input file, which takes a Python file as input and extracts the details of all its functions.
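The full parser lives in the linked repository; a minimal sketch of the idea, built on Python's standard ast module, might look like this (the function name is illustrative):

```python
import ast

def parse_python_file(file_path):
    """Extract the name, arguments, and docstring of every function in a Python file."""
    with open(file_path, "r") as f:
        tree = ast.parse(f.read())

    functions = []
    for node in ast.walk(tree):
        if isinstance(node, ast.FunctionDef):
            functions.append({
                "function_name": node.name,
                "arguments": [arg.arg for arg in node.args.args],
                "docstring": ast.get_docstring(node) or "No docstring provided",
            })
    return functions
```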
Once the function details are extracted, the next step is to feed them into OpenAI's GPT-4 model to generate detailed documentation. Using LangChain, we construct a prompt that explains the task we want GPT-4 to perform.
```python
from langchain.prompts import PromptTemplate

prompt_template = PromptTemplate(
    input_variables=["function_name", "arguments", "docstring"],
    template=(
        "Generate detailed documentation for the following Python function:\n\n"
        "Function Name: {function_name}\n"
        "Arguments: {arguments}\n"
        "Docstring: {docstring}\n\n"
        "Provide a clear description of what the function does, its parameters, "
        "and the return value."
    ),
)
```
With the help of this prompt, the doc-generator function takes the parsed details and generates a complete, human-readable explanation for each function.
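As a rough sketch (the actual implementation is in the repo), the generator might wire the prompt above to GPT-4 through LangChain's classic LLMChain API like this:

```python
from langchain.chains import LLMChain
from langchain.chat_models import ChatOpenAI

llm = ChatOpenAI(model_name="gpt-4", temperature=0)  # assumes OPENAI_API_KEY is set
doc_chain = LLMChain(llm=llm, prompt=prompt_template)

def generate_documentation(parsed_functions):
    """Run the prompt once per parsed function and collect the generated docs."""
    docs = {}
    for fn in parsed_functions:
        docs[fn["function_name"]] = doc_chain.run(
            function_name=fn["function_name"],
            arguments=", ".join(fn["arguments"]),
            docstring=fn["docstring"],
        )
    return docs
```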
To make the tool user-friendly, I built a Flask API where users can upload Python files. The API parses the file, generates the documentation using GPT-4, and returns it in JSON format.
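A stripped-down version of such an API, reusing the helpers sketched above, might look like this (the /generate-docs route is an illustrative name, not necessarily the repo's):

```python
from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route("/generate-docs", methods=["POST"])
def generate_docs():
    """Accept an uploaded Python file and return the generated docs as JSON."""
    uploaded = request.files.get("file")
    if uploaded is None:
        return jsonify({"error": "No file uploaded"}), 400

    path = "/tmp/uploaded.py"
    uploaded.save(path)
    functions = parse_python_file(path)                 # parser sketch above
    documentation = generate_documentation(functions)   # generator sketch above
    return jsonify(documentation)

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```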
We can test this Flask API using Postman to check our output.
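For example, assuming the upload endpoint sketched above, the equivalent request from the command line would be:

```bash
curl -X POST -F "file=@example.py" http://localhost:5000/generate-docs
```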
To deploy our application on AWS, we need to containerize it using Docker and then use GitHub Actions to automate the deployment process. We will use AWS CloudFormation for the automation on the AWS side. Service-wise, we will use Elastic Container Registry (ECR) to store our container images and EC2 to run the application. Let us see this step by step.
First, we will create the Dockerfile, which is responsible for building the image our container runs from:

```dockerfile
# Use the official Python 3.11-slim image as the base image
FROM python:3.11-slim

# Prevent Python from writing .pyc files and buffering output
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1

# Set the working directory inside the container
WORKDIR /app

# Install system dependencies required for Python packages and clean up the apt cache
RUN apt-get update && apt-get install -y --no-install-recommends \
    gcc \
    libffi-dev \
    libpq-dev \
    python3-dev \
    build-essential \
    && rm -rf /var/lib/apt/lists/*

# Copy the requirements file to the working directory
COPY requirements.txt /app/

# Upgrade pip and install Python dependencies without cache
RUN pip install --no-cache-dir --upgrade pip && \
    pip install --no-cache-dir -r requirements.txt

# Copy the entire application code to the working directory
COPY . /app/

# Expose port 5000 for the application
EXPOSE 5000

# Run the application using Python
CMD ["python", "app.py"]
```

Once the Dockerfile is created, we will create a Docker Compose file that spins up the container:

```yaml
version: '3.8'

services:
  app:
    build:
      context: .
      dockerfile: Dockerfile
    ports:
      - "5000:5000"
    volumes:
      - .:/app
    environment:
      - PYTHONDONTWRITEBYTECODE=1
      - PYTHONUNBUFFERED=1
    command: ["python", "app.py"]
```

You can test this by running the command:

```bash
docker-compose up --build
```

After the command executes successfully, the code will function exactly as it did before.
Next, I create an ECR repository. The other required services will be created later through GitHub Actions. The repository I created has the namespace cloud_formation and the repo name demo. Then, I will proceed with the CloudFormation template, a YAML file that spins up the required instance, pulls the image from ECR, and sets up other resources.
Instead of manually setting up servers and connecting everything, AWS CloudFormation sets up and manages cloud resources (like servers or databases) automatically from a script. It's like handing over a blueprint to build and organize your cloud resources without doing it by hand!
Think of CloudFormation as writing a simple instruction manual for AWS to follow. This manual, called a 'template', tells AWS which resources to create and how to configure them.
By using this automated setup, I don’t have to repeat the same steps every time I deploy or update the project — it’s all done automatically by AWS.
AWS CloudFormation templates are declarative JSON or YAML scripts that describe the resources and configurations needed to set up your infrastructure in AWS. They enable you to automate and manage your infrastructure as code, ensuring consistency and repeatability across environments.
Let's decode the template step by step:
We are defining a single ECR resource, which is the repository where our Docker image is stored.
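In template form, that resource might look like this (the logical name is illustrative; the repository name follows the article's naming):

```yaml
Resources:
  # Logical name is illustrative; namespace cloud_formation, repo name demo
  DemoEcrRepository:
    Type: AWS::ECR::Repository
    Properties:
      RepositoryName: cloud_formation/demo
```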
Next, we create an EC2 instance. We'll attach essential policies to it, mainly for interacting with ECR and AWS Secrets Manager. Additionally, we attach a Security Group to control network access. For this setup, we open the application port (5000) and, typically, SSH (port 22) for administration.
A t2.micro instance will be used, and inside the User Data section we define the instructions to configure the instance: install Docker, log in to ECR, pull the image, and run the container.
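A hedged sketch of what this part of the template might look like is below; the AMI ID is a placeholder, and the instance profile granting ECR and Secrets Manager access is assumed to be defined elsewhere in the template:

```yaml
Resources:
  AppSecurityGroup:
    Type: AWS::EC2::SecurityGroup
    Properties:
      GroupDescription: Allow application traffic and SSH
      SecurityGroupIngress:
        - IpProtocol: tcp
          FromPort: 5000
          ToPort: 5000
          CidrIp: 0.0.0.0/0
        - IpProtocol: tcp
          FromPort: 22
          ToPort: 22
          CidrIp: 0.0.0.0/0

  AppInstance:
    Type: AWS::EC2::Instance
    Properties:
      InstanceType: t2.micro
      ImageId: ami-xxxxxxxxxxxxxxxxx          # placeholder: an Amazon Linux 2 AMI for your region
      IamInstanceProfile: !Ref AppInstanceProfile   # assumed defined elsewhere in the template
      SecurityGroupIds:
        - !GetAtt AppSecurityGroup.GroupId
      UserData:
        Fn::Base64: !Sub |
          #!/bin/bash
          yum update -y
          yum install -y docker
          systemctl enable --now docker
          # Log in to ECR and run the container; repository name follows the article's naming
          aws ecr get-login-password --region ${AWS::Region} | \
            docker login --username AWS --password-stdin ${AWS::AccountId}.dkr.ecr.${AWS::Region}.amazonaws.com
          docker run -d -p 5000:5000 \
            ${AWS::AccountId}.dkr.ecr.${AWS::Region}.amazonaws.com/cloud_formation/demo:latest
```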
Since only one Docker container is being used, this configuration simplifies the deployment process, while ensuring the backend service is accessible and properly configured.
So far, we have stored secrets like the OpenAI key in a config.py file. But we cannot push this file to GitHub, as it contains secrets. So, we use AWS Secrets Manager to store our secrets and then retrieve them through our CloudFormation template.
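As a sketch, the application could also fetch the key at runtime with boto3 (the secret name and JSON key here are illustrative):

```python
import json
import boto3

def get_openai_key(secret_name="openai-api-key", region="us-east-1"):
    """Fetch the OpenAI API key from AWS Secrets Manager."""
    client = boto3.client("secretsmanager", region_name=region)
    response = client.get_secret_value(SecretId=secret_name)
    return json.loads(response["SecretString"])["OPENAI_API_KEY"]
```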
GitHub Actions is used to automate tasks like testing code, building apps, or deploying projects whenever you make changes. It's like setting up a robot to handle repetitive work for you!
Our main goal here is that as soon as we push to a specific branch of GitHub, the deployment to AWS should start automatically. For this, we will select the main branch.
Sign in to GitHub and follow the path below:
repository > settings > Secrets and variables > Actions
Then add the AWS credentials extracted from your AWS account (typically AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY) as repository secrets.
After storing, we will create a .github folder and, within it, a workflows folder. Inside the workflows folder, we will add a deploy.yaml file.
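The exact workflow lives in the repository; a representative deploy.yaml, assuming the image goes to the cloud_formation/demo repository and the template is saved as template.yaml, might look like this:

```yaml
name: Deploy to AWS

on:
  push:
    branches: [ main ]

jobs:
  build-and-deploy:
    runs-on: ubuntu-latest
    steps:
      - name: Check out the code
        uses: actions/checkout@v4

      - name: Configure AWS credentials
        uses: aws-actions/configure-aws-credentials@v4
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: us-east-1   # illustrative region

      - name: Log in to Amazon ECR
        id: ecr-login
        uses: aws-actions/amazon-ecr-login@v2

      - name: Build, tag, and push the Docker image
        run: |
          docker build -t ${{ steps.ecr-login.outputs.registry }}/cloud_formation/demo:latest .
          docker push ${{ steps.ecr-login.outputs.registry }}/cloud_formation/demo:latest

      - name: Deploy the CloudFormation stack
        run: |
          aws cloudformation deploy \
            --template-file template.yaml \
            --stack-name doc-generator-stack \
            --capabilities CAPABILITY_NAMED_IAM
```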
Here's a simplified explanation of the flow: a push to the main branch triggers the workflow, which builds the Docker image, pushes it to ECR, and deploys the CloudFormation stack; the stack then launches the EC2 instance, which pulls the image from ECR and runs the container.
Once everything is deployed, note down the public IP address of the instance and call the API (for example, with Postman) to check that everything works fine.
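For example, assuming the endpoint from the earlier sketch (replace the placeholder with your instance's public IP):

```bash
curl -X POST -F "file=@example.py" http://<EC2-PUBLIC-IP>:5000/generate-docs
```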
In this article, we explored how to use AWS CloudFormation to simplify cloud infrastructure management. We learnt how to create an ECR repository, deploy a Dockerized application on an EC2 instance, and automate the entire process using GitHub Actions for CI/CD. This approach not only saves time but also ensures consistency and reliability in deployments.
Q1. What is AWS CloudFormation?
A. AWS CloudFormation is a service that enables you to model and provision AWS resources using Infrastructure as Code (IaC).
Q2. How does Docker integrate with AWS CloudFormation?
A. Docker packages applications into containers, which can be deployed on AWS resources managed by CloudFormation.

Q3. What role does GitHub Actions play in this workflow?
A. GitHub Actions automates CI/CD pipelines, including building, testing, and deploying applications to AWS.

Q4. Can I automate Python documentation generation with LangChain?
A. Yes, LangChain and GPT-4 can generate and update Python documentation as part of your workflow.

Q5. What are the benefits of using IaC with AWS CloudFormation?
A. IaC ensures consistent, repeatable, and scalable resource management across your infrastructure.
The media shown in this article is not owned by Analytics Vidhya and is used at the Author’s discretion.