# Infrastructure Configuration

## Configuring the Virtual Private Cloud (VPC)

1. Access the **AWS Console** and locate the **VPC** service.
2. Initiate the creation of a new VPC by selecting the '**VPC and more**' option and name it `care-vpc`. This action will automatically generate a new VPC along with associated subnets, route tables, and an internet gateway.
3. The default settings will be applied automatically, but you can modify these according to your specific requirements.
4. Make sure **Internet Gateway** is attached to the VPC to enable external communication.
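
If you prefer the CLI, the same outcome can be approximated with the commands below. This is only a rough sketch: the CIDR blocks and Availability Zone are placeholder assumptions, and the console's '**VPC and more**' flow also creates route tables and subnet associations that are omitted here.

```bash
# Rough CLI sketch; CIDR blocks and AZ are placeholders — the console wizard handles the rest.
aws ec2 create-vpc --cidr-block 10.0.0.0/16 \
  --tag-specifications 'ResourceType=vpc,Tags=[{Key=Name,Value=care-vpc}]'
aws ec2 create-subnet --vpc-id <vpc-id> --cidr-block 10.0.1.0/24 --availability-zone ap-south-1a
aws ec2 create-internet-gateway
aws ec2 attach-internet-gateway --internet-gateway-id <igw-id> --vpc-id <vpc-id>
```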

## Configuring the Relational Database Service (RDS)

1. From the **AWS Console**, navigate to the **RDS** service.
2. Create a new database instance using the `PostgreSQL` engine.
3. Set the DB cluster identifier to `care-db`.
4. Set Credentials management to `Self managed` and provide a master username and password.
5. Choose the Availability Zone as per your requirements.
6. Configure the database instance size and storage capacity as needed.
7. Use the same VPC and subnet group that was created earlier.
8. Configure the security group to allow inbound traffic on port `5432` from all sources. (This can be restricted to the EC2 instance's internal IP address to enhance security.)
9. Set Public accessibility to `No` to restrict access to the database from the internet.
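
For reference, a roughly equivalent CLI invocation is shown below. This is a sketch only: the instance class, storage, credentials, subnet group, and security group are placeholders, and availability settings are left at their defaults.

```bash
# Rough CLI sketch; instance class, storage, credentials, subnet group and SG are placeholders.
aws rds create-db-instance \
  --db-instance-identifier care-db \
  --engine postgres \
  --db-instance-class db.t3.micro \
  --allocated-storage 20 \
  --master-username <master-username> \
  --master-user-password <master-password> \
  --db-subnet-group-name <care-db-subnet-group> \
  --vpc-security-group-ids <sg-id> \
  --no-publicly-accessible
```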

## Configuring the S3 Bucket

1. Locate the **S3** service in the **AWS Console**.
2. Create two new buckets and assign them the names `facility-data` and `patient-data`.
3. Adjust the permissions settings for each bucket: set `facility-data` to public and `patient-data` to private.
4. Configure the CORS policy for the `facility-data` and `patient-data` buckets to restrict access to specific domains after the deployment of the application.
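
A minimal CLI sketch of the bucket setup follows. Bucket names are globally unique, so the names here are assumptions; making `facility-data` public additionally involves the bucket's Block Public Access settings and a bucket policy, which are not shown.

```bash
# Rough sketch; region and CORS file path are assumptions.
aws s3api create-bucket --bucket facility-data --region ap-south-1 \
  --create-bucket-configuration LocationConstraint=ap-south-1
aws s3api create-bucket --bucket patient-data --region ap-south-1 \
  --create-bucket-configuration LocationConstraint=ap-south-1
# Apply a CORS policy once the application domains are known.
aws s3api put-bucket-cors --bucket facility-data --cors-configuration file://cors.json
```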

## Configuring the Elastic Compute Cloud (EC2) Instance

1. Access the **EC2** service via the **AWS Console**.
2. Launch a new instance and select the `Ubuntu` image.
3. Choose the `t2.micro` instance type to remain within the free tier. (You can adjust this based on your requirements.)
4. Choose the VPC and subnet that were created earlier.
5. Configure the security group to allow inbound traffic on ports `22`, `80`, and `443` from all sources.
6. Assign a key pair to the instance to facilitate SSH access.
7. Configure the storage settings as required.
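
The equivalent launch from the CLI would look roughly like this; the AMI ID, key pair, subnet, and security group are placeholders for the resources created above.

```bash
# Rough sketch; replace the placeholder IDs with the resources created above.
aws ec2 run-instances \
  --image-id <ubuntu-ami-id> \
  --instance-type t2.micro \
  --key-name <key-pair-name> \
  --subnet-id <public-subnet-id> \
  --security-group-ids <sg-id> \
  --tag-specifications 'ResourceType=instance,Tags=[{Key=Name,Value=care}]'
```
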
# Setting Up Build Pipelines

## Setting Up the ECR Repository

1. Navigate to the **ECR** service in the **AWS Console**.
2. Create private ECR repositories named `care` and `care_fe`.
3. Make sure that the repositories are **mutable**.
4. Use the **Default Encryption Key** for the repositories.
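
As a quick CLI sketch, the two repositories can be created as shown below; leaving out an encryption configuration uses the default encryption key.

```bash
# Rough sketch; the default encryption key is used when no encryption configuration is given.
aws ecr create-repository --repository-name care --image-tag-mutability MUTABLE
aws ecr create-repository --repository-name care_fe --image-tag-mutability MUTABLE
```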

## Setting Up SSM Documents

1. Navigate to the **SSM** service in the **AWS Console**.
2. Create a new document named `trigger-docker`.
3. Add the following content to the document:
```yaml
schemaVersion: "2.2"
description: "Trigger Docker Container"
mainSteps:
  - action: "aws:runShellScript"
    name: "triggerDocker"
    inputs:
      runCommand:
        - "bash /home/ubuntu/trigger-docker.sh"
```
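
The document can also be registered from the CLI; this sketch assumes the YAML above has been saved locally as `trigger-docker.yml`.

```bash
# Rough sketch; the local file name is an assumption.
aws ssm create-document \
  --name "trigger-docker" \
  --document-type "Command" \
  --document-format YAML \
  --content file://trigger-docker.yml
```
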
## Setting Up the CodeCommit Repository

1. Navigate to the **CodeCommit** service in the **AWS Console**.
2. Create a new repository named `infra-name`.
3. Add the files `build/react.env` and `build/plug_config.py` to the repository; a sketch for pushing them follows this list. (The `plug_config.py` file can be used to include plugins for `care`.)
4. Add the following content to the `react.env` file:
```env
REACT_PLAUSIBLE_SERVER_URL=https://plausible.example.com
REACT_HEADER_LOGO='{"light":"https://cdn.ohc.network/header_logo.png","dark":"https://cdn.ohc.network/header_logo.png"}'
REACT_MAIN_LOGO='{"light":"https://cdn.ohc.network/light-logo.svg","dark":"https://cdn.ohc.network/black-logo.svg"}'
REACT_GMAPS_API_KEY="examplekey"
REACT_GOV_DATA_API_KEY=""
REACT_RECAPTCHA_SITE_KEY=""
REACT_SENTRY_DSN=""
REACT_SAMPLE_FORMAT_ASSET_IMPORT=""
REACT_SAMPLE_FORMAT_EXTERNAL_RESULT_IMPORT=""
REACT_KASP_ENABLED=""
REACT_ENABLE_HCX=""
REACT_ENABLE_ABDM=""
REACT_ENABLE_SCRIBE=""
REACT_WARTIME_SHIFTING=""
REACT_OHCN_URL=""
REACT_PLAUSIBLE_SITE_DOMAIN="care.example.com"
REACT_SENTRY_ENVIRONMENT=""
REACT_CARE_API_URL="https://care.example.com"
REACT_DASHBOARD_URL=""
```
5. Add the following content to the `plug_config.py` file (if required):
```python
from plugs.manager import PlugManager
from plugs.plug import Plug

hcx_plugin = Plug(
    name="hcx",
    package_name="git+https://github.com/ohcnetwork/care_hcx.git",
    version="@main",
    configs={},
)

plugs = [hcx_plugin]
manager = PlugManager(plugs)
```
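
As a rough sketch, the two files can be committed from any machine with CodeCommit git credentials configured; the clone URL matches the one used in the build spec below, while the local source paths and branch name are assumptions.

```bash
# Rough sketch; assumes CodeCommit git credentials are already configured locally.
git clone https://git-codecommit.ap-south-1.amazonaws.com/v1/repos/infra-name
cd infra-name
mkdir -p build
cp /path/to/react.env build/react.env            # placeholder local paths
cp /path/to/plug_config.py build/plug_config.py
git add build/
git commit -m "Add build configuration"
git push origin main                             # branch name may differ
```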

## Setting Up the CodeBuild Project

1. Navigate to the **CodeBuild** service in the **AWS Console**.
2. Create a new build project named `deploy-care`.
3. Add the following build steps:
```yaml
version: 0.2

env:
  variables:
    AWS_DEFAULT_REGION: ap-south-1
    ACCOUNT_ID: ${ACCOUNT_ID}
    INSTANCE_ID: ${INSTANCE_ID}
    BE_TAG: ${BE_TAG}
    FE_TAG: ${FE_TAG}
    METABASE_TAG: ${METABASE_TAG}
  git-credential-helper: yes

phases:
  install:
    commands:
      - echo "Installing necessary dependencies"
      - yum install -y unzip
      - echo "Environment Variables:"
      - echo "AWS_DEFAULT_REGION=${AWS_DEFAULT_REGION}"
      - echo "INSTANCE_ID=${INSTANCE_ID}"
      - echo "BE_TAG=${BE_TAG}"
      - echo "FE_TAG=${FE_TAG}"
      - echo "ACCOUNT_ID=${ACCOUNT_ID}"
      - echo "METABASE_TAG=${METABASE_TAG}"
  pre_build:
    commands:
      - LOGIN_PASSWORD=$(aws ecr get-login-password --region $AWS_DEFAULT_REGION)
      - echo $LOGIN_PASSWORD | docker login --username AWS --password-stdin ${ACCOUNT_ID}.dkr.ecr.${AWS_DEFAULT_REGION}.amazonaws.com
      - git clone https://git-codecommit.ap-south-1.amazonaws.com/v1/repos/infra-name infra/
  build:
    commands:
      - echo "Building and pushing Docker images"
      # Build and push the backend image
      - |
        if [[ -n "${BE_TAG}" ]]; then
          curl -L https://github.com/ohcnetwork/care/archive/${BE_TAG}.zip -o care.zip
          unzip care.zip
          mv care-${BE_TAG} care
          cp infra/build/plug_config.py care/plug_config.py
          cp -r infra/build/. care
          DOCKER_BUILDKIT=1 docker build -f ./care/docker/prod.Dockerfile \
            -t ${ACCOUNT_ID}.dkr.ecr.${AWS_DEFAULT_REGION}.amazonaws.com/care:${BE_TAG} \
            -t ${ACCOUNT_ID}.dkr.ecr.${AWS_DEFAULT_REGION}.amazonaws.com/care:latest ./care
          docker push ${ACCOUNT_ID}.dkr.ecr.${AWS_DEFAULT_REGION}.amazonaws.com/care:${BE_TAG}
          docker push ${ACCOUNT_ID}.dkr.ecr.${AWS_DEFAULT_REGION}.amazonaws.com/care:latest
        fi
      # Build and push the frontend image
      - |
        if [[ -n "${FE_TAG}" ]]; then
          curl -L https://github.com/ohcnetwork/care_fe/archive/${FE_TAG}.zip -o care_fe.zip
          unzip care_fe.zip
          mv care_fe-${FE_TAG} care_fe
          cp infra/build/react.env care_fe/.env.local
          DOCKER_BUILDKIT=1 docker build -f ./care_fe/Dockerfile \
            -t ${ACCOUNT_ID}.dkr.ecr.${AWS_DEFAULT_REGION}.amazonaws.com/care_fe:${FE_TAG} \
            -t ${ACCOUNT_ID}.dkr.ecr.${AWS_DEFAULT_REGION}.amazonaws.com/care_fe:latest ./care_fe
          docker push ${ACCOUNT_ID}.dkr.ecr.${AWS_DEFAULT_REGION}.amazonaws.com/care_fe:${FE_TAG}
          docker push ${ACCOUNT_ID}.dkr.ecr.${AWS_DEFAULT_REGION}.amazonaws.com/care_fe:latest
        fi
  post_build:
    commands:
      - echo "Writing environment variables to JSON file..."
      - |
        aws ssm send-command \
          --document-name "AWS-RunShellScript" \
          --targets "Key=instanceids,Values=${INSTANCE_ID}" \
          --parameters "{\"commands\":[
            \"echo '{\\\"ACCOUNT_ID\\\": \\\"${ACCOUNT_ID}\\\", \\\"AWS_DEFAULT_REGION\\\": \\\"${AWS_DEFAULT_REGION}\\\", \\\"LOGIN_PASSWORD\\\": \\\"${LOGIN_PASSWORD}\\\"}' > /tmp/env_vars.json\"
          ]}"
      - echo "Environment variables written to /tmp/env_vars.json"
      - aws ssm send-command --document-name "trigger-docker" --targets "Key=instanceids,Values=${INSTANCE_ID}"
```
4. Select **Amazon Linux** as the operating system.
5. Enable **CloudWatch** logs.
6. Use the default service role and allow AWS to add the necessary permissions to it.
7. Set the environment variables `ACCOUNT_ID` and `INSTANCE_ID` in the build project settings.
8. On the EC2 instance, clone the `care-docker` repository using **git**.
9. Populate the `.env` file with the required values.
10. Create the file `trigger-docker.sh` in `/home/ubuntu` (the path referenced by the SSM document) with the following content:
```bash
#!/bin/bash

# Define variables
ENV_FILE="/tmp/env_vars.json"
DOCKER_COMPOSE_DIR="/home/ubuntu/care-docker"
LOG_DIR="/var/log/docker-operations"
LOG_FILE="$LOG_DIR/docker-operations.log"
PULL_LOG_FILE="$LOG_DIR/docker-compose-pull.log"
UP_LOG_FILE="$LOG_DIR/docker-compose-up.log"
DOWN_LOG_FILE="$LOG_DIR/docker-compose-down.log"
MAX_LOG_SIZE=10M
BACKUP_COUNT=5

# Ensure log directory exists
mkdir -p "$LOG_DIR"

# Function for logging
log() {
    echo "$(date '+%Y-%m-%d %H:%M:%S') - $1" >> "$LOG_FILE"
}

# Function to rotate logs
rotate_log() {
    if [ -f "$1" ] && [ $(du -b "$1" | cut -f1) -ge $(numfmt --from=iec $MAX_LOG_SIZE) ]; then
        for i in $(seq $((BACKUP_COUNT-1)) -1 1); do
            [ -f "${1}.$i" ] && mv "${1}.$i" "${1}.$((i+1))"
        done
        mv "$1" "${1}.1"
        touch "$1"
    fi
}

# Rotate logs before starting
rotate_log "$LOG_FILE"
rotate_log "$PULL_LOG_FILE"
rotate_log "$UP_LOG_FILE"

# Read environment variables
if [ ! -f "$ENV_FILE" ]; then
    log "Error: Environment file $ENV_FILE not found"
    exit 1
fi

eval $(cat "$ENV_FILE" | jq -r '@sh "ACCOUNT_ID=\(.ACCOUNT_ID) AWS_DEFAULT_REGION=\(.AWS_DEFAULT_REGION) LOGIN_PASSWORD=\(.LOGIN_PASSWORD)"')

if [ -z "$ACCOUNT_ID" ] || [ -z "$AWS_DEFAULT_REGION" ] || [ -z "$LOGIN_PASSWORD" ]; then
    log "Error: Required environment variables are not set"
    exit 1
fi

# Perform Docker login
log "Attempting Docker login"
if echo "$LOGIN_PASSWORD" | docker login --username AWS --password-stdin "$ACCOUNT_ID.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com" > /dev/null 2>&1; then
    log "Docker login successful"
else
    log "Error: Docker login failed"
    exit 1
fi

# Change to the Docker Compose directory
if ! cd "$DOCKER_COMPOSE_DIR"; then
    log "Error: Failed to change to directory $DOCKER_COMPOSE_DIR"
    exit 1
fi

# Perform Docker pull
log "Starting Docker pull"
if docker compose pull >> "$PULL_LOG_FILE" 2>&1; then
    log "Docker pull successful"
else
    log "Error: Docker pull failed. Check the log file at $PULL_LOG_FILE"
    exit 1
fi

# Perform Docker Compose down
log "Stopping existing containers"
if docker compose down >> "$DOWN_LOG_FILE" 2>&1; then
    log "Docker Compose down successful"
else
    log "Error: Docker Compose down failed. Check the log file at $DOWN_LOG_FILE"
    exit 1
fi

# Perform Docker Compose up
log "Starting Docker Compose up"
if docker compose up -d >> "$UP_LOG_FILE" 2>&1; then
    log "Docker Compose up successful"
else
    log "Error: Docker Compose up failed. Check the log file at $UP_LOG_FILE"
    exit 1
fi

log "Script completed successfully"
```
11. Make the script executable by running `chmod +x trigger-docker.sh`.
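
With the script in place, the wiring can be verified end to end by invoking the SSM document manually and tailing the script's log. The instance ID is a placeholder; the document name and log path come from the earlier steps.

```bash
# Rough sketch; run from a machine with SSM permissions, then inspect the log on the instance.
aws ssm send-command \
  --document-name "trigger-docker" \
  --targets "Key=instanceids,Values=<instance-id>" \
  --region ap-south-1
# On the EC2 instance:
sudo tail -f /var/log/docker-operations/docker-operations.log
```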

## Setting Up the IAM Roles and Policies

### ssm-iam-ec2
- Create an IAM role named `ssm-iam-ec2` for the EC2 service and attach the policies `AmazonSSMFullAccess` and `AmazonSSMManagedInstanceCore` to it.
- Attach the role to the EC2 instance via the **IAM Role** field (a CLI sketch for this role is shown at the end of this section).

### codebuild-iam-role
- Add the policies `AmazonSSMFullAccess`, `AmazonSSMManagedInstanceCore`, `AWSCodeCommitReadOnly`, and `AmazonEC2ContainerRegistryFullAccess` to the role created during the CodeBuild setup, in addition to the default permissions set automatically.
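
A rough CLI equivalent for the instance role is sketched below; the trust policy is the standard EC2 one, and the names can be adjusted as needed.

```bash
# Rough sketch: create the instance role, attach the managed policies, and expose it as an instance profile.
aws iam create-role --role-name ssm-iam-ec2 \
  --assume-role-policy-document '{"Version":"2012-10-17","Statement":[{"Effect":"Allow","Principal":{"Service":"ec2.amazonaws.com"},"Action":"sts:AssumeRole"}]}'
aws iam attach-role-policy --role-name ssm-iam-ec2 \
  --policy-arn arn:aws:iam::aws:policy/AmazonSSMFullAccess
aws iam attach-role-policy --role-name ssm-iam-ec2 \
  --policy-arn arn:aws:iam::aws:policy/AmazonSSMManagedInstanceCore
aws iam create-instance-profile --instance-profile-name ssm-iam-ec2
aws iam add-role-to-instance-profile --instance-profile-name ssm-iam-ec2 --role-name ssm-iam-ec2
```
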
# Deploy

## Getting Ready For First Deployment

1. SSH into the EC2 instance either via the AWS Console or your terminal.
2. Clone the `care-docker` repository from GitHub.
3. Navigate to the `care-docker` directory.
4. Ensure that Docker is installed on the instance (a quick install sketch follows this list).
5. We are now ready to trigger the CodeBuild pipeline we set up in the previous steps.
6. Navigate to the **CodeBuild** service in the **AWS Console**.
7. Select the `deploy-care` project.
8. Click on the `Start Build With Overrides` button.
9. Refer to the [Care Commit History](https://github.com/ohcnetwork/care/commits/production) and [Care FE Commit History](https://github.com/ohcnetwork/care_fe/commits/production) to fetch the commit hashes and set them as the environment variables `BE_TAG` and `FE_TAG` respectively.
10. Click on the `Start Build` button.
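
If Docker is not already present on the instance (step 4), one quick way to install it on Ubuntu is Docker's convenience script; this is a sketch and the official Docker installation docs remain the reference.

```bash
# Rough sketch: install Docker Engine and the Compose plugin on the Ubuntu instance.
curl -fsSL https://get.docker.com -o get-docker.sh
sudo sh get-docker.sh
sudo usermod -aG docker ubuntu   # allow the ubuntu user to run docker without sudo (re-login required)
docker compose version           # verify the Compose plugin is available
```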

## Setting Up Triggers

The build/deploy pipeline can be triggered directly from the console as described above; however, it can also be triggered using the AWS CLI:

```bash
aws codebuild start-build --project-name deploy-care --environment-variables-override name=BE_TAG,value=<tag> name=FE_TAG,value=<tag>
```
This can be further automated by setting up a GitHub Action to trigger multiple CodeBuild projects.

**Prerequisites:**
- An AWS Access Key ID and Secret Access Key for each project.
- Add them as GitHub Secrets named `AWS_ACCESS_KEY_ID_<Project>` and `AWS_SECRET_ACCESS_KEY_<Project>`, matching the `format()` lookups in the workflow below.

```yaml
name: Trigger AWS CodeBuild for Multiple Projects

on:
  workflow_dispatch:
    inputs:
      BE_TAG:
        description: 'Backend Tag'
        required: true
        type: string
      FE_TAG:
        description: 'Frontend Tag'
        required: true
        type: string

jobs:
  trigger-codebuild-projects:
    strategy:
      matrix:
        project: [Example1, Example2, Example3, Example4]
    runs-on: ubuntu-latest
    steps:
      - name: Configure AWS credentials for ${{ matrix.project }}
        uses: aws-actions/configure-aws-credentials@v1
        with:
          aws-access-key-id: ${{ secrets[format('AWS_ACCESS_KEY_ID_{0}', matrix.project)] }}
          aws-secret-access-key: ${{ secrets[format('AWS_SECRET_ACCESS_KEY_{0}', matrix.project)] }}
          aws-region: ap-south-1

      - name: Trigger AWS CodeBuild for ${{ matrix.project }}
        run: |
          aws codebuild start-build \
            --project-name deploy-care \
            --environment-variables-override \
              name=BE_TAG,value=${{ github.event.inputs.BE_TAG }},type=PLAINTEXT \
              name=FE_TAG,value=${{ github.event.inputs.FE_TAG }},type=PLAINTEXT
```

## Applying Release Updates

The release updates can be applied by following the steps below:
1. Fetch the latest commit hash from the [Care Commit History](https://github.com/ohcnetwork/care/commits/production) and [Care FE Commit History](https://github.com/ohcnetwork/care_fe/commits/production).
2. Navigate to the deploy repository where the above mentioned GitHub Action is setup.
3. Click on the `Actions` tab.
4. Select the `Trigger AWS CodeBuild for Multiple Projects` workflow.
5. Click on the `Run Workflow` button.
6. Enter the latest commit hash for the backend and frontend.
7. Click on the `Run Workflow` button.

The above steps will trigger the build pipeline for the backend and frontend across all the AWS accounts set up in the GitHub Secrets.
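
If you prefer the terminal, the same workflow can also be dispatched with the GitHub CLI; this is a sketch, assuming the workflow name above and that `gh` is authenticated against the deploy repository.

```bash
# Rough sketch; run inside a clone of the deploy repository (or pass --repo).
gh workflow run "Trigger AWS CodeBuild for Multiple Projects" \
  -f BE_TAG=<backend-commit-hash> \
  -f FE_TAG=<frontend-commit-hash>
```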