AWS


AI-Powered Document Enhancement System

January 30, 2026

Refyne Pro

Live Application: refynepro.intelliumx.com

Refyne Pro Interface

Overview

Refyne Pro is an advanced AI-powered document enhancement platform that transforms how professionals create and refine content. Built on AWS serverless architecture with Amazon Bedrock integration, it combines sophisticated AI capabilities with intuitive document management, offering unprecedented control over AI-generated content while maintaining enterprise-grade security and cost efficiency.

Background

The journey began with “Refyne,” a simple note-taking application designed to explore AI operations. It featured two text areas where users could input text and receive AI-enhanced versions through Amazon Bedrock. While functional, this foundational project revealed the potential for something far more powerful.

AWS Credentials for CLI (Profile)

December 23, 2025

🚀 Quick Start Guide

  • Create a named profile using aws configure --profile [name]
  • Never set a default profile permanently
  • This avoids accidental operations on the wrong AWS account
  • Activate profiles temporarily per session using $env:AWS_PROFILE
  • Or activate per command using --profile
  • Always clear the active profile when done to prevent unintended AWS operations

✅ 1. Create the profile (once only)

In PowerShell:

aws configure --profile nob

This creates:

  • ~\.aws\credentials
  • ~\.aws\config
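With a profile name of `nob`, the two files would contain entries like the following (the key values and region here are placeholders):

```ini
# ~\.aws\credentials
[nob]
aws_access_key_id = YOUR_ACCESS_KEY
aws_secret_access_key = YOUR_SECRET_KEY

# ~\.aws\config
[profile nob]
region = ap-southeast-2
output = json
```

Note that the credentials file uses `[nob]` while the config file uses `[profile nob]` — that difference is part of the AWS CLI's file format.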

✅ 2. Use the profile temporarily in PowerShell

Option A — Set environment variable only for the current session
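A sketch of what Option A looks like, assuming the profile `nob` created in step 1:

```powershell
# Activate the profile for this PowerShell session only
$env:AWS_PROFILE = "nob"

# Verify which identity is active
aws sts get-caller-identity

# Clear the profile when done, per the quick start guide
Remove-Item Env:AWS_PROFILE
```

Option B is the per-command form mentioned in the quick start, e.g. `aws s3 ls --profile nob`, which never touches the session environment.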

API Gateway - Configuration

June 15, 2025

1. Create REST API

  1. Go to the API Gateway console
  2. Create a new REST API
  3. Create a new resource and method
    • Add a resource: e.g., “/user-list”
    • Add a GET method
    • Integration type: Lambda Function
    • Select your Lambda function
  4. Enable CORS if needed
    • Actions → Enable CORS
    • Accept default settings for testing

2. Update “Method request”

3. Update “Integration request”

{
    "limit": "$input.params('limit')"
}
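With this mapping template in place, a request such as `GET /user-list?limit=5` is transformed into a Lambda event like:

```json
{
    "limit": "5"
}
```

Query-string values arrive as strings, which is why the Lambda code below calls int() on them.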

4. Deploy and Test

  1. Deploy the API
  2. Note the API endpoint URL

API Gateway - Usage Plan

June 15, 2025

1. Create new usage plan

Rate and Burst

  • Rate: Set to 10-20 requests per second for development/testing
    • Recommended: Start with 10 req/sec for controlled testing
  • Burst: Set to 2x your rate (20-40)
    • Recommended: Start with 20 to handle short traffic spikes

Quota Settings

  • Quota period: MONTH (most common)
    • Alternative periods: WEEK, DAY
  • Requests per quota period: Start with 50,000/month
    • This allows approximately 1,600 requests per day
    • Can be adjusted based on actual usage patterns

Recommended Initial Configuration:
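The same starting values can be applied from the CLI (the plan name `dev-plan` is an assumption; the returned usage-plan ID is then attached to a deployed API stage):

```shell
# Recommended starting values from above: 10 req/sec, burst 20, 50,000 requests/month
aws apigateway create-usage-plan \
    --name dev-plan \
    --throttle burstLimit=20,rateLimit=10 \
    --quota limit=50000,period=MONTH
```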

AWS STS - Temporary Access Tokens

June 15, 2025

1. Generate Temporary Credentials

First, use the AWS STS (Security Token Service) to generate temporary credentials:

# 3600 x 5 = 18000 (5 hours)
aws sts get-session-token --duration-seconds 18000

This will return something like:

{
    "Credentials": {
        "AccessKeyId": "ASIA...",
        "SecretAccessKey": "...",
        "SessionToken": "...",
        "Expiration": "2025-06-13T..."
    }
}
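Rather than copying the three values by hand into the export commands below, a small helper (a convenience sketch, not part of the original workflow) can turn the STS JSON response into export statements:

```python
import json

def to_exports(sts_json: str) -> str:
    """Convert `aws sts get-session-token` output into shell export lines."""
    creds = json.loads(sts_json)["Credentials"]
    return "\n".join([
        f'export AWS_ACCESS_KEY_ID="{creds["AccessKeyId"]}"',
        f'export AWS_SECRET_ACCESS_KEY="{creds["SecretAccessKey"]}"',
        f'export AWS_SESSION_TOKEN="{creds["SessionToken"]}"',
    ])

# Example with placeholder values:
sample = ('{"Credentials": {"AccessKeyId": "ASIA123", "SecretAccessKey": "abc", '
          '"SessionToken": "tok", "Expiration": "2025-06-13T00:00:00Z"}}')
print(to_exports(sample))
```

Piping real STS output through this produces lines you can paste straight into the shell.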

2. Set Environment Variables

Then set these environment variables:

# Replace the values with your actual credentials from the previous step.
export AWS_ACCESS_KEY_ID="your_access_key"
export AWS_SECRET_ACCESS_KEY="your_secret_key"
export AWS_SESSION_TOKEN="your_session_token"
export AWS_DEFAULT_REGION="ap-southeast-2"  # Sydney region

3. Verify the environment variables

env | grep AWS

After setting these variables, try running your Python script again. The credentials will be automatically picked up by the AWS SDK.

AWS Lambda - Create a Function

June 15, 2025

  1. Navigate to Lambda in AWS Console
  2. Click “Create function”
    • Choose “Author from scratch”
    • Runtime: Python 3.x
    • Name: e.g., “get-user-list”

Paste the Python code into the “Code” editor and click the “Deploy” button:

import boto3
from datetime import datetime
from boto3.dynamodb.conditions import Key

dynamodb = boto3.resource('dynamodb')
table = dynamodb.Table('user_list')

def create_nested_structure(data, current_level, max_level):
    if current_level >= max_level:
        return data
    
    return {
        f"level_{current_level}": {
            "data": data,
            "nested": create_nested_structure(data, current_level + 1, max_level),
            "metadata": {
                "level_info": f"This is level {current_level}",
                "timestamp": datetime.now().isoformat(),
                "metrics": {
                    "depth": current_level,
                    "remaining_levels": max_level - current_level,
                    "complexity_score": max_level * current_level
                }
            }
        }
    }

def create_complex_response(user_data, nested_level):
    base_data = {
        "id": f"user_{user_data['user_id']}",
        "timestamp": datetime.now().isoformat(),
        "category": "Personnel",
        "details": {
            "name": {
                "first": user_data['first_name'],
                "last": user_data['last_name']
            },
            "company": {
                "name": user_data['company_name'],
                "web": user_data['web']
            },
            "contact_info": {
                "address": {
                    "street": user_data['address'],
                    "city": user_data['city'],
                    "state": user_data['state'],
                    "postcode": user_data['post']
                },
                "communication": {
                    "phones": [
                        {
                            "type": "primary",
                            "number": user_data['phone1']
                        },
                        {
                            "type": "secondary",
                            "number": user_data['phone2']
                        }
                    ],
                    "email": user_data['email']
                }
            }
        }
    }
    
    return create_nested_structure(base_data, 1, nested_level)

def lambda_handler(event, context):
    try:
        # Get parameters from event body
        limit = int(event.get('limit') or 10)
        nested_level = int(event.get('nested_level') or 1)
        
        # Validate nested_level
        if nested_level < 1:
            nested_level = 1
        elif nested_level > 30:  # Cap at a reasonable maximum
            nested_level = 30    # 29 nested levels is the limit in Blue Prism
            
        # Scan DynamoDB table with limit
        response = table.scan(
            Limit=limit
        )
        items = response.get('Items', [])
        
        # Transform items into complex nested structure
        transformed_data = [create_complex_response(item, nested_level) for item in items]
        
        # Create final response
        return {
            "statusCode": 200,
            "headers": {
                "Content-Type": "application/json",
                "Access-Control-Allow-Origin": "*"
            },
            "success": True,
            "timestamp": datetime.now().isoformat(),
            "total_records": len(transformed_data),
            "limit_applied": limit,
            "nesting_level": nested_level,
            "data": transformed_data,
            "metadata": {
                "api_version": "1.0",
                "service": "user-data-api",
                "complexity_info": {
                    "max_depth": nested_level,
                    "structure_type": "recursive",
                    "total_nodes": len(transformed_data) * nested_level
                }
            }
        }
        
    except Exception as e:
        return {
            "statusCode": 500,
            "success": False,
            "message": "Error processing request",
            "error": str(e)
        }
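The recursion in create_nested_structure is easiest to see in isolation. Reproduced standalone (no AWS needed), a three-level call wraps the payload twice before bottoming out:

```python
from datetime import datetime

def create_nested_structure(data, current_level, max_level):
    # Same recursion as in the Lambda above, reproduced standalone
    if current_level >= max_level:
        return data
    return {
        f"level_{current_level}": {
            "data": data,
            "nested": create_nested_structure(data, current_level + 1, max_level),
            "metadata": {
                "level_info": f"This is level {current_level}",
                "timestamp": datetime.now().isoformat(),
                "metrics": {
                    "depth": current_level,
                    "remaining_levels": max_level - current_level,
                    "complexity_score": max_level * current_level,
                },
            },
        }
    }

result = create_nested_structure({"id": "user_U001"}, 1, 3)
# level_1 wraps level_2, whose "nested" key holds the raw payload
print(list(result.keys()))                               # ['level_1']
print(list(result["level_1"]["nested"].keys()))          # ['level_2']
print(result["level_1"]["nested"]["level_2"]["nested"])  # {'id': 'user_U001'}
```

This shape is what drives the `total_nodes` arithmetic in the handler's metadata.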

AWS Credentials for CLI

June 15, 2025

1. Using AWS CLI Configuration

aws configure

This will prompt you to enter:

  • AWS Access Key ID
  • AWS Secret Access Key
  • Default region name
  • Default output format

2. Environment Variables

export AWS_ACCESS_KEY_ID="your_access_key"
export AWS_SECRET_ACCESS_KEY="your_secret_key"
export AWS_DEFAULT_REGION="your_region"

3. Credentials File

Create or edit ~/.aws/credentials:

[default]
aws_access_key_id = your_access_key
aws_secret_access_key = your_secret_key

4. Clear AWS CLI Configuration (OPTIONAL)

To clear your AWS CLI credentials, you have several options:

  • Delete the credentials file: rm ~/.aws/credentials
  • Delete the config file: rm ~/.aws/config
  • Remove a specific profile: delete that profile’s section from ~/.aws/credentials and ~/.aws/config (note that re-running aws configure --profile your_profile_name and pressing Enter without typing values keeps the existing values rather than clearing them)

# Remove both credentials and config files
rm ~/.aws/credentials ~/.aws/config

After clearing the credentials, you can reconfigure them using any of the methods described above.

AWS Lambda - Grant Access

June 15, 2025

  1. Go to AWS IAM Console

  2. Find your Lambda’s role

    • Click on the role name
    • Click “Add permissions” → “Create inline policy”
  3. In the JSON editor, paste this policy:

    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": [
                    "dynamodb:Scan",
                    "dynamodb:GetItem",
                    "dynamodb:Query"
                ],
                "Resource": "arn:aws:dynamodb:ap-southeast-2:6850********:table/user_list"
            }
        ]
    }
    
  4. Click “Review policy”

    • Name it something like “DynamoDBScanPolicy”
    • Click “Create policy”

After adding this policy, wait a few seconds and try your Lambda function again. The error should be resolved.

Amazon DynamoDB - Import CSV Data

June 15, 2025

1. Save the CSV file in the same location as the Python code

user_id,first_name,last_name,company_name,address,city,state,post,phone1,phone2,email,web
U001,Rebbecca,Didio,"Brandt, Jonathan F Esq",171 E 24th St,Leith,TAS,7315,03-8174-9123,0458-665-290,[email protected],http://www.brandtjonathanfesq.com.au
U002,Stevie,Hallo,Landrum Temporary Services,22222 Acoma St,Proston,QLD,4613,07-9997-3366,0497-622-620,[email protected],http://www.landrumtemporaryservices.com.au
U003,Mariko,Stayer,"Inabinet, Macre Esq",534 Schoenborn St #51,Hamel,WA,6215,08-5558-9019,0427-885-282,[email protected],http://www.inabinetmacreesq.com.au
U004,Gerardo,Woodka,Morris Downing & Sherred,69206 Jackson Ave,Talmalmo,NSW,2640,02-6044-4682,0443-795-912,[email protected],http://www.morrisdowningsherred.com.au
U005,Mayra,Bena,"Buelt, David L Esq",808 Glen Cove Ave,Lane Cove,NSW,1595,02-1455-6085,0453-666-885,[email protected],http://www.bueltdavidlesq.com.au
U006,Idella,Scotland,Artesian Ice & Cold Storage Co,373 Lafayette St,Cartmeticup,WA,6316,08-7868-1355,0451-966-921,[email protected],http://www.artesianicecoldstorageco.com.au
U007,Sherill,Klar,Midway Hotel,87 Sylvan Ave,Nyamup,WA,6258,08-6522-8931,0427-991-688,[email protected],http://www.midwayhotel.com.au
U008,Ena,Desjardiws,"Selsor, Robert J Esq",60562 Ky Rt 321,Bendick Murrell,NSW,2803,02-5226-9402,0415-961-606,[email protected],http://www.selsorrobertjesq.com.au
U009,Vince,Siena,Vincent J Petti & Co,70 S 18th Pl,Purrawunda,QLD,4356,07-3184-9989,0411-732-965,[email protected],http://www.vincentjpettico.com.au
U010,Theron,Jarding,"Prentiss, Paul F Esq",8839 Ventura Blvd,Blanchetown,SA,5357,08-6890-4661,0461-862-457,[email protected],http://www.prentisspaulfesq.com.au

2. Set a temporary token for VS Code

Reference: AWS STS - Temporary Access Tokens
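The import script itself is not reproduced here; a minimal sketch (assuming the `user_list` table from the Lambda post, with `user_id` as the partition key — the file and table names are assumptions) could look like:

```python
import csv

def parse_users(fileobj):
    """Parse the CSV into DynamoDB-ready items. Quoted fields such as
    company names containing commas are handled by the csv module."""
    return list(csv.DictReader(fileobj))

def import_users(path="user_list.csv", table_name="user_list"):
    # boto3 is imported locally so parsing can be tested without AWS
    import boto3
    table = boto3.resource("dynamodb").Table(table_name)
    # batch_writer() buffers put_item calls into efficient batch writes
    with open(path, newline="") as f, table.batch_writer() as batch:
        for item in parse_users(f):
            batch.put_item(Item=item)

# Usage (requires the temporary credentials from step 2):
#   import_users("user_list.csv", "user_list")
```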

Create an MS SQL Server Container

April 4, 2025

# This is the current folder structure
sh-5.2$ tree
.
├── Dockerfile
├── backups
│   ├── APP-6.3.2-lab_Stage_2.bak
│   ├── APP-6.3.2-lab_Stage_3.bak
│   ├── APP-6.3.2-lab_Stage_4.bak
│   ├── v9.1.23_APP_632_lab_Stage_3.bak
│   └── v9.1.23_APP_632_lab_Stage_4.bak
├── certs
│   ├── server-bundle.crt
│   └── server.key
├── containers
│   └── sql1
│       ├── data [error opening dir]
│       ├── log [error opening dir]
│       └── secrets [error opening dir]
└── mssql.conf
  1. Create the Dockerfile
FROM mcr.microsoft.com/mssql/server:2022-latest

USER root

# Install required dependencies
RUN apt-get update && \
    apt-get install -y curl apt-transport-https gnupg2 && \
    mkdir -p /etc/apt/keyrings && \
    curl -sSL https://packages.microsoft.com/keys/microsoft.asc | gpg --dearmor > /etc/apt/keyrings/microsoft.gpg && \
    chmod 644 /etc/apt/keyrings/microsoft.gpg && \
    echo "deb [signed-by=/etc/apt/keyrings/microsoft.gpg] https://packages.microsoft.com/ubuntu/22.04/prod jammy main" > /etc/apt/sources.list.d/mssql-release.list && \
    apt-get update && \
    ACCEPT_EULA=Y apt-get install -y mssql-tools unixodbc-dev && \
    ln -s /opt/mssql-tools/bin/sqlcmd /usr/bin/sqlcmd && \
    ln -s /opt/mssql-tools/bin/bcp /usr/bin/bcp && \
    apt-get clean && \
    rm -rf /var/lib/apt/lists/*

# Switch back to default user
USER mssql
  2. Create the mssql.conf file
[network]
tlscert = /var/opt/mssql/secrets/server-bundle.crt
tlskey = /var/opt/mssql/secrets/server.key
tlsprotocols = 1.2
forceencryption = 1
  3. Build an image
# Build new image
sudo docker build -t mssql-with-tools .
  4. Test locally
# Run new container
sudo docker run -e 'ACCEPT_EULA=Y' -e 'MSSQL_SA_PASSWORD=Password123' \
-p 1433:1433 \
-v /data/containers/sql1/data:/var/opt/mssql/data \
-v /data/containers/sql1/log:/var/opt/mssql/log \
-v sql-certs:/var/opt/mssql/secrets:ro \
-v /data/mssql.conf:/var/opt/mssql/mssql.conf:ro \
-v /data/backups:/var/opt/mssql/backups \
--restart always \
--name sql1 \
-d mssql-with-tools
  5. Build a custom container image and push it into ECR in AWS.
# The container URI is below
ACCOUNTID.dkr.ecr.ap-southeast-2.amazonaws.com/gcs-sql-server:latest
  6. Then run the script to deploy an MS SQL container
#=============================================================================
# The following approach successfully copies "server.key"
#=============================================================================
# Create a Docker volume for the certificates
sudo docker volume create sql-certs

# Copy the necessary certificate files into the volume
sudo cp /data/certs/server-bundle.crt /var/lib/docker/volumes/sql-certs/_data/
sudo cp /data/certs/server.key /var/lib/docker/volumes/sql-certs/_data

# Change the ownership so the mssql user (UID 10001) can read the certs
sudo chown -R 10001:0 /var/lib/docker/volumes/sql-certs/_data/
# chmod the files only; a recursive 600 on the directory would strip the execute bit it needs
sudo chmod 600 /var/lib/docker/volumes/sql-certs/_data/*

# Retrieve an authentication token and authenticate your Docker client to your registry. Use the AWS CLI:
aws ecr get-login-password --region ap-southeast-2 | sudo docker login --username AWS --password-stdin ACCOUNTID.dkr.ecr.ap-southeast-2.amazonaws.com

# Deploy MS SQL Server container
sudo docker run -e 'ACCEPT_EULA=Y' -e 'MSSQL_SA_PASSWORD=Password123' \
-p 1433:1433 \
-v /data/containers/sql1/data:/var/opt/mssql/data \
-v /data/containers/sql1/log:/var/opt/mssql/log \
-v sql-certs:/var/opt/mssql/secrets:ro \
-v /data/mssql.conf:/var/opt/mssql/mssql.conf:ro \
-v /data/backups:/var/opt/mssql/backups \
--restart always \
--name sql1 \
-d ACCOUNTID.dkr.ecr.ap-southeast-2.amazonaws.com/gcs-sql-server:latest
  7. After the deployment, check the status of the container
# Check the login
sudo docker exec -it sql1 /opt/mssql-tools/bin/sqlcmd -S localhost -U sa -P 'Password123'

# Check the files
sudo docker exec -it sql1 ls -l /var/opt/mssql/backups

Backup and Restore a Database with sqlcmd

April 3, 2025

1. Taking Full Backups with sqlcmd

# Run the commands when you reach an important point in the database configuration
sudo docker exec -it sql1 /opt/mssql-tools/bin/sqlcmd -S localhost -U SA -P 'Password123' -Q "BACKUP DATABASE [v7.3.1_HUB_511_lab] TO DISK = '/var/opt/mssql/backups/v7.3.1_HUB_511_lab_Stage_3.bak' WITH FORMAT, INIT, NAME = 'Stage3';"
sudo docker exec -it sql1 /opt/mssql-tools/bin/sqlcmd -S localhost -U SA -P 'Password123' -Q "BACKUP DATABASE [HUB-5.1.1-lab] TO DISK = '/var/opt/mssql/backups/HUB-5.1.1-lab_Stage_3.bak' WITH FORMAT, INIT, NAME = 'Stage3';"

# Check the result
sudo docker exec -it sql1 ls -l /var/opt/mssql/backups/

2. Restoring a Specific Backup

# Restore databases
sudo docker exec -it sql1 /opt/mssql-tools/bin/sqlcmd -S localhost -U SA -P 'Password123' -Q "RESTORE DATABASE [v7.3.1_HUB_511_lab] FROM DISK = '/var/opt/mssql/backups/v7.3.1_HUB_511_lab_Stage_3.bak' WITH REPLACE, RECOVERY;"
sudo docker exec -it sql1 /opt/mssql-tools/bin/sqlcmd -S localhost -U SA -P 'Password123' -Q "RESTORE DATABASE [HUB-5.1.1-lab] FROM DISK = '/var/opt/mssql/backups/HUB-5.1.1-lab_Stage_3.bak' WITH REPLACE, RECOVERY;"

3. Restoring a Specific Backup via SSM

# Restore database via SSM
aws ssm send-command \
    --instance-ids "i-0e0df3af14a11b3d1" \
    --document-name "AWS-RunShellScript" \
    --parameters 'commands=[
        "sudo docker exec sql1 /opt/mssql-tools/bin/sqlcmd -S localhost -U SA -P '\''Password123'\'' -Q \"RESTORE DATABASE [v7.3.1_HUB_511_lab] FROM DISK = '\''/var/opt/mssql/backups/v7.3.1_HUB_511_lab_Stage_3.bak'\'' WITH REPLACE, RECOVERY;\""
    ]' \
    --region "ap-southeast-2"
# Check the Log in case of failure
aws ssm list-command-invocations --command-id abab87ca-7abb-4746-8666-fa6ebbe67b51 --details

Copy Files from a Docker to S3

April 2, 2025

Backup files from Docker Container

  1. Log in to the machine running the Docker container
  2. Copy the backup files from the Docker container to the current directory
sudo docker cp sql1:/var/opt/mssql/backups/HUB-5.1.1-lab_Stage_2.bak ./HUB-5.1.1-lab_Stage_2.bak
sudo docker cp sql1:/var/opt/mssql/backups/HUB-5.1.1-lab_Stage_3.bak ./HUB-5.1.1-lab_Stage_3.bak
sudo docker cp sql1:/var/opt/mssql/backups/HUB-5.1.1-lab_Stage_4.bak ./HUB-5.1.1-lab_Stage_4.bak
sudo docker cp sql1:/var/opt/mssql/backups/v7.3.1_HUB_511_lab_Stage_3.bak ./v7.3.1_HUB_511_lab_Stage_3.bak
sudo docker cp sql1:/var/opt/mssql/backups/v7.3.1_HUB_511_lab_Stage_4.bak ./v7.3.1_HUB_511_lab_Stage_4.bak
  3. Upload them to the S3 bucket
# Change the ownership of the files:
sudo chown ssm-user:ssm-user *.bak

# Create a timestamp variable
TIMESTAMP=$(date +%Y%m%d-%H%M%S)

# Upload both files to the timestamped folder
aws s3 cp HUB-5.1.1-lab_Stage_2.bak s3://gcs-share/db-backup/$TIMESTAMP/
aws s3 cp HUB-5.1.1-lab_Stage_3.bak s3://gcs-share/db-backup/$TIMESTAMP/
aws s3 cp HUB-5.1.1-lab_Stage_4.bak s3://gcs-share/db-backup/$TIMESTAMP/
aws s3 cp v7.3.1_HUB_511_lab_Stage_3.bak s3://gcs-share/db-backup/$TIMESTAMP/
aws s3 cp v7.3.1_HUB_511_lab_Stage_4.bak s3://gcs-share/db-backup/$TIMESTAMP/

Upload Docker Image to ECR

March 29, 2025

Configure in AWS management console

  1. Stay in the working directory where the Dockerfile is located (e.g., ~/gcs-rabbit)

  2. Open Repository page in Amazon ECR

  3. Create a repository with the command below

aws ecr create-repository --repository-name gcs-normal-rabbit --region ap-southeast-2

  4. Click “View push commands” and follow the instructions, prefixing each command with sudo


See also:

RabbitMQ Container - HTTP

RabbitMQ Container - SSL

AppStream 2.0 Image Builder

March 27, 2025

  1. Select the right image to be updated

  2. Configure Image Builder

  3. Configure Network

  4. Review

  5. Confirmation

sysprep in AWS

March 27, 2025

  1. Delete all the items controlled by Group Policy (e.g., Certificates, Firewall Settings)
  2. Open “Amazon EC2Launch Settings” and click “Shutdown with Sysprep”

Managing AWS Accounts in Terminal

February 27, 2025

Register AWS Accounts to the Terminal

  1. Set the AWS credentials

  2. Check the current AWS credentials
aws sts get-caller-identity
  3. Clear the AWS credentials from the terminal
unset AWS_ACCESS_KEY_ID
unset AWS_SECRET_ACCESS_KEY
unset AWS_SESSION_TOKEN

Deploy an Amazon Linux 2023

February 25, 2025

Deploy a Linux machine

  1. Update the OS
sudo yum update -y
  2. Update the hostname and check it
sudo hostnamectl set-hostname DEV-VAR-OIDC2.apj.cloud
hostnamectl
  3. Update the time zone and check it
sudo timedatectl set-timezone Australia/Sydney
timedatectl
  4. DNS settings - make sure all the DNS servers are registered
sudo vi /etc/resolv.conf

  5. Install some components for any Linux OS
sudo yum install sssd-ad sssd-tools realmd adcli
  6. Install some components for Amazon Linux 2023
sudo yum install oddjob oddjob-mkhomedir
  7. Check the status of Active Directory
realm discover apj.cloud

Setup Fleet Manager

February 9, 2025

How to Enable GUI Access via Fleet Manager

  1. Ensure the SSM Agent is installed and running
    • Windows EC2 instances must have the “SSM Agent” installed and running.
    • Check the status with the PowerShell command:
Get-Service AmazonSSMAgent
  2. Attach a role with the following policies
    • AmazonSSMManagedInstanceCore
    • AmazonSSMFullAccess (This is required for GUI access via Fleet Manager)


How to access EC2 via Fleet Manager

  1. Go to “Systems Manager” → “Fleet Manager”

AI-Powered AWS Exam Preparation Platform

January 30, 2025

Study Amplify

Live Application: studyamplify.intelliumx.com

Study Amplify Interface

Overview

Study Amplify is an AI-powered exam preparation platform designed to help professionals prepare for cloud certifications through personalised learning experiences. Built on AWS serverless architecture with Amazon Bedrock integration, it transforms traditional study methods by generating customised practice questions, providing intelligent mock tests, and enabling comprehensive review tracking.

Background

The challenge of preparing for cloud certifications lies in finding relevant, up-to-date practice questions that match real exam scenarios. Traditional study materials often become outdated quickly, and generic question banks don’t adapt to individual learning needs.

AWS SQS Learning Platform

January 30, 2025

Serverless Cafe

Live Application: cafe.nobuops.com

Overview

The Cafe Order System is a serverless application specifically designed as a learning platform for understanding AWS SQS (Simple Queue Service) and system decoupling patterns. Built with simplicity and educational value in mind, it demonstrates how to effectively decouple system components using message queues while maintaining reliability and scalability.

Background

When learning about distributed systems and AWS services, I wanted to understand how to properly decouple system components using message queues. While there are many theoretical resources available, I needed a practical, hands-on example that would help me grasp the concepts through real implementation.

Personal Health Monitoring Platform

January 30, 2025

Heartbeat Central

Live Application: heartbeat.intelliumx.com

Overview

Heartbeat Central is a personal health monitoring platform that simplifies blood pressure and body temperature tracking for elderly users and non-technical individuals. Built with AWS serverless architecture, it combines an intuitive user interface with enterprise-grade security, making health monitoring accessible to everyone.

Background

The inspiration came from a simple family moment—receiving a blood pressure monitor from my parents. Based on my age, I started thinking about monitoring my health more systematically.

Serverless AWS Cost Monitoring

January 30, 2025

CloudLens

Live Application: cloudlens.intelliumx.com

Overview

CloudLens is a serverless AWS cost monitoring dashboard that quickly detects unexpected cost increases. Built with AWS-native services, CloudLens delivers real-time cost visibility through simple charts and time-based filtering while maintaining enterprise-grade security and minimal operational overhead.

Background

I used to check AWS costs through Cost Explorer in the AWS Management Console. This was tedious—I had to log in, open Cost Explorer, and filter the data each time. I wanted quick access whenever I needed to check costs, ideally from my mobile device.