Hi! I'm a self-taught developer passionate about understanding and solving problems to empower end-users.
I'm particularly enthusiastic about AWS services and automation, which I use to improve processes and free up time to explore new solutions.
Forensic Science major with a background as a Medical Technologist, currently working as a System Administrator. I help support network and power for a government testing facility that is being used to transform the military's healthcare system.
My previous role as a Software Tester, working all shift hours, and volunteering for COVID antigen testing during the pandemic have honed my adaptability, my self-directed learning, and my ability to consistently perform at a high standard under pressure. Working across many roles has given me new perspectives and helped me create a positive environment.
GitLab CI/CD pipeline for Nginx proxy: dev and release Docker images built and pushed to AWS ECR
A Django/Python-ready Nginx proxy for multiple environments. The Docker image is continuously improved by automating the build, tagging, and pushing of images via GitLab. Tagging and pushing are triggered by the branch name (see the sketch after the tools list), so developers only need to worry about new feature code while a maintainer approves the merge.
Services and tools: Docker, GitLab, AWS ECR, Nginx
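A minimal sketch of that branch-keyed logic as a Python job script, assuming the docker SDK and boto3 are available in the CI image; the repository URI and tag names are placeholders (the actual pipeline lives in GitLab CI configuration):

```python
import base64
import os

import boto3
import docker

# Hypothetical repository URI; the real pipeline injects this as a CI variable.
ECR_REPO = "123456789012.dkr.ecr.us-east-1.amazonaws.com/nginx-proxy"

def ecr_login(client: docker.DockerClient) -> None:
    """Log the Docker client in to ECR with a short-lived token."""
    auth = boto3.client("ecr").get_authorization_token()["authorizationData"][0]
    user, password = base64.b64decode(auth["authorizationToken"]).decode().split(":")
    client.login(username=user, password=password, registry=auth["proxyEndpoint"])

def build_and_push() -> None:
    client = docker.from_env()
    ecr_login(client)
    # GitLab CI exposes the branch name; main produces the release image,
    # every other branch produces the dev image.
    branch = os.environ.get("CI_COMMIT_BRANCH", "")
    tag = "release" if branch == "main" else "dev"
    client.images.build(path=".", tag=f"{ECR_REPO}:{tag}")
    client.images.push(ECR_REPO, tag=tag)

if __name__ == "__main__":
    build_and_push()
```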
Terraform, Ansible, AWS: Automated webserver for container development
Terraform builds a Jenkins server as well as a Docker web server, and Ansible applies patches to prevent configuration drift. A pipeline is now in place where Jenkins pulls the Dockerfile from GitHub, then SSHes into the EC2 instance to build and run the container for continuous integration, as sketched below.
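A rough sketch of that SSH step in Python with paramiko; the host, key path, and repository URL below are hypothetical stand-ins for values Jenkins holds as credentials:

```python
import os

import paramiko

# Hypothetical values; the real pipeline pulls these from Jenkins credentials.
EC2_HOST = "ec2-0-0-0-0.compute-1.amazonaws.com"
KEY_FILE = os.path.expanduser("~/.ssh/jenkins-ec2.pem")
REPO_URL = "https://github.com/example/docker-webserver.git"

def build_and_run_remote() -> None:
    """SSH into the EC2 host, fetch the Dockerfile, then build and run it."""
    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh.connect(EC2_HOST, username="ec2-user", key_filename=KEY_FILE)
    try:
        for cmd in (
            f"git clone {REPO_URL} app || git -C app pull",
            "docker build -t webserver:ci app",
            "docker rm -f webserver || true",  # drop any previous CI container
            "docker run -d --name webserver -p 80:80 webserver:ci",
        ):
            _, stdout, stderr = ssh.exec_command(cmd)
            if stdout.channel.recv_exit_status() != 0:  # fail fast on errors
                raise RuntimeError(stderr.read().decode())
    finally:
        ssh.close()

if __name__ == "__main__":
    build_and_run_remote()
```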
React-ready Docker container for DEV and PROD environments using a multi-compose.yaml.
The DEV environment is built on Node with a bind mount for faster code changes and immediate feedback, and uses image layering to speed up builds. The PROD environment builds from the DEV container's files to generate static files served by Nginx.
Services: React/JS, Node, Nginx, Docker
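The project itself is driven by the multi-compose.yaml; purely as an illustration, the same two targets could be driven from Python with the docker SDK, assuming a multi-stage Dockerfile with hypothetical dev and prod stages:

```python
import os

import docker

client = docker.from_env()

def run_dev() -> None:
    """Build the Node-based dev stage and run it with a bind mount so
    local edits are visible in the container immediately."""
    client.images.build(path=".", target="dev", tag="react-app:dev")
    client.containers.run(
        "react-app:dev",
        detach=True,
        ports={"3000/tcp": 3000},
        # Bind-mount the local source tree over the container's source dir.
        volumes={os.path.abspath("src"): {"bind": "/app/src", "mode": "rw"}},
    )

def run_prod() -> None:
    """Build the prod stage, which copies the static build output into Nginx."""
    client.images.build(path=".", target="prod", tag="react-app:prod")
    client.containers.run("react-app:prod", detach=True, ports={"80/tcp": 80})
```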
User profiles in Django with automated local provisioning, plus a lift-and-shift migration to AWS
Automated the dev environment and server locally with Vagrant. Set up the Django admin and a database to store profiles, with CRUD functions for user profiles (a minimal sketch follows below). A repository holds the source code and the Bash scripts that deploy the AWS lift and shift.
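A minimal sketch of the CRUD layer with the Django ORM; the UserProfile model and its fields here are illustrative, not the project's actual schema:

```python
from django.db import models

class UserProfile(models.Model):
    """Illustrative profile model; the real app defines its own fields."""
    name = models.CharField(max_length=255)
    email = models.EmailField(unique=True)

# Create, read, update, and delete a profile through the ORM:
profile = UserProfile.objects.create(name="Test User", email="test@example.com")
profile = UserProfile.objects.get(email="test@example.com")
profile.name = "Renamed User"
profile.save()
profile.delete()
```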
AWS Organizations with SSO, billing, CloudTrail logs, and role switching
Created an Organization with multiple AWS accounts, adding billing alarms and more uniform permissions and logging. IAM admins require MFA, and an admin in one account can assume a role to act as an admin in another account to support that department, as sketched after the tools list.
Services and tools: AWS Organizations, CloudTrail, SSO, IAM, CloudWatch
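The role switch maps directly onto STS; a boto3 sketch, where the account ID and role name are placeholders:

```python
import boto3

# Hypothetical target account and role created by the Organization setup.
ROLE_ARN = "arn:aws:iam::222222222222:role/DepartmentAdmin"

def assume_department_admin() -> boto3.Session:
    """Assume the admin role in the other account and return a session
    whose credentials act as that account's admin."""
    sts = boto3.client("sts")
    creds = sts.assume_role(RoleArn=ROLE_ARN, RoleSessionName="support")["Credentials"]
    return boto3.Session(
        aws_access_key_id=creds["AccessKeyId"],
        aws_secret_access_key=creds["SecretAccessKey"],
        aws_session_token=creds["SessionToken"],
    )

# Any client built from this session operates inside the other account.
ec2 = assume_department_admin().client("ec2")
```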
Custom VPC with bastion host to access private data, with external update access and monitoring via Flow Logs
Private and public subnets separate visibility inside the VPC. Internal access is granted via a bastion host with port forwarding to the private instance. External access for updates is granted through a NAT Gateway and an IGW. Traffic is audited at the VPC level with Flow Logs and delivered to S3 buckets for analysis and transformation; enabling them is sketched below.
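Enabling the Flow Logs piece is a single API call; a boto3 sketch with placeholder VPC and bucket identifiers:

```python
import boto3

ec2 = boto3.client("ec2")

# Hypothetical IDs; the real values come out of the VPC build.
response = ec2.create_flow_logs(
    ResourceIds=["vpc-0123456789abcdef0"],
    ResourceType="VPC",
    TrafficType="ALL",  # capture accepted and rejected traffic
    LogDestinationType="s3",
    LogDestination="arn:aws:s3:::example-vpc-flow-logs",
)
print(response["FlowLogIds"])
```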
A bot sends text messages to my phone via the Twilio API. The messages consist of positive notes and reminders to study.
Services and tools: Twilio, Python
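The core of the bot is one Twilio call; a sketch assuming credentials in environment variables and placeholder phone numbers:

```python
import os
import random

from twilio.rest import Client

MESSAGES = [
    "You've got this. Keep going!",
    "Reminder: block out 30 minutes to study today.",
]

client = Client(os.environ["TWILIO_ACCOUNT_SID"], os.environ["TWILIO_AUTH_TOKEN"])
client.messages.create(
    body=random.choice(MESSAGES),
    from_="+15550000000",  # hypothetical Twilio number
    to="+15551111111",     # hypothetical recipient
)
```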
EBS volume management and backups for disaster recovery
EBS volumes provide persistent data storage, allowing users to create, attach, and manage storage volumes independently of their EC2 instances. Volume management includes regular monitoring, snapshots for data protection, and disaster recovery strategies.
Services: EC2, EBS Volumes, EBS Snapshots, IAM
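A sketch of the snapshot-and-restore cycle with boto3; the volume, instance, and availability-zone identifiers are placeholders:

```python
import boto3

ec2 = boto3.client("ec2")

# Snapshot a volume for disaster recovery (hypothetical volume ID).
snapshot = ec2.create_snapshot(
    VolumeId="vol-0123456789abcdef0",
    Description="Nightly backup for disaster recovery",
)
ec2.get_waiter("snapshot_completed").wait(SnapshotIds=[snapshot["SnapshotId"]])

# Restore: create a fresh volume from the snapshot and attach it to an instance.
volume = ec2.create_volume(
    SnapshotId=snapshot["SnapshotId"], AvailabilityZone="us-east-1a"
)
ec2.get_waiter("volume_available").wait(VolumeIds=[volume["VolumeId"]])
ec2.attach_volume(
    VolumeId=volume["VolumeId"],
    InstanceId="i-0123456789abcdef0",
    Device="/dev/sdf",
)
```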
AWS Service Catalog to provision Testing environment
Configured AWS Service Catalog to allow users to create EC2 instances while limiting the instance types to cost-effective options. This approach ensures that users can gain valuable experience without incurring significant expenses.
Services: Service Catalog, EC2, IAM
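From the user's side, provisioning goes through the catalog rather than EC2 directly; a boto3 sketch where the product and artifact IDs are placeholders:

```python
import boto3

sc = boto3.client("servicecatalog")

# Hypothetical IDs; users discover these via search_products and
# list_provisioning_artifacts.
response = sc.provision_product(
    ProductId="prod-examplexyz123",
    ProvisioningArtifactId="pa-examplexyz123",
    ProvisionedProductName="test-env-ec2",
    # The catalog's constraint rejects anything outside the approved
    # cost-effective instance types.
    ProvisioningParameters=[{"Key": "InstanceType", "Value": "t3.micro"}],
)
print(response["RecordDetail"]["RecordId"])
```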
Automated an S3 bucket dropbox for files using Terraform
Automated the creation of S3 buckets to facilitate quick file uploads, effectively establishing an ephemeral Dropbox-like functionality using Terraform.
Terraform dynamically generates S3 bucket names based on the username and creation timestamp, making each upload easy to locate and hand off for data transformation.
Once the file download is complete, both the bucket and the file can be deleted with a single command, mitigating the risk of residual costs associated with unused resources.
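The project is written in Terraform; purely as an illustration of the same naming and teardown idea, a boto3 sketch with placeholder names:

```python
import datetime
import getpass

import boto3

s3 = boto3.resource("s3")

def create_dropbox() -> str:
    """Create a uniquely named bucket from the username and a timestamp."""
    stamp = datetime.datetime.utcnow().strftime("%Y%m%d%H%M%S")
    name = f"dropbox-{getpass.getuser()}-{stamp}".lower()
    s3.create_bucket(Bucket=name)
    return name

def destroy_dropbox(name: str) -> None:
    """Empty the bucket, then delete it, so nothing lingers to incur cost."""
    bucket = s3.Bucket(name)
    bucket.objects.all().delete()
    bucket.delete()
```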