What I'm working on.
Status: MVP, deployed
FPV quadcopter post-flight telemetry visualization tool. Streamlit, Python, Docker, ECS. Deployed on AWS with modular Terraform configuration. Branch-based GitHub Actions for CI/CD.
Status: complete, deployed
An automated Python ETL pipeline for processing and visualizing COVID data. Lambda, DynamoDB, SNS, EventBridge. Deployed on AWS with Terraform. GitHub Actions for CI/CD.
Status: refactored as static site
Responsive Bootstrap CSS front end, Flask backend with authentication, relational database, and more. Deployed on AWS with Elastic Beanstalk. GitHub Actions for CI/CD.
Status: complete, deployed
Created for Forrest Brazeal's Cloud Resume Challenge. Hosted on AWS using the Serverless Application Model, S3, CloudFront, Lambda, DynamoDB, and more. GitHub Actions for CI/CD.
Streamlit application, authored in Python, that transforms an uploaded CSV file and displays flight metrics, a map with a flight path trace, and graphs of relevant sets of data. The app and its dependencies are packaged as a Docker image, uploaded to Amazon Elastic Container Registry, and launched serverlessly using AWS Fargate.
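The heart of the transformation step can be sketched with pandas alone. The column names below are hypothetical stand-ins, since real telemetry headings vary by radio/flight controller combination; in the actual app, the DataFrame would come from the uploaded CSV via Streamlit's file uploader.

```python
import pandas as pd

# Hypothetical column headings; real logs vary by
# radio/flight controller combination.
ALT_COL = "Alt(m)"
SPEED_COL = "GSpd(kmh)"

def flight_metrics(df: pd.DataFrame) -> dict:
    """Summarize a telemetry DataFrame into headline flight metrics."""
    return {
        "max_altitude_m": df[ALT_COL].max(),
        "max_speed_kmh": df[SPEED_COL].max(),
        "avg_speed_kmh": df[SPEED_COL].mean(),
    }
```

A Streamlit page would feed this from `pd.read_csv()` on the uploaded file and render the results alongside the map trace and graphs.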
The Terraform configuration for this project is modularized, and several files are provided as templates for manipulation within GitHub Actions. The primary Actions (both for building and uploading the app's Docker image, and for deploying the app's infrastructure) are reusable, and can be called from either the dev or main branch of the repository.
Some of the challenges I solved during development:
Design narratives on Dev.to (click to read):
To-dos include: checks and conditionals for different CSV column headings, to accommodate different radio/flight controller combinations; automated unit testing; and the ability for users to register, authenticate, and upload telemetry files to S3.
An AWS Lambda function, authored in Python and using boto3 to interact with other AWS resources, downloads data related to COVID-19 and performs an ETL process. The data is written to DynamoDB, where it can be queried by a self-hosted instance of Redash to create a visualization dashboard. A message is sent to a Simple Notification Service topic whenever the process finishes or encounters an error. The Lambda function is triggered daily by an Amazon EventBridge rule.
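A minimal sketch of the handler's shape, not the project's actual code: the table name, topic ARN, and record fields are placeholders, and the download step is stubbed out. boto3 is imported inside the handler (it ships with the Lambda runtime) so the pure transform can be exercised on its own.

```python
TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:covid-etl"  # placeholder

def fetch_source_data() -> list[dict]:
    """Download step, elided in this sketch."""
    return []

def to_item(record: dict) -> dict:
    """Shape one raw record into a DynamoDB item.
    Field names are illustrative, not the real schema."""
    return {
        "date": record["date"],
        "cases": int(record["cases"]),
        "deaths": int(record["deaths"]),
    }

def lambda_handler(event, context):
    import boto3  # provided by the Lambda runtime
    table = boto3.resource("dynamodb").Table("covid-data")  # placeholder name
    sns = boto3.client("sns")
    try:
        with table.batch_writer() as batch:
            for rec in fetch_source_data():
                batch.put_item(Item=to_item(rec))
        sns.publish(TopicArn=TOPIC_ARN, Message="COVID ETL succeeded")
    except Exception as exc:
        sns.publish(TopicArn=TOPIC_ARN, Message=f"COVID ETL failed: {exc}")
        raise
```

The EventBridge rule simply invokes `lambda_handler` on its daily schedule; the success/error notifications map to the SNS messages above.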
I am particularly pleased with the choice I made to provision all of my infrastructure using Terraform. This was my first time using it, so I spent several hours learning the basics and applying that knowledge to my configuration. Terraform seems like a great tool to use in my pipelines going forward, especially if and when I branch out to other cloud providers.
Numerous challenges presented themselves throughout the development process. Some examples:
Future improvements may include: refactoring the Terraform configuration file; writing more comprehensive unit tests; and revisiting queries and visualizations in Redash.
This webapp was created as a capstone project for the Python Bootcamp taught by Angela Yu of The App Brewery. Some of the backend Flask code was written by me for an earlier project, while the frontend code was written almost entirely from scratch. It is deployed on AWS Elastic Beanstalk via a GitHub Actions pipeline.
I am particularly pleased with two aspects of this application: styling the HTML with Bootstrap, a framework with which I had little experience; and figuring out the deployment process, which included using the AWS EB CLI tool locally, associating the application with a MySQL RDS instance, and setting up a GitHub CI/CD pipeline to automatically push changes to Elastic Beanstalk.
Examples of the many challenges this application presented me:
Future improvements may include: figuring out AJAX requests so that this webapp can be served through CloudFront (caching and CSRF protection don't play nicely together); defining Elastic Beanstalk configuration settings with code, instead of manipulating the EB environment in the AWS console; improved blog formatting and pagination; more error handling; and some sort of alert (email?) system for when users comment on a blog post.
I completed this project shortly after obtaining the AWS Solutions Architect Associate certification and just before I began an online bootcamp to learn Python. As I didn't know much about coding at that point, I chose to use a pre-built HTML/CSS template for the frontend. Loading the website triggers an AWS Lambda function that increases the visitor count in a DynamoDB table by one and returns the new count to the website.
This project contained many firsts for me, including implementation of a CI/CD pipeline, use of the Serverless Application Model CLI tool, and creation of an AWS Lambda function. I spent numerous research hours online as I worked towards completion of the website and its backend counter. I attempted to write the Lambda code myself, but ended up referencing several other similar functions to piece together a working solution.
Since the completion of this project, I have learned quite a bit more about coding and about the way these services function and work together. It would be an interesting exercise to see if I could author the Lambda function on my own at this point.
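As a rough sketch of what that rewrite might look like today (table name, key, and attribute are placeholders, not the project's actual schema), DynamoDB's `ADD` update expression handles the increment atomically:

```python
import json

def make_response(count: int) -> dict:
    """API Gateway-style response with a CORS header for the static site."""
    return {
        "statusCode": 200,
        "headers": {"Access-Control-Allow-Origin": "*"},
        "body": json.dumps({"count": count}),
    }

def lambda_handler(event, context):
    import boto3  # provided by the Lambda runtime
    table = boto3.resource("dynamodb").Table("visitor-count")  # placeholder
    result = table.update_item(
        Key={"id": "site"},
        UpdateExpression="ADD visits :one",  # atomic increment
        ExpressionAttributeValues={":one": 1},
        ReturnValues="UPDATED_NEW",
    )
    return make_response(int(result["Attributes"]["visits"]))
```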
After working with the Pixe.la APIs through the console during a bootcamp project, I decided to create a graphical user interface application that would allow me to track my time spent coding and learning on a pixel graph. I reused the API-related code that I had authored during the bootcamp project and built the GUI from scratch.
This was my first coding side project, written over the course of a week as I moved on to other Python lessons. I am particularly pleased with my implementation of guizero, a wrapper for tkinter that I had not used before. I relied heavily on the guizero documentation for reference.
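The Pixe.la call at the center of the app boils down to a URL, a token header, and a small JSON body. A sketch of that piece, with placeholder credentials (a guizero button handler would pass the result to an HTTP POST):

```python
import datetime

PIXELA_BASE = "https://pixe.la/v1/users"

def pixel_request(username: str, graph_id: str, token: str, quantity: str):
    """Build the pieces of a Pixe.la 'post a pixel' request.

    Username, graph ID, and token are caller-supplied placeholders here;
    the pieces would be sent via requests.post(url, json=body, headers=headers).
    """
    url = f"{PIXELA_BASE}/{username}/graphs/{graph_id}"
    headers = {"X-USER-TOKEN": token}
    body = {
        "date": datetime.date.today().strftime("%Y%m%d"),
        "quantity": quantity,  # Pixe.la expects the quantity as a string
    }
    return url, headers, body
```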
Some notable challenges:
For the time being, this application is complete. Future improvements may include: creating an executable using PyInstaller; adding the ability to register new users and create new graphs; displaying a graph in the app window if a library for doing so becomes available.