JEREMY COX
******@*****.***
SKILLS
Programming Skills: SQL, Python, R, MATLAB, TensorFlow, AWS, Spark, Docker
Theoretical Knowledge: ETL | EC2 | S3 | Hive | A/B testing | ML | ggplot | NumPy | Pandas | Scikit-learn | KNN | MongoDB | Naïve Bayes | Snowflake | GCP | Lambda
EDUCATION
Florida A&M University, Major: Industrial Engineering, Graduation Date: 2015
EXPERIENCE
United States Postal Service May 2022 - Present
Decision Science Analyst Indianapolis, IN
Created dashboards using Python and Power BI. Served 20 colleagues across 3 teams. Enhanced data visualization and decision-making efficiency by 40%.
Automated reports using SQL and Python. Reduced turnaround time from 2 days to 4 hours. Saved 1,000 man-hours annually.
Developed transportation network models with NetworkX (see the sketch after this section). Reduced model training time from 24 to 14 hours. Saved 1,500 hours of processing time a year.
Identified data anomalies that increased database accuracy by 20%. Supported 4 teams with reliable data.
United States Postal Service January 2018 - May 2022
Operations Engineer Indianapolis, IN
Led Continuous Improvement team to improve established service score by 42%, using Minitab and SAP.
Collaborated with craft employees and management using SQL and Python. Improved operational efficacy by 30%. Supported 10 departments.
Justified new equipment purchases with SQL and R analysis. Saved $177,000 annually. Enhanced operational efficiency by 50%.
Designed Value Stream Maps with Power BI. Identified process improvements. Increased productivity by 20% across 4 teams.
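A minimal sketch of the kind of NetworkX transportation network model referenced in the Decision Science Analyst role above; the facility names, travel times, and the shortest-path objective are illustrative assumptions, not actual USPS data.

    # Minimal transportation network model using NetworkX.
    # All facility names and travel times below are hypothetical examples.
    import networkx as nx

    # Directed graph: nodes are facilities, edge weights are travel hours (assumed).
    G = nx.DiGraph()
    G.add_weighted_edges_from([
        ("Indianapolis", "Chicago", 3.5),
        ("Indianapolis", "Louisville", 2.0),
        ("Louisville", "Nashville", 2.5),
        ("Chicago", "Nashville", 7.0),
    ])

    # Cheapest route between two facilities by total travel time.
    route = nx.shortest_path(G, "Indianapolis", "Nashville", weight="weight")
    hours = nx.shortest_path_length(G, "Indianapolis", "Nashville", weight="weight")
    print(route, hours)  # ['Indianapolis', 'Louisville', 'Nashville'] 4.5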
PROJECTS
DBT Glue Implementation January 2025
Collaborated with stakeholders to develop optimized data models within Snowflake, which improved query performance and streamlined data retrieval for analytical purposes.
Designed the architecture for an ETL pipeline incorporating AWS DynamoDB, AWS S3, AWS IAM, DBT, and Snowflake (a sketch of one hop appears below).
Successfully managed the migration of historical stock market data from DynamoDB to Snowflake, ensuring minimal disruption during the process.
Prepared comprehensive documentation detailing the ETL workflows and data models using DBT's documentation feature.
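As a rough illustration of the DynamoDB-to-S3 hop of the pipeline described above, the following Python sketch scans a DynamoDB table with boto3 and lands its items in S3 as JSON lines for downstream DBT/Snowflake models to consume; the table, bucket, and key names are hypothetical placeholders, not the project's actual resources.

    # Sketch of one hop of the ETL pipeline: export DynamoDB items to S3
    # as JSON lines. Table, bucket, and key names are hypothetical.
    import json
    import boto3

    dynamodb = boto3.resource("dynamodb")
    s3 = boto3.client("s3")
    table = dynamodb.Table("stock_history")  # hypothetical table name

    def export_to_s3(bucket: str, key: str) -> int:
        """Scan the full table (following pagination) and write items to S3."""
        resp = table.scan()
        items = list(resp["Items"])
        while "LastEvaluatedKey" in resp:  # keep scanning until exhausted
            resp = table.scan(ExclusiveStartKey=resp["LastEvaluatedKey"])
            items.extend(resp["Items"])
        body = "\n".join(json.dumps(i, default=str) for i in items)
        s3.put_object(Bucket=bucket, Key=key, Body=body.encode("utf-8"))
        return len(items)

    export_to_s3("etl-landing-bucket", "raw/stock_history.jsonl")  # hypothetical names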
AWS Real-Time Streaming with API Gateway and Kinesis Firehose January 2025
Led the design and implementation of AWS-based real-time streaming pipelines for 100+ tables, using API Gateway, Kinesis Firehose, and Lambda, reducing data ingestion latency by 50% and enhancing real-time analytics efficiency (see the Lambda sketch below).
Led the creation of data ingestion streams using Kinesis Firehose to process data from Google Pub/Sub effectively.
Set up and managed AWS Database Migration Service (DMS) tasks, ensuring seamless data migration from RDBMS source systems.
Worked cross-functionally with teams to gather requirements and design optimal data pipeline solutions.
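A minimal sketch of the kind of Firehose record-transformation Lambda such a pipeline could place between API Gateway ingestion and delivery; the added ingested_at field is an illustrative assumption, while the records/recordId/result shape follows the standard Kinesis Firehose data transformation contract.

    # Sketch of a Kinesis Firehose record-transformation Lambda.
    # Decodes each record, tags it with an ingestion timestamp (assumed
    # enrichment), and re-encodes it for delivery.
    import base64
    import json
    from datetime import datetime, timezone

    def handler(event, context):
        output = []
        for record in event["records"]:
            payload = json.loads(base64.b64decode(record["data"]))
            payload["ingested_at"] = datetime.now(timezone.utc).isoformat()
            output.append({
                "recordId": record["recordId"],
                "result": "Ok",  # other statuses: "Dropped", "ProcessingFailed"
                "data": base64.b64encode(
                    json.dumps(payload).encode("utf-8")
                ).decode("utf-8"),
            })
        return {"records": output}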