
Data Engineer Senior Software

Location:
Newark, DE, 19711
Salary:
120 - 160k
Posted:
August 28, 2025


Resume:

Leon Collins

Newark, DE ***** 302-***-**** ****.*******@******-********-********.*** LinkedIn Github

SUMMARY

Senior Software Data Engineer with over 20 years of progressive experience building and scaling mission-critical systems and data pipelines for enterprise-level organizations. Proven expertise in cloud-native development on AWS and Grafana Cloud, using Kubernetes, Docker, and Cloud Foundry to modernize legacy platforms. A history of designing and implementing robust solutions for data security, fraud prevention, and B2B integration with platforms such as Axway B2BI. Adept at leveraging data to drive business results, improve operational efficiency, and lead technical teams.

TECHNICAL SKILLS

● Programming & Databases: Python (FastAPI, Flask), Go, Java, Perl, Bash/KSH, SQL (PostgreSQL, MySQL, PromQL, Oracle), RDS, Pandas, RedShift, Shell Scripting, YAML

● Cloud & DevOps: AWS (EC2, S3, RDS), Kubernetes, Docker, Grafana, Cloud Foundry, Terraform, Jenkins, GitLab, GitHub, Ansible, CI/CD, Spinnaker

● Data & Analytics: Data Architecture, Data Pipelines, ETL, A/B Testing, Splunk, Google Analytics, Prometheus, Dynatrace, Sentinel

● Middleware & Security: Axway B2BI, Axway Secure Transport, IBM Sterling Connect Direct, OAuth 2.0, PGP, SSH, SSL, SFTP, LDAP, Keon, Route53

● Operating Systems: Red Hat Linux, Unix, Mainframe

● Tools & Methodologies: Jira, Swagger/OpenAPI, Git, Agile, Scrum, Unit Testing

● Theoretical Knowledge: ETL, Redshift, EC2, EMR, S3, Snowflake, A/B Testing, ML, NumPy, GCP, Lambda

EXPERIENCE

JPMorganChase (TekSystems) Wilmington, DE

Senior Software Data Engineer March 2018 – Dec 2024

● Engineered and deployed highly scalable Python (Boto3) applications on AWS EC2, leveraging Kubernetes to orchestrate Grafana data-visualization services. This enabled the critical exchange of Grafana observability data, resulting in a 15% increase in data processing efficiency.

● Led the strategic migration of high-volume Grafana (metrics, logs, and traces) data pipelines to a cloud-native architecture on AWS, utilizing Cloud Foundry and Terraform for scalable data ingestion. This initiative reduced operational costs by 36% and enhanced data resilience.

● Automated complex software delivery workflows by designing infrastructure as code (IaC) and CI/CD pipelines (GitHub, Jenkins, Spinnaker), cutting release cycles for data-intensive applications by 15 days.

● Optimized data persistence and exchange using Redshift, RDS, and Amazon S3, ensuring 99.9% uptime and reliable data access for APIs supporting 100 million metrics per tenant daily.

● Analyzed system performance using advanced data concepts and Python scripting to proactively identify bottlenecks, driving a 10% improvement in API response times.
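
The bottleneck analysis above can be sketched with standard-library Python; the sample timings, field names, and 500 ms threshold here are illustrative, not taken from the actual JPMorganChase tooling.

```python
import math
import statistics

def latency_report(samples_ms, slow_threshold_ms=500):
    """Summarize API response times and flag potential bottlenecks.

    samples_ms: list of response times in milliseconds.
    Returns p50/p95/max plus the count of samples over the threshold.
    """
    ordered = sorted(samples_ms)
    # Nearest-rank p95: the value below which ~95% of samples fall.
    p95_idx = min(len(ordered) - 1, math.ceil(0.95 * len(ordered)) - 1)
    return {
        "p50_ms": statistics.median(ordered),
        "p95_ms": ordered[p95_idx],
        "max_ms": ordered[-1],
        "slow_count": sum(1 for s in ordered if s > slow_threshold_ms),
    }

# Illustrative usage with synthetic timings
report = latency_report([120, 130, 95, 480, 510, 150, 700, 110, 90, 105])
```

In practice the percentile summary would be emitted as metrics (e.g. to Grafana/Prometheus) rather than returned as a dict.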

JPMorganChase Wilmington, DE

Senior Data Systems Engineer March 2005 - June 2016

● Automated mission-critical production components used for every internal and external file transfer, reducing manual oversight by 70% and enhancing monitoring for Card Services' production environment.

● Developed and deployed a highly configurable file routing and tracking system to transport virtually all inter- and intra-company files via Axway Secure Transport, which improved data delivery reliability by 19%.

● Enhanced a Ruby on Rails web application to automate requests from hundreds of technicians, saving thousands of hours annually and strengthening auditability.

● Built a knowledge base leveraging Peregrine Ticketing System data, which expedited staff training by 70% and improved issue diagnosis by 45%.

● Utilized a range of technologies (Perl, Java, Tivoli, SQL, Shell) to develop custom monitors and programs, providing real-time health and metrics data on card production systems.

Wells Fargo (Insight Global) Philadelphia, PA

Data Systems Engineer Jan 2018 – March 2018

● Enhanced secure file transfer capabilities by implementing and maintaining robust security protocols (PGP, SSH2, SSL) on Axway Secure Transport, ensuring data integrity and compliance.

● Configured network firewalls and managed certificate lifecycles, reducing incident response times by 20% through proactive security measures.

Coca-Cola (HPE/DXC), Remote (Atlanta, GA)

Software Data Engineer June 2016 - October 2017

● Directed the migration of the Middleware platform (Axway B2BI and CFT) to AWS cloud, achieving a potential savings of 66% and significantly improving scalability.

● Developed and delivered a real-time monitoring dashboard via Python, Java, and PowerShell APIs for Sentinel Web, which saved security teams 8-12 hours per day by enhancing threat visualization.

MBNA, CSC, Basell March 1999 – March 2005

Systems Administration

● Managed and optimized enterprise batch job scheduling systems (converted Maestro TWS to Control-M), reducing job failures by 35% and improving reliability.

● Automated system tasks using UNIX shell scripting, significantly decreasing manual operational overhead by 80%.

EDUCATION

PROJECTS

● Architected a scalable, decoupled data ingestion pipeline on AWS, leveraging SQS to buffer and queue incoming data for efficient batch processing on EMR. This robust design included PySpark-based data quality frameworks to detect missing fields, null values, and schema mismatches at scale.
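
The record-level quality checks described above (missing fields, nulls, schema mismatches) can be sketched in plain Python; in the pipeline itself these would run as distributed PySpark jobs, and the schema and sample record below are hypothetical.

```python
def check_record(record, schema):
    """Validate one record against an expected schema.

    schema: dict mapping field name -> expected Python type.
    Returns a list of human-readable issues (empty list means the
    record is clean), covering missing fields, null values, and
    type (schema) mismatches.
    """
    issues = []
    for field, expected_type in schema.items():
        if field not in record:
            issues.append(f"missing field: {field}")
        elif record[field] is None:
            issues.append(f"null value: {field}")
        elif not isinstance(record[field], expected_type):
            issues.append(f"schema mismatch: {field} is "
                          f"{type(record[field]).__name__}")
    return issues

# Hypothetical schema and record: "currency" is missing, "amount" is null
SCHEMA = {"id": int, "amount": float, "currency": str}
issues = check_record({"id": 1, "amount": None, "ccy": "USD"}, SCHEMA)
```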

● Developed and optimized real-time data processing solutions using PySpark jobs for deduplication and joins on streaming data from Kafka. This ensured data freshness and integrity, while custom PySpark UDFs in Hive and Spark extended complex data processing logic.
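
The keep-the-latest-event-per-key deduplication above can be sketched as follows; in the pipeline it would be a PySpark window/dropDuplicates job over Kafka input, and the event tuples here are invented for illustration.

```python
def dedupe_latest(events):
    """Deduplicate a stream, keeping only the newest event per key.

    events: iterable of (key, timestamp, payload) tuples, in any order.
    Returns a dict mapping key -> (timestamp, payload) for the latest
    event seen for that key.
    """
    latest = {}
    for key, ts, payload in events:
        # Replace the stored event only if this one is strictly newer.
        if key not in latest or ts > latest[key][0]:
            latest[key] = (ts, payload)
    return latest

# Illustrative usage: out-of-order events for the same order key
deduped = dedupe_latest([
    ("order-1", 100, "created"),
    ("order-1", 250, "shipped"),   # newest event for order-1, wins
    ("order-2", 120, "created"),
    ("order-1", 200, "paid"),      # older than "shipped", discarded
])
```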

● Automated end-to-end infrastructure provisioning for data pipelines using Terraform, creating reusable modules for S3 buckets, IAM roles, EC2 instances, and Lambda functions. This streamlined environment setup significantly reduced deployment time by 5 days.

● Optimized EMR cluster configurations for compute-heavy Spark jobs, improving processing times by 3x. Further enhanced data accessibility by utilizing Redshift Spectrum to query external data stored in S3 directly, eliminating the need for data movement into the warehouse.

● Implemented proactive data monitoring by integrating SNS with S3 to trigger notifications (Slack/Email) for new data files, ensuring timely awareness. Additionally, automated creation of AWS Glue resources and crawler configurations using Terraform scripts streamlines data cataloging.
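
The S3-to-SNS notification wiring in the last bullet can be sketched as the configuration payload it requires; this builds the dict shape that boto3's `put_bucket_notification_configuration` accepts, with a placeholder topic ARN and an assumed "incoming/" key prefix.

```python
def s3_event_notification(topic_arn, prefix="", suffix=""):
    """Build an S3 NotificationConfiguration publishing to SNS
    whenever a new object is created in the bucket.

    Optional prefix/suffix filters limit which keys trigger events.
    """
    rules = []
    if prefix:
        rules.append({"Name": "prefix", "Value": prefix})
    if suffix:
        rules.append({"Name": "suffix", "Value": suffix})
    config = {
        "TopicConfigurations": [{
            "TopicArn": topic_arn,
            # Fire on any object-created event (Put, Post, Copy, ...)
            "Events": ["s3:ObjectCreated:*"],
        }]
    }
    if rules:
        config["TopicConfigurations"][0]["Filter"] = {
            "Key": {"FilterRules": rules}
        }
    return config

# Placeholder ARN; only files under incoming/ trigger a notification
cfg = s3_event_notification(
    "arn:aws:sns:us-east-1:123456789012:new-data", prefix="incoming/")
```

The SNS topic would then fan the event out to Slack/email subscribers, keeping the monitoring decoupled from the pipeline itself.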
