Post Job Free

Cloud Data Engineer with AWS Expertise

Location:
Short Pump, VA, 23059
Posted:
January 26, 2026

Contact this candidate

Resume:

Contact Information

Richmond, VA *****

908-***-****

*************@*****.***

Skills

• AWS

• Python

• PySpark

• Ab Initio

• Glue

• Snowflake

• Teradata

• Postgres

• Redshift

• Athena

• QuickSight

• Java

• SQL

• Unix

• DB2

• Control-M Scheduler

• Arow Scheduler

• Data Pipeline

• CI/CD Pipeline

• ETL

• Data Warehouse

Languages

• English

Leela Bhavani Chandana

Data Engineer

Seasoned AWS Data Engineer with five years of experience optimizing compute performance and running data pipelines in the AWS cloud. Skilled in developing ETL processes, databases, data models, security protocols, and CloudFormation templates across AWS environments. Over nine years of experience using ETL tools to design and implement ETL strategies for complex, high-volume data warehousing projects.

Work Experience

Data Engineer Jun 2025 - Present

Capital One - Richmond, VA

• Developed automation processes for Hydra Application using Step Handlers and Global Parameter updates.

• Migrated Hydra automation from the Arow scheduler to the Redwood scheduling tool to improve efficiency.

• Enhanced FileWatcher scripts with AWS Secrets Manager to ensure source-file readiness before transformations.

• Created jobs for Data Pipeline / Cash Pipeline convergence and automated execution of Glue Jobs in Production.

• Generated daily and monthly reports for Federal Bank through automated pipelines, ensuring reliability and accuracy.

• Built lineage processes for Hydra datasets and automated data delivery to One Lake Application.

• Implemented CI/CD pipelines for deploying applications using container services, ECE infrastructure clusters, and ALB creation.

AWS Data Engineer Oct 2023 - Apr 2024

CCNC - NC

• Designed and developed data pipelines using PySpark and Python to process large datasets in a distributed environment, achieving a 30% reduction in processing time.

• Developed and maintained data processing and cleaning scripts using Python and PySpark.

• Created and maintained data pipelines using Glue workflow to schedule, monitor, and manage workflows.

• Conducted code reviews leading to improved code quality and faster delivery times.

• Created Postgres tables in local environment using Docker containers for health sector data loading.

• Optimized the Glue ETL Job for performance improvements in AWS environment.

• Worked with Redshift serverless database connector and queried health sector data using Redshift editor.

• Applied machine learning techniques, including data preprocessing, simple linear regression, and multiple linear regression.

Data Engineer Oct 2019 - Sep 2023

Capital One - Richmond, VA

• Created a CI/CD pipeline for deploying application code using container services, an ECE infrastructure cluster, and ALB creation for the File Gateway Portal.

• Implemented a Smart Ops application to handle RDS failover from the East region to the West region.

• Utilized AWS Glue mapping and transformations for ECE Jobs.

• Developed and maintained data processing and cleaning scripts using Python.

• Established data pipelines using Glue workflows with PySpark data transformations to schedule, monitor, and manage workflows.

• Created HashiCorp Vault secret storage and managed rotation of passwords.

• Implemented multiple flavors of SNS, CloudWatch, and standalone Lambda deployments through the CI/CD pipeline.

• Automated the creation of AWS resources through the CI/CD pipeline, eliminating manual provisioning.

• Collaborated with Infrastructure team to develop custom shell scripts for daily tasks like log collection and data cleanup.

• Upgraded Ab Initio graphs from Version 3 to Version 4.

• Configured Jenkins for Team Managed Pipeline to promote code from Lower environment to Production environment.

ETL Lead Jun 2018 - Sep 2019

Capital One - Richmond, VA

• Managed the Pipeline Migration Project by migrating Ab Initio graphs from on-premises environment to AWS.

• Collaborated with stakeholders to analyze data requirements and develop migration plan, including data transformations, cleansing, and validation.

• Extracted and transformed data from various source systems, ensuring data integrity and quality throughout migration.

• Implemented strategies for data accuracy, completeness, and consistency before, during, and after migration.

• Documented data migration processes, including mappings, transformations, validation rules, and error handling procedures.

• Developed a custom automation tool to streamline Development, QA, and Production processes.

• Provided post-migration support to ensure successful adoption of new systems and data integrity.

• Collaborated with team to migrate and test Ab Initio graphs in AWS environment.

• Added tokenization and detokenization components to Ab Initio graphs for encryption and decryption of PCI and NPI data.

ETL Lead Mar 2017 - Jun 2018

Visa - Palo Alto, CA

• Worked on the CMLS Token Range Delivery to DS4 project, developing Ab Initio graphs to deliver account range and token range information to DS twice a day.

• Developed graphs to load data into tables and validate the new data in comparison with already existing data.

• Implemented business requirements on existing graphs based on impact analysis, making necessary changes to achieve business objectives.

• Designed, developed, and tested Ab Initio graphs with a team.

• Coordinated with offshore team and led onsite team.

ETL Lead Nov 2015 - Feb 2017

American Express - Phoenix, AZ

• Identified and pulled requests from GIDM tables using MSSU process to create reports for end users.

• Developed Ab Initio graphs with complex transformation rules using GDE to process and load data into databases.

• Created high-level and low-level design documents, unit test cases, and collaborated with quality control team to track defects.

• Implemented data parallelism through Ab-Initio graphs to process data segments simultaneously.

• Conducted code reviews, integration testing, and UAT to ensure quality and functionality of ETL processes.

Senior Programmer Analyst Jun 2010 - Oct 2015

American Express - Chennai, Tamil Nadu

• Modified the GDR to bring in new data elements from TSYS and CAR that are not currently available in the GDR.

• Included new data elements in the GDR from the CAR Client Centric Feed v3.

• Conducted unit testing and end-to-end system testing of the entire application.

• Developed and supported the ETL process for the Data Warehouse from heterogeneous source systems using Ab Initio.

• Utilized Ab Initio GDE to generate complex graphs for the ETL process using Join, Rollup and Reformat transform components and executed using Co>Operating System.

• Took up support activities during System Integration Testing and User Acceptance Testing Process along with implementation of various Master Card/Visa and other regulatory guidelines.

• The GDR is the single large data warehouse that stores and manages all AMEX corporate card transactions worldwide.

Education

Master of Science, Computer Applications Mar 1998 - Jul 2001

GVR & S PG College for Women - Guntur

Certifications

• AWS Certified Solutions Architect - Associate (SAA-C03), Amazon Web Services (AWS), May 2024, Credential ID: f65a6f13-e948-46e7-857b-71b3817320ae

• Professional Certification Program from IBM: DB2 Fundamentals, Dec 2013

