
Senior Data Engineer with Cloud & Analytics Expertise

Location: US 19 Corridor, FL, 34652
Posted: April 30, 2026


Resume:

SRISAIGANESH MOTUKURI

Email: **********@*****.***

Mobile: 203-***-****

LinkedIn: www.linkedin.com/in/srisaiganesh-motukuri/

Senior Data Engineer

PROFESSIONAL SUMMARY

Data Engineer with 5+ years of experience designing scalable data pipelines, cloud-based data platforms, and AI-ready data solutions across enterprise environments.

Hands-on experience with Azure, AWS, and GCP, building batch and streaming data workflows that support analytics, reporting, and machine learning initiatives.

Strong expertise in Python, SQL, Spark, Airflow, Databricks, Snowflake, BigQuery, and Redshift for data integration, transformation, modeling, and optimization.

Proven ability to deliver governed, high-quality datasets by improving data quality, metadata, lineage, observability, and production reliability for business-critical applications.

Facilitated team collaboration through excellent written and oral communication skills, improving project efficiency by 20%.

Led automation and continual process improvement initiatives, resulting in a 30% reduction in operational costs.

Enhanced data integration processes by implementing Informatica within Agile methodology, resulting in a 30% increase in project delivery speed and improved team collaboration.

TECHNICAL SKILLS

Programming & Scripting - Python, SQL, Bash, PowerShell, Perl

Data Engineering - AWS Glue, Azure Data Factory, DBT, Apache Kafka, Spark, Hive, GCP Dataflow

Cloud Platforms - AWS (EC2, Lambda, Glue, S3, Kinesis, IAM, EKS, Redshift), Azure (ADF, Synapse, Azure SQL, Entra ID, Key Vault), GCP (BigQuery, GKE, Cloud Storage), Microsoft Fabric

Infrastructure as Code (IaC) - Terraform, Ansible, ARM Templates, Bicep, CloudFormation, Jenkins, Azure DevOps

Databases - Redshift, Snowflake, Azure SQL, PostgreSQL, MongoDB, MySQL, Oracle, Oracle Exadata

CI/CD and DevOps - Jenkins, GitHub Actions, Git, GitLab, CodePipeline, CI/CD Pipelines, Shell Scripting

Dashboards and Visualization - Power BI, Tableau, Looker, AWS QuickSight

Security and Compliance - IAM, Encryption, NIST 800-53, CIS Benchmarks, PCI-DSS, RBAC, Key Vault, Audit Logging

Monitoring and Incident Response - New Relic, AWS CloudWatch, Azure Monitor, ServiceNow, RCA, SLA Management

Data Integration - Informatica

Operating Systems - Linux-based processes, Unix file systems

PROFESSIONAL EXPERIENCE

Bank of America July 2025 – Present

Senior Data Engineer

Architected Azure Data Factory and Databricks pipelines integrating batch and streaming sources into governed lakehouse layers, improving data availability for AI, reporting, and compliance.
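
A minimal sketch of such an ingestion step, assuming a Databricks runtime (where a SparkSession named spark is provided) and Delta Lake; the storage path and table name are illustrative placeholders, not actual project assets:

from pyspark.sql import functions as F

# Batch-ingest raw landing-zone files into a bronze Delta table.
raw = (spark.read
       .format("json")
       .load("abfss://landing@examplelake.dfs.core.windows.net/events/"))  # placeholder path

# Stamp each record for lineage before it enters the governed layer.
bronze = raw.withColumn("ingest_ts", F.current_timestamp())

(bronze.write
 .format("delta")
 .mode("append")
 .saveAsTable("lakehouse.bronze_events"))  # placeholder table name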

Engineered Python and SQL frameworks on Azure to transform curated datasets, strengthen lineage and metadata standards, and support reliable machine learning feature delivery workflows.

Optimized Spark and Snowflake workloads on Azure, reducing orchestration friction, improving schema consistency, and enabling resilient downstream consumption across analytics and business teams.
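
A sketch of the read-tune-write pattern behind this, assuming a SparkSession named spark and the Snowflake Spark connector on the cluster; every connection value and table name is a placeholder:

# Connection options would come from a secret store in practice.
sf_options = {
    "sfURL": "example.snowflakecomputing.com",
    "sfDatabase": "ANALYTICS",
    "sfSchema": "CURATED",
    "sfWarehouse": "TRANSFORM_WH",
    "sfUser": "svc_user",
    "sfPassword": "***",
}

# Push the heavy filtering down to Snowflake instead of Spark.
df = (spark.read
      .format("snowflake")
      .options(**sf_options)
      .option("query", "SELECT * FROM CURATED.ORDERS WHERE order_date >= '2025-01-01'")
      .load())

# Repartition for evenly sized write tasks, then land the optimized table.
(df.repartition(64)
   .write
   .format("snowflake")
   .options(**sf_options)
   .option("dbtable", "CURATED.ORDERS_OPTIMIZED")
   .mode("overwrite")
   .save())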

Integrated APIs, event streams, and enterprise data sources through ETL and ELT pipelines, increasing trusted data access for risk, finance, and operational stakeholders globally.

Standardized CI/CD, data quality checks, and monitoring across Azure pipelines, improving production reliability, accelerating issue resolution, and supporting secure enterprise data operations organization-wide.
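
One way such a quality gate can look in PySpark; a minimal sketch with hypothetical rules and thresholds:

def check_quality(df, key_col, min_rows=1):
    """Fail fast if a dataset is empty, has null keys, or duplicate keys."""
    row_count = df.count()
    null_keys = df.filter(df[key_col].isNull()).count()
    dup_keys = row_count - df.dropDuplicates([key_col]).count()
    errors = []
    if row_count < min_rows:
        errors.append(f"row count {row_count} below minimum {min_rows}")
    if null_keys:
        errors.append(f"{null_keys} null values in key column {key_col}")
    if dup_keys:
        errors.append(f"{dup_keys} duplicate keys in {key_col}")
    if errors:
        # Raising here stops the pipeline before bad data is published.
        raise ValueError("data quality check failed: " + "; ".join(errors))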

Set up Linux environments using Unix file systems and Perl, reducing deployment time by 30%.

Optimized Linux-based processes and Unix file systems with orchestration tools, improving system reliability by 25%.

Enhanced Oracle Exadata and Oracle data warehouses, achieving a 40% increase in query performance.

Blue Cross Blue Shield Association March 2024 – June 2025

Data Engineer

Configured AWS Glue, S3, and Redshift pipelines to ingest healthcare datasets, strengthening governed storage, scalable transformations, and dependable access for analytics consumers organization-wide.
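
A sketch of a Glue PySpark job following that pattern; the catalog database, table, connection, and bucket names are placeholders:

import sys
from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue = GlueContext(SparkContext.getOrCreate())
job = Job(glue)
job.init(args["JOB_NAME"], args)

# Read raw claims from the Glue Data Catalog (backed by S3).
claims = glue.create_dynamic_frame.from_catalog(
    database="healthcare_raw", table_name="claims")

# Load into Redshift through a catalog connection; the Redshift COPY
# path requires an S3 staging directory.
glue.write_dynamic_frame.from_jdbc_conf(
    frame=claims,
    catalog_connection="redshift-conn",
    connection_options={"dbtable": "curated.claims", "database": "dw"},
    redshift_tmp_dir="s3://example-bucket/glue-tmp/")

job.commit()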

Validated batch and streaming workflows with Python, SQL, and Airflow on AWS, improving data quality, lineage visibility, and operational support across regulated environments.
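
A minimal Airflow DAG in that spirit (Airflow 2.4+ syntax); the DAG id and validation logic are illustrative:

from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def validate_load():
    # Placeholder for row-count, schema, and lineage checks
    # run against the warehouse after the nightly batch lands.
    print("validation passed")

with DAG(
    dag_id="claims_batch_validation",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="validate_load", python_callable=validate_load)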

Automated Redshift and Snowflake transformations for curated healthcare domains, enabling reusable datasets, consistent schema management, and faster delivery for reporting and downstream applications enterprise-wide.

Analyzed source system rules and metadata requirements, implementing AWS data models and runbooks that improved documentation, governance alignment, and production readiness across teams.

Orchestrated AWS Lambda and Kafka integrations for enterprise pipelines, increasing processing reliability, reducing manual intervention, and supporting secure exchange of critical healthcare data.
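
A sketch of a Lambda handler wired to an Amazon MSK (Kafka) event source mapping; the topic layout and payload fields are assumptions:

import base64
import json

def handler(event, context):
    processed = 0
    # MSK delivers records grouped under "topic-partition" keys,
    # with each record value base64-encoded.
    for topic_partition, records in event["records"].items():
        for record in records:
            payload = json.loads(base64.b64decode(record["value"]))
            # Downstream routing and validation would happen here.
            print(topic_partition, payload.get("claim_id"))  # hypothetical field
            processed += 1
    return {"processed": processed}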

Streamlined data flows using standard mount configurations and tooling, improving data processing efficiency by 20%.

Advanced data warehousing through Oracle development, increasing data retrieval speed by 35%.

Walmart April 2020 – August 2023

Data Engineer

Designed BigQuery and Google Dataflow pipelines on GCP, unifying batch and streaming datasets to power analytics, dashboards, and AI-ready data products organization-wide.
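
A compact Apache Beam (Dataflow) sketch of the streaming side of such a pipeline; the project, topic, and table identifiers are placeholders:

import json
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Runner, project, and region would be supplied via pipeline flags.
opts = PipelineOptions(streaming=True)

with beam.Pipeline(options=opts) as p:
    (p
     | "ReadEvents" >> beam.io.ReadFromPubSub(topic="projects/example/topics/sales")
     | "Parse" >> beam.Map(json.loads)
     | "WriteToBQ" >> beam.io.WriteToBigQuery(
           "example:retail.sales_events",
           write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))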

Implemented Pub/Sub, Cloud Composer, and dbt workflows on GCP, improving orchestration, data quality, and reusable transformations for business intelligence and advanced analytics teams enterprise-wide.

Developed Vertex AI-ready datasets with Python and SQL, enabling scalable feature preparation, performant analytics access patterns, and dependable support for machine learning initiatives enterprise-wide.
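
A sketch of feature preparation with the google-cloud-bigquery client; the dataset, table, and feature definitions are illustrative:

from google.cloud import bigquery

client = bigquery.Client()

# Materialize a reusable feature table for downstream Vertex AI training.
client.query(
    """
    CREATE OR REPLACE TABLE features.customer_features AS
    SELECT customer_id,
           COUNT(*) AS orders_90d,
           AVG(order_total) AS avg_order_value
    FROM retail.orders
    WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 90 DAY)
    GROUP BY customer_id
    """
).result()  # blocks until the feature table is materialized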

Streamlined BigQuery data models and performance tuning for analytics workloads, increasing query efficiency, strengthening observability, and improving consumption across reporting teams organization-wide.

Established GCP governance, CI/CD, and documentation standards across pipelines and dashboards, improving deployment consistency, stakeholder transparency, and trusted analytics delivery at scale.

Improved system security by managing permissions and pipes, enhancing data integrity by 15%.

Developed robust system architecture with Perl, increasing system scalability by 50%.

EDUCATION

Master's in Business Analytics - Sacred Heart University

Bachelor's in Computer Science - Jawaharlal Nehru Technological University


