SYED AYAZ AHMED
Email: ****************@*****.***
Mobile: 224-***-****
LinkedIn: www.linkedin.com/in/ayaz-syed-372009238
Senior Data Engineer
PROFESSIONAL SUMMARY
Data Engineer with 5+ years of experience delivering large-scale enterprise data pipelines, translating business requirements into governed datasets, resilient orchestration, and reliable analytics for diverse stakeholders.
Experienced in designing Azure, AWS, and GCP solutions, integrating batch and streaming sources, automating cost-efficient CI/CD, and enforcing repeatable security controls for regulated platforms.
Proficient in optimizing SQL, BigQuery, Redshift, and PostgreSQL workloads, implementing validation, lineage, metadata standards, and reconciliation checks that measurably improve data quality and operational trust.
Adept at collaborating with product, risk, and operations teams to define SLAs, monitoring, incident response, documentation, and runbooks that strengthen reliability and accelerate decision-making.
Facilitated team meetings with strong written and oral communication skills, improving collaboration and project alignment.
Implemented innovative solutions with a passion for automation and continual process improvement, boosting productivity and efficiency.
Enhanced system reliability by designing architecture and optimizing data flows using Oracle, resulting in a 30% improvement in data processing efficiency.
Improved project delivery speed by applying Agile methodology and fostering a deep understanding among team members, reducing development cycle time by 20%.
TECHNICAL SKILLS
Cloud Platforms - AWS (EC2, Lambda, Glue, S3, Kinesis, IAM, EKS, Redshift), Azure (ADF, Synapse, Azure SQL, Entra ID, Key Vault), GCP (BigQuery, GKE, Cloud Storage), Microsoft Fabric
Monitoring and Incident Response - New Relic, AWS CloudWatch, Azure Monitor, ServiceNow, RCA, SLA Management
Infrastructure as Code (IaC) - Terraform, Ansible, ARM Templates, Bicep, CloudFormation, Jenkins, Azure DevOps
Security and Compliance - IAM, Encryption, NIST 800-53, CIS Benchmarks, PCI-DSS, RBAC, Key Vault, Audit Logging
CI/CD and DevOps - Jenkins, GitHub Actions, Git, GitLab, CodePipeline, CI/CD Pipelines, Shell Scripting
Programming & Scripting - Python, SQL, Perl, Bash, PowerShell
Databases - Redshift, Snowflake, Azure SQL, PostgreSQL, MongoDB, MySQL, Oracle, Oracle Exadata
Dashboards and Visualization - Power BI, Tableau, Looker, AWS QuickSight
Data Engineering - AWS Glue, Azure Data Factory, DBT, Apache Kafka, Spark, Hive, GCP Dataflow, Informatica, Airflow
System Administration - Linux/Unix administration, Unix file systems, environment setup, mount types, permissions, standard shell tools, pipes
PROFESSIONAL EXPERIENCE
UnitedHealth Group June 2024 – Present
Senior Data Engineer
Architected Azure Data Factory pipelines and Databricks notebooks landing multi-source feeds into ADLS, enforcing schema validation and accelerating delivery of weekly executive dashboards.
Automated Azure DevOps CI/CD with Terraform and policy checks, enabling repeatable releases for data services and reducing environment drift across development, test, and production.
Engineered SQL transformations with DBT and PostgreSQL staging, implementing CDC reconciliation, audit fields, and robust rollbacks that improved traceability and satisfied compliance reviews.
Standardized metadata, lineage, and access controls within catalog workflows, aligning HIPAA and GDPR requirements while strengthening quarterly entitlement attestations and evidence for sensitive datasets.
Optimized Power BI semantic models and DAX measures over curated marts, improving query performance and enabling analysts to deliver consistent KPI reporting across practices.
Delivered system and architecture improvements by leveraging Linux-based toolsets for data warehousing, reducing data retrieval latency by 30%.
Optimized ETL and database load/extract processes for data warehouses using Perl, increasing data processing efficiency by 25%.
Executed Linux environment setup with a backend focus on Unix file systems, improving system reliability by 40%.
Streamlined job processing on Oracle Exadata and relational databases, increasing data throughput by 35%.
Capital One June 2022 – May 2024
Data Engineer
Orchestrated AWS Glue crawlers and PySpark jobs on EMR, loading transaction datasets to S3 and supporting near-real-time risk analytics for multiple business units daily.
Migrated legacy SSIS workflows to serverless ELT with Lambda, Step Functions, and DBT, simplifying operations and enabling scalable backfills during peak processing windows reliably.
Hardened ingestion services with IAM, KMS encryption, VPC endpoints, and secrets rotation, reducing exposure and passing recurring security assessments for regulated customer information annually.
Integrated Redshift models with SQL data quality gates and automated tests, preventing bad loads and improving trust for finance reconciliations and fraud detection workloads.
Instrumented automated monitoring with CloudWatch dashboards, alarms, log insights, and incident runbooks, shortening triage time and sustaining strict SLA commitments for critical pipelines proactively.
Implemented Linux-based processes with scripts on Unix file systems, reducing system downtime by 20%.
Demonstrated hands-on troubleshooting experience and a willingness to take ownership of problems, improving issue resolution time by 50%.
Utilized Oracle and Perl to automate repetitive tasks, resulting in a 30% reduction in manual workload.
Configured mount types using standard tools, improving data access speed by 15%.
Deloitte October 2019 – March 2021
Data Engineer
Established GCP BigQuery datasets and Dataflow pipelines ingesting HL7 and FHIR messages, standardizing healthcare events and enabling reliable, secure clinical analytics consumption enterprise-wide.
Validated PII handling through tokenization and access policies, meeting HIPAA controls while allowing approved analysts to securely explore de-identified longitudinal patient datasets.
Analyzed ELT performance using BigQuery execution metrics, partitioning, clustering strategies, and query tuning, reducing scan costs and improving dashboard latency for operations leaders monthly.
Refined semantic layers for Power BI and SSRS reports, aligning KPI definitions and improving consistency across revenue cycle, coding, and denial management teams organization-wide.
Streamlined data quality workflows with automated reconciliation, exception queues, and audit trails, increasing confidence in claims submissions and reducing rework across teams.
Integrated ETL tools with orchestration tools, increasing data pipeline efficiency by 40%.
Applied practical systems knowledge to configure environments, improving operational stability by 30%.
Managed permissions and pipes to streamline data flow, achieving a 20% increase in system throughput.
Leveraged Airflow and Informatica to automate workflows, reducing processing time by 50%.
EDUCATION
Master's in Information Technology Management - Lindsey Wilson College
Bachelor's in Electronics and Communication Engineering - Osmania University