
Data Engineer Senior

Location:
Austin, TX
Salary:
open
Posted:
September 11, 2025


Resume:

Manikanth Padigela

Data Engineer

*********.************@*****.*** 913-***-**** www.linkedin.com/in/manikanth-pad/

PROFESSIONAL SUMMARY

Senior Data Engineer with 4+ years of experience in Azure Cloud Services, Big Data, and Data Warehousing. Skilled in building scalable data pipelines using ADF, Databricks, Spark, and Snowflake, with expertise in ETL/ELT, Delta Lake, and Synapse Analytics. Proficient in real-time streaming (Kafka, NiFi, Spark Streaming) and CI/CD automation via Azure DevOps and GitHub. Strong hands-on skills in Python, PySpark, SQL, and Scala, with proven success in data modeling, governance, and optimization. Experienced in delivering business insights through Power BI and recognized for collaboration, problem-solving, and enterprise-grade cloud data solutions.

TECHNICAL SKILLS

Azure Services

Azure Data Factory, Azure Databricks, Logic Apps, Azure Functions, Azure DevOps, Azure Synapse, Azure Data Lake Storage Gen2, Azure Migrate, Stream Analytics, Data Lake, Key Vault, Event Grid, Alteryx, Azure Monitor, Azure Purview, Apache Atlas

Big Data Technologies

MapReduce, Hive, Tez, PySpark, Scala, Kafka, Spark Streaming, Oozie, Sqoop, ZooKeeper, Snowflake, Data Lake, HBase, Data Pipelines, Matplotlib, AWS Lambda, AWS CloudWatch, TensorFlow, Data Warehouse

Languages

SQL, PL/SQL, Python, HiveQL, Scala, Java, Pandas, R, Bash

Hadoop Distributions

Cloudera, Hortonworks, MapR

Operating Systems

Windows (XP/7/8/10/11), UNIX, Linux, Ubuntu, CentOS, macOS

Web Technologies

HTML, CSS, JavaScript, XML, JSP, RESTful, SOAP

Version Control

Git, GitHub

Build Automation Tools

Ant, Maven

Databases

MS SQL Server 2016/2014/2012, Azure SQL DB, MS Excel, MS Access, Oracle 11g/12c, Cosmos DB, PostgreSQL, MySQL, GCP Buckets, SharePoint

IDEs & Notebooks

Eclipse, Visual Studio, PyCharm, Jupyter, Databricks, Google Colab

Visualization

Seaborn, Box Plots, Heatmaps, Correlation Matrix

WORK EXPERIENCE

Client: Tenet Healthcare, TX Jan 2024 – Present

Azure Data Engineer

Responsibilities:

Architected and optimized Azure Snowflake data warehouse solutions, integrating SQL, API, and file-based data through Azure Data Factory (ADF).

Built scalable ETL/ELT pipelines in ADF and Synapse with validation, cleansing, and transformation, ensuring high-quality data for analytics.

Developed and tuned Spark/Databricks jobs for batch and ML workloads, improving processing speed by 30%.

Designed real-time ingestion pipelines using Event Hubs, Azure Functions, Kafka, and Spark Streaming to handle high-volume data.

Implemented data governance and lineage with Azure Purview and Apache Atlas, strengthening compliance and audit readiness.

Automated reporting and dashboards with Power BI, Tableau, and Alteryx, reducing manual reporting by 40%.

Applied data security and archival strategies (Blob Storage, Time Travel) and optimized query performance with monitoring via Azure Monitor & Snowflake QPM.

Established CI/CD pipelines in Azure DevOps (TDD, versioning, automated deployments), improving release reliability.

Collaborated with DevOps engineers and stakeholders, delivering enterprise-grade cloud data engineering solutions and driving knowledge-sharing initiatives.

Client: Jefferies Financial Group, New York June 2022 – June 2023

Data Engineer

Responsibilities:

Designed and deployed end-to-end data pipelines using Azure Data Factory, Databricks, and Snowflake, ensuring scalability and performance for financial data.

Developed ETL/ELT workflows to process structured and unstructured data from diverse sources (SQL, APIs, files) into Snowflake and Synapse.

Built real-time data streaming pipelines with Kafka, Spark Streaming, and Azure Event Hubs, enabling near real-time risk and compliance analytics.

Optimized Spark/Databricks jobs and SQL queries, improving query performance by 25%+ across critical reporting workloads.

Implemented data governance and lineage with Azure Purview and Apache Atlas to ensure compliance with financial regulations.

Automated reporting pipelines with Power BI and Tableau, reducing manual effort and accelerating time-to-insight for business stakeholders.

Applied data security, role-based access controls, and archival strategies to meet compliance and audit requirements.

Established CI/CD pipelines in Azure DevOps and GitHub for automated testing, versioning, and deployments.

Collaborated with cross-functional teams (Data Science, Compliance, Risk) to deliver trusted and actionable insights for trading and investment decisions.

Client: Liberty Mutual Insurance, India Feb 2021 – May 2022

Data Engineer

Responsibilities:

Built and maintained data pipelines using Sqoop, Flume, Kafka, Spark, and Hive to process structured, semi-structured, and unstructured data at scale.

Migrated large workloads from Oracle RDBMS to Hadoop and AWS Redshift, improving scalability and reducing processing costs.

Designed and developed ETL workflows with AWS Glue, Lambda, and PySpark to ingest and transform data into Redshift, enabling near real-time analytics with Kinesis, SNS, and SQS.

Optimized Hive queries, SparkSQL, and Redshift schemas (partitioning, caching, materialized views), reducing query times by 25–30%.

Automated legacy ETL pipelines, cutting maintenance overhead by 35% and improving system stability.

Delivered business insights by integrating Redshift with Power BI and enhancing dashboard performance.

Implemented data governance and lineage with AWS Glue Data Catalog and Apache Atlas for compliance and metadata tracking.

Spearheaded CI/CD improvements using AWS CodePipeline and Jenkins, accelerating deployments and reducing release errors.

Configured AWS monitoring (CloudWatch, Route53) for pipeline health and system stability, ensuring high availability.

Partnered with business teams and data scientists to deliver end-to-end AWS-based data solutions supporting strategic decision-making.

EDUCATIONAL QUALIFICATION

Master's in Computer Science, University of Central Missouri, Warrensburg, MO.

Bachelor's in Computer Science, National Institute of Technology, India

CERTIFICATIONS

Azure Data Engineer Associate – Microsoft (LinkedIn)

Microsoft Certified: Azure Virtual Desktop Specialty (LinkedIn)

Azure Network Engineer Associate – Microsoft (LinkedIn)

Joy of Computing in Python – NPTEL (LinkedIn)

Fundamentals of Deep Learning – NVIDIA (LinkedIn)
