Madhava Mule – Data Engineer
Email: ***********.***@*****.*** | Phone: +1-210-***-**** | LinkedIn
SUMMARY OF EXPERIENCE
• Around 5 years of experience as a Data Engineer supporting large-scale data platforms across banking, technology, and financial services.
• Strong expertise in SQL, Python, Azure, Snowflake, ETL/ELT pipelines, Power BI, Databricks, and cloud data engineering.
• Hands-on experience in building scalable data pipelines, data modeling, performance tuning, and cloud-based analytics solutions.
• Proficient in Azure Data Factory, ADLS, Blob Storage, Snowflake migrations, and automation frameworks.
• Experienced in Power BI dashboard development, DAX, forecasting models, and integrating Python into analytics workflows.
• Skilled in Hadoop, Hive, Spark, and Kafka for big-data processing and distributed systems.
• Strong collaboration abilities working with SMEs, analysts, program managers, and engineering teams.
TECHNICAL SKILLS
• Programming & Scripting: SQL (Advanced), PL/SQL, Python, Shell Scripting, SnowSQL
• Databases & Warehousing: Snowflake, MS SQL Server, Oracle, Teradata, PostgreSQL, Star/Snowflake Schema, SCD Types
• Cloud & Big Data: Azure (ADF, ADLS, Blob Storage, Databricks, Data Lake, Delta Lake), AWS S3, Azure Synapse, Spark, Hadoop, Hive, Kafka
• ETL & Data Integration: Azure Data Factory, Databricks, REST API Integration, Informatica (PowerCenter/IDQ), SSIS
• BI & Reporting: Power BI (DAX, Modeling), Tableau, SSRS, Excel (Power Query, PivotTables)
• Automation & CI/CD: Azure DevOps, Git, Jenkins (exposure), Docker (containerized data workflows), Kubernetes (working knowledge)
• Tools & Platforms: JIRA, ServiceNow, TOAD, Teradata SQL Assistant, JSON, XML, Windows, Linux, Unix
PROFESSIONAL EXPERIENCE
Data Engineer | CloudPath Technologies LLC | Irving, TX | February 2025 – Present
• Designed, developed, and deployed enterprise-grade Power BI reports by integrating datasets from ADLS, Tabular Models, and SQL Server, ensuring optimized performance and secure access.
• Built advanced interactive dashboards with slicers, filters, KPIs, drill-throughs, and custom DAX measures to support enterprise analytics.
• Led Snowflake regional data migration between accounts, ensuring schema replication, data movement, validation, and reconciliation.
• Developed complex SQL queries for extraction, transformation, cleansing, validation, and automated quality checks.
• Built ETL pipelines using Python and SAS to ingest structured and semi-structured data from transactional systems.
• Created Snowflake monitoring dashboards tracking warehouse usage, query performance, and resource consumption.
• Developed Databricks notebooks for transformation and validation, and parameterized ADF pipelines for reusable ETL workflows.
• Configured On-Premises Power BI Gateway enabling hybrid scheduled refreshes and secure system connectivity.
• Managed ADF environments including linked services, triggers, staging, archival, and Blob lifecycle automation.
• Performed SQL-based deep-dive analysis identifying trends, anomalies, and business insights.
Data Engineer | Northern Arizona University | Flagstaff, AZ | October 2023 – December 2024
• Engineered scalable ingestion and transformation pipelines using Azure Data Factory for ADLS and Azure Data Catalog.
• Performed comprehensive ETL testing including schema validation, row-count checks, referential checks, and load accuracy verification.
• Analyzed investment and trading datasets using MS SQL and Power BI, validating metrics and delivering insights for academic research.
• Integrated Python forecasting models into Power BI dashboards for automated time-series predictions.
• Optimized Teradata SQL through index tuning, join optimization, workload management, and unit test coverage.
• Worked closely with SMEs to define business logic, KPIs, and required metrics for reporting models.
• Led JAD sessions, documented detailed requirements, and converted them into actionable user stories and epics.
• Developed Python-based custom parsers for semi-structured data transformation and workflow automation.
• Enhanced Azure cloud applications supporting enterprise-wide operational and analytical systems.
• Developed PL/SQL triggers, stored procedures, and functions supporting core backend processing.
Software Developer | CSJ Technologies | Hyderabad, India | January 2021 – August 2023
• Developed backend automation scripts to extract data from REST APIs and load into PostgreSQL staging layers.
• Performed detailed data profiling, detecting inconsistencies, duplicates, missing values, and applying validation checks.
• Developed exception-handling logic including retry workflows, logs, and notification triggers.
• Supported Change Data Capture (CDC) processes ensuring synchronized updates across systems.
• Developed SQL and PL/SQL scripts implementing data transformation rules and backend logic.
• Implemented Slowly Changing Dimension (SCD) methods to track historical updates across systems.
• Performed unit testing for backend workflows, validating correctness before deployment.
EDUCATION
Master of Science — Northern Arizona University
Bachelor of Technology — Jawaharlal Nehru Technological University (JNTU-K)