Shiva Priya Mudragadda
*****.**.*****@*****.*** +1-314-***-****
SUMMARY
Experienced Data Analyst with 6+ years of expertise in data processing, validation, and workflow optimization. Skilled in applying data engineering principles to improve data quality and streamline processes. Proficient in analyzing large datasets, ensuring accuracy, and delivering actionable insights. Expertise in SQL, Python, Excel, and data visualization tools (Power BI, Tableau) for automated reporting and decision-making. Passionate about enhancing operational efficiency through ETL pipelines, automation, and continuous process improvements.
• Experience in end-to-end data pipeline implementation and support projects across multiple industries
• Possess an excellent understanding of data integration processes from source to consumption
• Expertise in cloud-based data platform development and optimization
• Prepared technical documentation and process flows for data engineering solutions
EDUCATION
Master of Science in Information Systems Management, Saint Louis University, Aug 2023 – May 2025
Bachelor of Commerce in Computers, St. Ann's College for Women
SKILLS
• Programming & Scripting: Python (Pandas, NumPy, Matplotlib), SQL, Shell Scripting
• Databases & Data Warehousing: MySQL, Snowflake, Redshift, BigQuery, Data Lake
• Data Visualization: Tableau, Power BI, Looker, Excel (Pivot Tables, Charts, VBA)
• Data Engineering: ETL Pipelines, Apache Airflow, Data Modeling, Data Cleaning, Azure Data Factory, Databricks
• Big Data Technologies: Spark, Hadoop (basic knowledge), PySpark
• Cloud Platforms: AWS (S3, Athena, Redshift), Azure (Synapse, Data Factory)
• Version Control & Automation: Git, Shell Scripting
• Business Intelligence & Reporting: KPI Tracking, Dashboard Development
• Machine Learning (Basic Knowledge): Regression, Clustering, Forecasting
• Soft Skills: Communication, Critical Thinking, Problem Solving, Time Management, Project Management, Teamwork, Presentation Skills
EXPERIENCE
Infosys Limited, India Jan 2022 – March 2023
Role: Data Engineer
Responsibilities:
• Designed and implemented scalable data pipelines using PySpark and Databricks to process large volumes of railway operational data
• Developed and optimized ETL workflows in Azure Data Factory to integrate data from multiple transportation management systems
• Built a Snowflake data warehousing solution with properly designed schemas to support logistics analytics
• Created automated data quality validation procedures to ensure data integrity across rail operations data
• Implemented incremental data loading patterns to optimize performance for time-sensitive logistics reports
• Developed Python scripts for complex data transformations to normalize transportation data formats
• Configured and maintained Azure Data Lake storage for structured and unstructured logistics data
• Utilized Azure DevOps for CI/CD pipeline implementation to automate code deployment
• Provided technical specifications for custom data solutions to support BNSF's operations
• Collaborated with transportation analysts to translate business requirements into technical solutions
• Created dashboards for monitoring data pipeline performance and railway operations metrics
Excelra Knowledge Solutions Pvt Ltd, Hyderabad May 2019 – January 2022
Project 1: EPSON Data Analytics Platform
Role: Data Engineer
Responsibilities:
• Developed end-to-end ETL processes using PySpark to integrate manufacturing and sales data.
• Designed and implemented Snowflake data warehouse schemas optimized for production and sales analytics.
• Built dimensional models and fact tables to support business intelligence and reporting needs.
• Created data pipelines in Azure Data Factory to process retail point-of-sale data efficiently.
• Applied partitioning and clustering strategies in Snowflake to enhance query performance.
• Developed Python utilities for metadata management and product catalog processing.
• Created stored procedures and Snowflake tasks for automated inventory reconciliation.
• Implemented data validation frameworks to ensure data accuracy and consistency.
• Optimized SQL queries to improve reporting performance for executive dashboards.
• Collaborated with business analysts to create KPI reports for manufacturing insights.
• Documented technical processes and prepared knowledge transfer materials.
Project 2: Laing O'Rourke Construction Analytics
Role: Data Engineer
Responsibilities:
• Designed and built a scalable data lake architecture on Azure to store and process construction project data.
• Implemented Databricks workflows to analyze construction costs and project timelines.
• Developed PySpark jobs to process site management data with robust error handling.
• Created and optimized SQL scripts for project performance analytics and reporting.
• Applied incremental data loading patterns to enable real-time project status updates.
• Configured Snowflake external tables to query construction materials inventory efficiently.
• Orchestrated data pipelines using Azure Data Factory for milestone tracking and reporting.
• Implemented data quality checks to ensure accuracy in safety compliance reports.
• Developed Python scripts for predictive maintenance models on construction equipment.
• Created reusable code libraries for common data engineering tasks to enhance efficiency.
• Collaborated with project managers to develop data models supporting construction workflows.
• Documented data flows and processes between project management and analytics platforms.
K2Infoedge Limited, Chennai
Role: Jr. Data Analyst
Responsibilities:
• Wrote SQL queries to extract and analyze employee and customer data.
• Created Python scripts to automate data collection and basic reporting tasks.
• Assisted in building ETL pipelines to combine customer information from different sources.
• Designed and maintained simple dashboards to track business performance.
• Performed data quality checks to ensure accuracy and consistency.
• Automated regular reports using Python and SQL for internal reviews.
• Conducted basic data analysis to identify trends and patterns.
• Supported the marketing team by providing data for campaign analysis.
• Helped document data processes and maintained data records.
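The data quality checks and automated reporting mentioned throughout can be sketched in miniature as follows. This is an illustrative example only, not code from any listed project; the dataset and column names ("customer_id", "amount") are hypothetical:

```python
import pandas as pd

def run_quality_checks(df: pd.DataFrame) -> dict:
    """Run basic data-quality checks and return a pass/fail summary.

    Column names here are hypothetical examples, not drawn from
    any real project.
    """
    return {
        # Primary-key column must contain no duplicates
        "no_duplicate_ids": bool(df["customer_id"].is_unique),
        # Required field must contain no missing values
        "no_null_ids": bool(df["customer_id"].notna().all()),
        # Monetary amounts must be non-negative
        "amounts_non_negative": bool((df["amount"] >= 0).all()),
    }

# Example usage with a small in-memory dataset
data = pd.DataFrame({
    "customer_id": [1, 2, 3],
    "amount": [100.0, 250.5, 0.0],
})
summary = run_quality_checks(data)
```

In practice, a summary like this would feed a scheduled report or gate a downstream pipeline load.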