Data Engineer Analysis

Location:
Denton, TX
Posted:
September 10, 2025

Resume:

Akhila Molunguri

Email: ******.*********@*****.***

Mobile: +1-940-***-****

Data Engineer

PROFESSIONAL SUMMARY:

Data Engineer with 4+ years of experience delivering end-to-end solutions across diverse business applications, combining analytical thinking and innovative problem-solving as a collaborative team player.

Strong hands-on experience in data analysis with SQL and PL/SQL, writing and analyzing complex queries and stored procedures with close attention to detail.

Proficient at tracing data flows across applications to understand the end-to-end view; experienced in Agile/Scrum teams, prioritizing work and making effective resource assignments.

Experienced with cloud platforms and scalable architectures; strong communication and presentation skills for conveying technical detail to both technical and non-technical audiences.

Automated KPI dashboards, enabling self-service reporting and executive summaries of performance trends; proficient with the Microsoft Office suite for data analysis.

Integrated distributed data sources into analytics platforms, enabling cross-system harmonization; able to communicate across the organization at both detailed and executive levels.

Implemented anomaly detection frameworks within pipelines to flag data quality issues and support regulatory compliance; willing to ask questions and seek assistance as needed.

Designed optimized data models for faster insights and query performance; experienced in effort and financial estimation for projects.

Led cost-saving initiatives, reducing reporting latency through automation with SQL procedures and scripts; able to prioritize and manage multiple projects simultaneously with minimal supervision.

Engineered and deployed containerized data ingestion services, improving scalability and deployment speed while working effectively within a team.

Delivered cross-functional KPI frameworks in collaboration with analytics and product teams, aligning metrics with strategic goals.

Supported integration of hybrid cloud workloads, optimizing data access latency and backup recovery times.

Tuned queries, materialized views, and schema structures for high-volume datasets, reducing dashboard refresh times and ensuring fast ad hoc query performance for analysts.

Applied clustering and regression modeling techniques to customer and product data, supporting churn prediction, segmentation, and targeted marketing attribution use cases.

Enabled real-time analytics through streaming integrations, providing near-instant access to streaming data for operational dashboards and alerting workflows.

Established RBAC, audit logs, and lifecycle policies to support enterprise-grade security and compliance for sensitive data; experienced with Oracle Exadata and Oracle 10g and later.

Built reusable data marts and semantic layers, supporting faster development of domain-specific dashboards and reducing time-to-insight for business teams, with minimal supervision.

Authored detailed documentation for pipeline logic, data lineage, metrics definitions, and dashboard usage guides to streamline onboarding and empower self-service analytics across departments.

TECHNICAL SKILLS:

Programming Languages - Python, SQL, T-SQL, PySpark, PL/SQL

Cloud Platforms - AWS (EKS, S3), Google Cloud Platform (BigQuery, Cloud Storage), Azure (familiar)

Data Warehousing & Databases - Snowflake, SQL Server, MySQL, Azure SQL DB, Oracle Exadata, Oracle 10g

ETL Tools - Airflow, Snowflake ETL, Python-based scripting

Data Visualization - Power BI, Tableau

Big Data & Frameworks - Spark (PySpark), Kubernetes, Rancher

Monitoring & DevOps - Splunk, NetApp, Sitecore, WSO2, Git

Operating Systems - Windows, Linux

Others - Microsoft Office suite

PROFESSIONAL EXPERIENCE:

National Marrow Donor Program (NMDP) Jan 2024 – Present

Data Engineer

Responsibilities:

Designed and deployed scalable ETL pipelines in Python and Snowflake, automating data ingestion from multiple health registry sources to streamline operations and ensure timely, accurate analytics delivery.

Built Power BI dashboards connected to AWS-hosted systems, enabling near real-time monitoring of registry operations across hybrid cloud and on-premises environments.

Engineered and deployed containerized workloads using Rancher, Kubernetes, and AWS EKS, optimizing cluster performance, resource allocation, and deployment scalability across hybrid cloud setups.

Developed Python-based anomaly detection logic and validation scripts embedded within pipelines, reducing downstream data defects and increasing trust in analytics across clinical teams.

Led Snowflake schema optimization initiatives, introducing clustering, indexing, and dimensional modeling practices to improve performance of complex analytical queries.

Automated monthly and quarterly reporting processes using Python, SQL, and Airflow, reducing manual work by 40% and enabling consistent, on-time data delivery to key stakeholders.

Implemented data governance controls using RBAC in Snowflake and lifecycle policies in AWS S3, enforcing HIPAA compliance and improving access auditability for sensitive patient data.

Created metadata dictionaries, transformation logic maps, and lineage diagrams for all major pipelines, improving traceability, governance, and cross-functional collaboration.

Integrated Sitecore, SUSE, and WSO2 platform data into Snowflake via Python pipelines, creating unified datasets for cross-platform analytics and performance optimization.

Supported cloud migration from legacy platforms to AWS and Snowflake by validating data pipelines, running regression tests, and coordinating deployment readiness with stakeholders.

Best Buy Nov 2022 – Dec 2023

Data Engineer

Responsibilities:

Designed robust ETL pipelines using SQL and Airflow to ingest and transform sales and inventory data, enabling timely forecasting and improved decision-making for merchandising and supply chain teams.

Developed and deployed Power BI dashboards with interactive visuals and drilldowns, providing real-time visibility into customer behavior, sales trends, and product performance across all sales channels.

Engineered Snowflake transformation layers and dynamic reporting views, aligning business logic with evolving KPIs to support standardized metrics and consistent reporting across departments.

Automated ingestion from point-of-sale systems and vendor data sources using Python scripts, reducing batch pipeline run times by 40% and improving processing throughput.

Created high-performance stored procedures and parameterized queries in Snowflake, supporting real-time reporting and inventory reconciliation workflows with reduced execution latency.

Integrated customer demographics, loyalty activity, and online behavior into Snowflake, enabling enriched segmentation and predictive modeling for marketing and retention strategies.

Designed Power BI dashboards with SKU-level filters, DAX measures, and conditional logic, helping inventory planners identify overstock, slow movers, and seasonal performance patterns.

Built churn prediction models using features engineered from transactional data, returns, and product affinity, supporting proactive customer retention and targeting strategies.

Standardized KPI layers in Snowflake using reusable UDFs and views, accelerating dashboard development and aligning metrics across analytics teams for unified insights.

Established automated data lineage and audit tracking scripts in Snowflake, improving governance, data traceability, and reporting compliance across the analytics platform.

Tata Consultancy Services (TCS) Oct 2020 – Aug 2022

Data Analyst

Responsibilities:

Designed and maintained enterprise-grade reporting models in Power BI and SQL Server, supporting finance, HR, and logistics KPIs for global operations and business stakeholders.

Developed Python- and SQL-based ingestion workflows for ERP system data, embedding error handling, data validation, and alert mechanisms to support reliable daily data refreshes.

Migrated legacy Excel-based processes to Snowflake and Power BI, improving report scalability, access control, and versioning across multiple business units.

Created optimized SQL procedures for financial reconciliation, reducing manual review effort and improving accuracy in billing and transaction alignment processes.

Defined dimensional models and star schemas in collaboration with data architects, enabling faster Power BI dashboard queries and consistent reporting logic across teams.

Tuned queries using indexing, partitioning, and materialized views, enhancing performance of high-volume reporting dashboards in production environments.

Ensured governance by building audit-ready ETL scripts with logging and maintaining clear documentation of data flows and transformation logic for technical and non-technical audiences.

Identified and resolved data discrepancies through root cause analysis, profiling upstream sources and implementing fixes within ingestion and transformation pipelines.

Automated high-frequency ad hoc reports using Python scripts and parameterized SQL templates, accelerating report generation and reducing turnaround time by 50%.

Authored business glossaries and metadata dictionaries, enabling a consistent understanding of KPIs, metrics, and data sources across cross-functional teams.

Certifications:

Microsoft Azure Data Engineer Associate

SQL Advanced SnowPro (Core)

Google Cloud Digital Leader (Planned Q4 2025 – Optional Add-on)

Educational Details:

Master of Science in Information Systems and Technology - University of North Texas


