
Senior Data Analyst with 6+ Years Experience

Location:
Memphis, TN
Salary:
$85,000/year
Posted:
January 29, 2026


Resume:

VENKATA LAKSHMI MADHURI YARLAGADDA

Sr. Data Analyst

+1-901-***-****

******************@*****.***

https://www.linkedin.com/in/lakshmi-yarlagadda-920a42349/

PROFESSIONAL SUMMARY

•6+ years of experience as a Data Analyst / Data Modeler delivering end-to-end analytics and reporting solutions across healthcare, retail, manufacturing, and IT domains, with a strong focus on data-driven decision support.

•Extensive experience collecting, cleaning, validating, and organizing large datasets from multiple sources, including healthcare EHR, claims, scheduling, and operational systems, ensuring high data quality, accuracy, and consistency.

•Proven ability to identify trends, patterns, correlations, and anomalies using SQL, statistical techniques, and analytical frameworks to generate actionable insights for clinical, operational, and business stakeholders.

•Strong hands-on experience developing interactive dashboards and reports using Tableau and Power BI, supporting clinical operations, patient throughput, staffing efficiency, revenue cycle, and executive decision-making.

•Advanced proficiency in Microsoft Excel (Pivot Tables, Power Query, advanced formulas, automation) for ad hoc analysis, reconciliation, and business reporting.

•Designed and delivered clear, concise dashboards and KPI-driven reports translating complex healthcare and enterprise data into insights for both technical and non-technical audiences.

•Hands-on experience querying and managing relational databases using SQL, building reusable queries, views, and curated datasets to enable self-service analytics.

•Applied Python (Pandas, NumPy) for data cleansing, exploratory analysis, automation, and basic statistical modeling to support healthcare and enterprise analytics use cases.

•Performed data validation, reconciliation, and integrity checks across patient encounters, claims, provider data, and operational metrics to ensure trusted and reliable reporting.

•Supported healthcare quality and regulatory reporting, including preparation of datasets for HEDIS, CMS, STARS, and internal clinical performance tracking.

•Collaborated closely with clinicians, operational leaders, revenue cycle teams, and business users to gather requirements, define KPIs, and align analytics outputs with real-world workflows.

•Assisted cross-functional teams in data-driven decision-making by presenting insights through dashboards, reports, and stakeholder presentations.

•Experience supporting analytics initiatives, dashboards, and reporting projects in both production and project-based environments, including healthcare-focused engagements.

•Strong analytical, critical thinking, and problem-solving skills with high attention to detail and the ability to manage multiple priorities effectively.

•Excellent written and verbal communication skills, enabling effective data storytelling and stakeholder engagement at multiple organizational levels.

•Familiarity with statistical concepts and exposure to tools such as Python and SAS for descriptive and exploratory data analysis.

•Committed to HIPAA-compliant data handling and maintaining confidentiality of PHI while working with sensitive healthcare data.

•Actively stay current with industry trends, best practices, and emerging analytics tools to continuously improve data processes and reporting outcomes.

TECHNICAL SKILLS

Big Data Technologies

Hadoop, Spark, and Kafka for large-scale data processing.

Programming Languages

Python, PL/SQL, SQL, Scala, C, C++, T-SQL, PowerShell scripting.

Data Warehousing

Data Modeling, ETL Processes, Data Lakes, and Dimensional Modeling

Cloud Services

Amazon S3, AWS EMR, AWS Lambda, GCP BigQuery, Dataflow, Pub/Sub, Cloud Storage, Azure Data Lake Storage Gen2, Azure Data Factory, Azure Synapse Analytics, Databricks, Azure Event Hubs, Azure SQL Database.

Databases

MySQL, SQL Server, Oracle, and Snowflake

NoSQL Databases

MongoDB, Cassandra, HBase

Visualization & ETL tools

Tableau, Informatica, Talend, SSIS, Power BI, and SSRS

Version Control & CI/CD tools

Jenkins, Git, and SVN

Operating Systems

Unix, Linux, Windows, macOS

PROFESSIONAL EXPERIENCE

Client: Costco, Remote Jan 2025 – Present

Role: Data Modeler/Analyst

Roles & Responsibilities

•Collected, cleaned, validated, and organized large datasets from HR, payroll, merchandising, inventory, and store operations systems to prepare high-quality data for analysis and reporting.

•Collaborated closely with business stakeholders and user groups to understand data requirements, define KPIs, and translate business needs into scalable analytics and reporting solutions.

•Designed and deployed DDLs from ERWIN physical data models to Azure SQL and on-premise SQL Server databases, ensuring accurate, consistent, and reliable retail data structures.

•Identified trends, patterns, correlations, and anomalies in labor, inventory, sales, and operational data using SQL, Excel, and analytical techniques.

•Developed enterprise-grade dashboards and visualizations using Tableau to track labor costs, staffing levels, turnover trends, workforce KPIs, and operational performance metrics.

•Built interactive dashboards and reports for inventory accuracy, SKU performance, vendor compliance, purchase order lifecycle, and supply chain efficiency to support data-driven decision-making.

•Created store-level dashboards highlighting sales trends, foot traffic, scheduling efficiency, and workforce productivity, enabling leadership to make informed operational decisions.

•Consolidated and integrated data from Workday, Kronos, SAP, Oracle Retail, POS, and other source systems into centralized datasets for unified analytics and reporting.

•Automated recurring Excel- and Tableau-based reports for daily store reviews and weekly leadership meetings, improving reporting efficiency, timeliness, and consistency.

•Ensured data accuracy and consistency by performing validation checks, reconciliations, and quality audits across multiple data sources and reporting layers.

•Designed analytical data models and structured data marts in Snowflake and SQL Server to support labor forecasting, payroll insights, headcount planning, and operational KPI tracking.

•Developed reusable SQL queries, views, and curated datasets to enable self-service analytics for analysts and business users (a minimal illustration follows this section).

•Conducted detailed statistical and trend analyses on labor utilization, overtime, staffing efficiency, SKU profitability, and vendor performance to identify cost-saving and optimization opportunities.

•Leveraged Microsoft Excel extensively (Pivot Tables, formulas, Power Query) and Python for ad-hoc analysis, data cleansing, automation, and exploratory analysis.

•Evaluated operational efficiency across store processes such as stocking, scheduling, and replenishment, providing actionable, data-backed recommendations to stakeholders.

•Presented clear and concise insights to both technical and non-technical audiences through dashboards, reports, and stakeholder presentations.

•Documented data lineage, metric definitions, transformation logic, and dashboard specifications to ensure governance, maintainability, and compliance.

•Stayed current with data analytics best practices, visualization standards, and emerging tools to continuously improve reporting quality and analytical outcomes.

Environment: Snowflake, SQL, Tableau, Power BI, DAX, SQL Server, Python, ETL Pipelines, Data Modeling (Fact/Dimension), Retail Operations KPIs, Forecasting & Variance Analysis, Agile
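A minimal sketch of the reusable-query pattern described above, pulling a curated labor-KPI dataset from SQL Server into pandas. The connection string, table, and column names are hypothetical placeholders, not an actual Costco schema:

```python
# Minimal sketch: curating a labor-KPI dataset from SQL Server into pandas.
# All connection details, table names, and column names are hypothetical.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine(
    "mssql+pyodbc://user:password@server/retail_dw"
    "?driver=ODBC+Driver+17+for+SQL+Server"
)

LABOR_KPI_QUERY = """
SELECT store_id,
       week_start,
       SUM(scheduled_hours) AS scheduled_hours,
       SUM(worked_hours)    AS worked_hours,
       SUM(overtime_hours)  AS overtime_hours
FROM   labor_fact
GROUP  BY store_id, week_start
"""

labor_kpis = pd.read_sql(LABOR_KPI_QUERY, engine)

# Derive a utilization KPI and flag weeks where overtime exceeds 10% of
# worked hours (an illustrative threshold).
labor_kpis["utilization"] = labor_kpis["worked_hours"] / labor_kpis["scheduled_hours"]
labor_kpis["overtime_ratio"] = labor_kpis["overtime_hours"] / labor_kpis["worked_hours"]
flagged = labor_kpis[labor_kpis["overtime_ratio"] > 0.10]
```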

Client: Optum Labs May 2023 – Aug 2024

Role: Data Analyst (Intern)

Responsibilities:

•Assisted in building data pipelines using Tableau Prep, SQL, and Python to clean, transform, and prepare healthcare datasets for analysis and reporting.

•Supported development of dashboards and reports for clinical operations, patient throughput, staffing efficiency, and revenue cycle monitoring.

•Created and updated Tableau visualizations to track healthcare KPIs such as Length of Stay (LOS), readmissions, ED utilization, and appointment volumes under guidance from senior analysts.

•Helped maintain standard reporting dashboards for hospital leadership, including daily census, bed availability, and unit-level performance trends.

•Assisted in preparing provider productivity, scheduling capacity, and patient access reports to support operational decision-making.

•Supported the creation of drill-down dashboards, enabling users to view performance by facility, service line, and provider.

•Helped structure and validate analytical datasets related to patient encounters, claims activity, provider rosters, and care team assignments.

•Assisted in integrating data from EHR, claims, lab, and scheduling systems, working with structured and semi-structured healthcare data.

•Supported data modeling and historical trend analysis for patient demographics, payer coverage, and provider credentials.

•Assisted in preparing datasets for HEDIS, STARS, CMS, and other regulatory or quality reporting requirements.

•Wrote and maintained SQL queries to analyze encounter histories, visit volumes, and basic reimbursement patterns.

•Performed data validation and integrity checks on patient identifiers, encounter data, and provider information to ensure reporting accuracy (a minimal sketch follows this section).

•Assisted in identifying data discrepancies and worked with ETL teams to resolve missing, duplicate, or incorrectly transformed records.

•Supported analysis of claims and billing data to help identify denial trends, coding gaps, and revenue-impacting issues.

•Helped analyze scheduling and operational data to identify appointment gaps, no-show trends, and workflow inefficiencies.

•Provided analytical support to nursing and operational teams by preparing staffing, workload, and utilization reports.

•Assisted in validating data extracts from Epic/Clarity, ensuring consistent mapping of encounters, labs, procedures, and orders.

•Supported quality checks on diagnosis and procedure codes to ensure alignment with ICD/CPT/HCPCS standards.

•Documented basic data definitions, KPI logic, and report assumptions under supervision.

•Collaborated with analysts, clinicians, and operational stakeholders to gather requirements and respond to ad hoc reporting requests.

•Participated in team meetings and cross-functional workgroups to understand reporting standards and healthcare analytics workflows.

•Followed HIPAA-compliant data handling practices, always maintaining strict confidentiality of PHI.

Environment: SQL, Snowflake, Azure SQL DB, Tableau, Power BI, Python (Pandas, NumPy, Matplotlib, Seaborn), Tableau Prep, Azure Data Factory, Excel (Advanced), SSIS, Databricks, PySpark, PostgreSQL, Google BigQuery, Looker, Alteryx, Jupyter Notebook, Power Automate, Azure Synapse, AWS Redshift, Git, Jira, Confluence, Agile.
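A minimal sketch of the kind of encounter-level validation described above, in pandas. File and column names are hypothetical placeholders; no real PHI is involved:

```python
# Minimal sketch: integrity checks on patient encounter data in pandas.
# File names and columns (encounter_id, patient_id, provider_id) are
# hypothetical placeholders.
import pandas as pd

encounters = pd.read_csv("encounters.csv")
providers = pd.read_csv("provider_roster.csv")

issues = {
    # Encounters missing a patient identifier.
    "null_patient_id": int(encounters["patient_id"].isna().sum()),
    # Duplicate encounter keys, which would double-count visits.
    "duplicate_encounters": int(encounters["encounter_id"].duplicated().sum()),
    # Encounters referencing providers absent from the roster.
    "unknown_providers": int(
        (~encounters["provider_id"].isin(providers["provider_id"])).sum()
    ),
}

for check, count in issues.items():
    print(f"{check}: {count}")
```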

Client: NTT DATA Services, Hyderabad, India May 2020 - Dec 2022

Role: Data Analyst

Roles & Responsibilities

•Collected, cleaned, processed, and validated large volumes of production, operational, and IoT sensor data to prepare high-quality datasets for analysis and reporting.

•Collaborated with business stakeholders, plant operations, and engineering teams to understand data requirements and translate them into effective analytical and visualization solutions.

•Identified trends, patterns, correlations, and anomalies across machine performance, production throughput, quality metrics, and supply chain data using analytical and statistical techniques.

•Developed and maintained interactive dashboards and visualizations using Tableau Creator to present KPIs, trends, geospatial data, and operational insights for data-driven decision-making.

•Designed dashboards to monitor Overall Equipment Effectiveness (OEE), production downtime, cycle times, scrap rates, defect frequencies, yield percentages, and First Pass Yield (FPY).

•Integrated and consolidated data from ERP (SAP, Oracle), MES systems, SQL Server, AWS Redshift, and flat files into unified reporting layers for consistent analytics.

•Utilized Tableau Prep to clean, normalize, and standardize plant- and supplier-level data, ensuring accuracy, consistency, and comparability across locations.

•Conducted root-cause analysis of production and quality issues using Tableau drill-downs, historical trend analysis, control charts, and operational metrics.

•Built real-time and near-real-time dashboards to monitor equipment performance, production flows, and sensor feeds, enabling proactive operational monitoring.

•Performed anomaly detection on IoT sensor data and operator logs to identify deviations, data quality issues, and early warning indicators of equipment failures (see the sketch after this section).

•Analyzed seasonal demand patterns and production forecasts to support capacity planning, scheduling optimization, and throughput improvements.

•Developed predictive maintenance dashboards using historical equipment failure data and time-series analysis to reduce downtime and improve machine efficiency.

•Visualized supply chain bottlenecks, lead times, shipment delays, inventory turnover, and supplier performance metrics to support procurement and logistics decisions.

•Leveraged Microsoft Excel for ad-hoc analysis, data validation, and reporting, and used SQL extensively for querying, joining, and managing analytical datasets.

•Ensured data accuracy and reliability through reconciliation checks, validation rules, and continuous monitoring of data quality.

•Presented clear, concise insights to both technical and non-technical stakeholders through dashboards, reports, and executive-level presentations.

•Trained plant supervisors, line managers, and business users on dashboards, KPIs, and self-service analytics to drive adoption and effective decision-making.

•Defined and standardized KPIs, metric definitions, and data nomenclature across plants to eliminate inconsistencies and improve reporting clarity.

•Documented data lineage, transformation logic, dashboard specifications, and metric definitions to support governance, audit readiness, and compliance (ISO, OSHA, environmental standards).

•Stayed current with analytics best practices, visualization standards, and emerging data tools to continuously enhance reporting quality and analytical outcomes.

Environment: SQL, Python, Tableau, Power BI, AWS S3, AWS Data Lake, Amazon Redshift, Teradata, SAP BW/BI, JIRA, Excel, Data Quality & Reconciliation, Root-Cause Analysis, Agile
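A minimal sketch of a rolling z-score approach to the sensor-anomaly work referenced above; the sensor column, window size, and 3-sigma threshold are illustrative assumptions:

```python
# Minimal sketch: flagging anomalous IoT sensor readings with a rolling
# z-score. The "vibration" column, 60-reading window, and 3-sigma cutoff
# are assumptions for illustration.
import pandas as pd

readings = pd.read_csv("sensor_readings.csv", parse_dates=["timestamp"])
readings = readings.sort_values("timestamp").set_index("timestamp")

window = 60
rolling_mean = readings["vibration"].rolling(window).mean()
rolling_std = readings["vibration"].rolling(window).std()

# A reading is anomalous if it sits more than 3 standard deviations from
# the local rolling mean.
readings["zscore"] = (readings["vibration"] - rolling_mean) / rolling_std
anomalies = readings[readings["zscore"].abs() > 3]

print(f"{len(anomalies)} readings flagged for review")
```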

Client: Global Edge Software, India Apr 2019 – Mar 2020

Role: Python Developer

Responsibilities:

•Developed robust, scalable, and secure web applications using Python and the Django framework, following MVC architecture and best development practices.

•Designed and implemented RESTful APIs to enable seamless communication between frontend and backend systems, ensuring high performance and reliability.

•Enhanced user experience by applying JavaScript and jQuery for dynamic content, interactive forms, and client-side validations.

•Created responsive and accessible web pages using HTML5 and CSS3, adhering to web development standards and best practices.

•Implemented SEO strategies within web applications to improve search engine rankings and increase site visibility.

•Developed SQL queries, stored procedures, and database scripts to retrieve, update, and manage application data efficiently.

•Designed and implemented Django Forms for structured data input, validation, and error handling, improving data accuracy and user experience (a minimal sketch follows this section).

•Conducted data analysis using Python (Pandas, NumPy) to extract insights from application and operational data, supporting business decision-making.

•Maintained and optimized databases to ensure data integrity, high availability, and efficient query performance under high-load conditions.

•Participated in web debugging and troubleshooting using Firebug, browser developer tools, and logs to resolve performance and functionality issues.

•Conducted unit testing to validate the functionality of individual components, ensuring code reliability and adherence to requirements.

•Performed integration testing to verify seamless interaction between different modules and services within the application.

•Collaborated with cross-functional teams, including designers, QA, and product owners, to identify issues, gather feedback, and implement improvements.

•Optimized application performance through code refactoring, query optimization, and caching strategies to reduce page load times and enhance user experience.

•Implemented authentication and authorization mechanisms using Django’s built-in tools to manage user roles and secure sensitive data.

•Documented application architecture, database schemas, API endpoints, and workflows to support maintainability, knowledge sharing, and future development.
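A minimal sketch of the Django Forms validation pattern described above; the form fields and view logic are hypothetical, not the application's actual code:

```python
# Minimal sketch: a Django form with field-level validation and a view
# that returns validation errors as JSON. All names are hypothetical.
from django import forms
from django.http import JsonResponse
from django.views.decorators.http import require_POST


class ContactForm(forms.Form):
    name = forms.CharField(max_length=100)
    email = forms.EmailField()
    message = forms.CharField(widget=forms.Textarea)

    def clean_message(self):
        # Reject whitespace-only messages with a field-level error.
        message = self.cleaned_data["message"].strip()
        if not message:
            raise forms.ValidationError("Message cannot be blank.")
        return message


@require_POST
def submit_contact(request):
    form = ContactForm(request.POST)
    if not form.is_valid():
        # Surface validation errors to the client for display.
        return JsonResponse({"errors": form.errors}, status=400)
    # ... persist or route the validated message here ...
    return JsonResponse({"status": "ok"})
```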

COLLEGE PROJECTS (MASTER'S)

Project: Customer Churn Prediction, 2024

•Handled missing values, outliers, and skewed numerical variables (tenure, monthly charges, total charges).

•Transformed and created new features from categorical variables (gender, contract types, payment methods).

•Conducted initial data exploration using descriptive statistics and visualizations.

•Applied t-tests, ANOVA, and chi-squared tests to assess the significance of predictors related to churn (a minimal sketch follows this project).

•Built and optimized a linear regression model to predict total charges.

•Evaluated model performance with metrics such as adjusted R² and test R², showing high predictive power.

•Identified issues such as class imbalance, singularity, and non-significant categorical predictors affecting other models.

•Extracted insights on customer churn behavior and key factors influencing churn.

•Communicated findings, including suggestions for improvements and further analysis, to stakeholders.

•Suggested advanced modeling techniques, such as decision trees, gradient boosting, and regularization, to improve predictions.
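A minimal sketch of two of the analyses above: a chi-squared test of contract type against churn, and an OLS model for total charges. Column names follow the widely used Telco churn dataset and are assumptions:

```python
# Minimal sketch: chi-squared significance test and a linear model for
# total charges. Column names assume the public Telco churn dataset.
import pandas as pd
import statsmodels.api as sm
from scipy.stats import chi2_contingency

df = pd.read_csv("telco_churn.csv")

# Chi-squared test: is churn independent of contract type?
table = pd.crosstab(df["Contract"], df["Churn"])
chi2, p_value, dof, _ = chi2_contingency(table)
print(f"chi2={chi2:.2f}, p={p_value:.4f}")

# OLS regression: predict total charges from tenure and monthly charges.
X = sm.add_constant(df[["tenure", "MonthlyCharges"]])
y = pd.to_numeric(df["TotalCharges"], errors="coerce")
model = sm.OLS(y, X, missing="drop").fit()
print(f"adjusted R²: {model.rsquared_adj:.3f}")
```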

Project: Loan Default Risk Prediction using SAS, 2023

•Collected and cleaned loan data in SAS, handled missing values and outliers, and conducted exploratory data analysis to uncover trends in borrower behavior related to loan default.

•Created new variables such as debt-to-income ratio and credit utilization, and selected relevant predictors through correlation analysis and statistical tests to enhance model accuracy.

•Built logistic regression and decision tree models in SAS to predict the probability of loan default, tuning parameters for optimal performance.

•Assessed model accuracy using ROC curves, AUC, and confusion matrices, and applied cross-validation techniques to ensure robustness across different borrower segments (a minimal sketch follows this project).

•Visualized key findings with SAS plotting tools, segmented borrowers into risk tiers, and presented results and business recommendations to stakeholders for strategic decision-making.
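The project itself was built in SAS; the following is an equivalent minimal sketch in Python with scikit-learn, using hypothetical feature and file names:

```python
# Minimal sketch: logistic regression for default risk with AUC evaluation
# and risk tiering. Written in Python/scikit-learn for illustration (the
# original project used SAS); feature and file names are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

loans = pd.read_csv("loans.csv")
features = ["debt_to_income", "credit_utilization", "loan_amount"]

X_train, X_test, y_train, y_test = train_test_split(
    loans[features], loans["default"], test_size=0.3, random_state=42
)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
probs = model.predict_proba(X_test)[:, 1]
print(f"AUC: {roc_auc_score(y_test, probs):.3f}")

# Segment borrowers into risk tiers by predicted default probability
# (tier boundaries are illustrative).
tiers = pd.cut(probs, bins=[0, 0.2, 0.5, 1.0], labels=["low", "medium", "high"])
```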

EDUCATION

Master’s degree in Information Systems

University of Memphis, Memphis, TN

Graduated: 2024


