
Data Analyst with 4 Years in Analytics and BI

Location:
United States
Salary:
70000
Posted:
February 12, 2026

Contact this candidate

Resume:

THUMU SAI VAMSHI

Data Analyst

+1-919-***-**** | ***************@*****.*** | Missouri 64093 | LinkedIn

SUMMARY

Data Analyst with around 4 years of experience transforming raw datasets into analytical insights for decision-making in insurance, financial services, and healthcare support environments. Skilled in SQL, Python, Power BI, Tableau, Excel (Power Query), and modern cloud data platforms to build automated dashboards, analyze trends, and support business reporting. Experienced in collaborating with cross-functional teams, improving data quality, and enabling executive reporting through scalable ETL workflows and KPI-driven analytics.

SKILLS

Data Analytics & Programming: SQL (T-SQL, PL/SQL), Python (Pandas, NumPy)
BI & Visualization: Power BI (DAX), Tableau, Excel (Power Query, Pivot Tables), Looker Studio, KPI & Trend Reporting
ETL & Automation: ETL Pipelines, Data Cleaning & Transformation, Power Query, API-based Extraction, dbt, Docker
Databases & Warehousing: Snowflake, PostgreSQL, MS SQL Server, Google BigQuery, Azure Synapse Analytics
Techniques: EDA, Trend Analysis, A/B Testing, Predictive Insights, Root-Cause Analysis
Healthcare / Insurance Analytics: Claims Data Analysis, Provider/Member Data, Operational Reporting, Data Quality Validation, Financial & Compliance Reporting

Tools & Collaboration: JIRA, Confluence, Git, Agile/Scrum
Cloud & Big Data: AWS (Glue, S3, Lambda), Azure Cloud
Compliance: HIPAA Awareness, PHI Data Handling, Audit Documentation

EXPERIENCE

Data Analyst, MetLife, USA Jun 2025 – Present

Developed automated SQL and Python workflows to process customer and policy data, cutting reporting turnaround by 35% for actuarial and finance teams.
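A minimal sketch of what such a Python reporting workflow might look like, using pandas to roll policy-level records up into a monthly extract. Column names (claim_id, claim_date, policy_type, paid_amount) are hypothetical, chosen only for illustration:

```python
import pandas as pd

def summarize_claims(df: pd.DataFrame) -> pd.DataFrame:
    """Aggregate claim-level records into a monthly reporting extract.

    Hypothetical schema: claim_id, claim_date, policy_type, paid_amount.
    """
    df = df.copy()
    # Bucket each claim into its calendar month for trend reporting.
    df["month"] = pd.to_datetime(df["claim_date"]).dt.to_period("M").astype(str)
    return (
        df.groupby(["month", "policy_type"], as_index=False)
          .agg(claim_count=("claim_id", "count"),
               total_paid=("paid_amount", "sum"))
    )
```

Scheduling a function like this (e.g. from a cron job or Airflow task) replaces a manual copy-paste-and-pivot cycle, which is where the turnaround savings come from.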

Built Power BI dashboards visualizing claims trends, customer behavior, and policy performance; improved leadership visibility by 30%.

Performed data quality validation across Snowflake and AWS Redshift, integrating EHR feeds (Epic Clarity/Cerner); reduced discrepancies by 25% and enhanced downstream risk-scoring accuracy.
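Cross-system validation of this kind often reduces to an anti-join: which keys exist in one warehouse's extract but not the other's. A minimal illustration with pandas (the key name claim_id is hypothetical):

```python
import pandas as pd

def reconcile(source: pd.DataFrame, target: pd.DataFrame, key: str) -> pd.DataFrame:
    """Return rows whose key appears in only one of the two systems.

    Uses an outer merge with an indicator column, the DataFrame
    equivalent of a full-outer-join anti-join in SQL.
    """
    merged = source.merge(target, on=key, how="outer", indicator=True)
    return merged[merged["_merge"] != "both"]
```

In practice each input would be a small extract pulled from Snowflake and Redshift respectively (row counts, keys, and a checksum column), so the comparison runs in memory even when the base tables are large.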

Automated Excel (Power Query) workflows for claims reconciliation and data quality checks, eliminating 10+ hours/week of manual effort across actuarial and compliance reporting.


Supported migration of healthcare analytics workloads to AWS and Snowflake, using dbt and Docker for versioned models, containerized pipelines, and HIPAA-compliant audit documentation.

Performed ad-hoc analysis on claims payouts, denial drivers, provider performance, and member retention, delivering actionable insights to strategy and population health teams.

Data Analyst, Hexaware Technologies, India Aug 2020 – Jun 2023

Performed SQL analysis on claims, enrollment, provider, and pharmacy datasets for health insurance clients, identifying cost drivers and improving SLA compliance by 18%.

Built Tableau dashboards for real-time claims trending, denial analysis, provider performance, and membership KPIs; reduced reporting cycle from days to minutes for onshore actuarial and operations teams.

Designed Power Query and Python automation workflows, cutting manual data prep by 40% and eliminating 10+ hours/week of repetitive tasks for the analytics team.
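A sketch of the kind of Python data-prep step that replaces manual cleanup in Excel: normalize column names, drop empty rows, and remove exact duplicates. The input columns are hypothetical:

```python
import pandas as pd

def clean_extract(df: pd.DataFrame) -> pd.DataFrame:
    """Standardize a raw extract before analysis.

    Mirrors typical Power Query prep steps: trimmed snake_case
    headers, blank rows dropped, exact duplicates removed.
    """
    df = df.copy()
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
    df = df.dropna(how="all")      # drop rows where every field is empty
    df = df.drop_duplicates()      # remove exact duplicate records
    return df
```

Because the steps live in code rather than in a workbook, they can be applied identically to every weekly extract, which is where the "hours per week" savings come from.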

Containerized analytical workflows and ETL pipelines using Docker, enabling reproducible, HIPAA-compliant environments and seamless deployment across AWS and Snowflake ecosystems.

Conducted data quality validation on claims, eligibility, and benefits data in SQL Server, PostgreSQL, and Snowflake; reduced duplicate/inconsistent records by 35% and improved downstream reporting accuracy.

Supported cloud migration of healthcare analytics workloads (SQL Server to Snowflake and AWS), performing end-to-end reconciliation and ensuring audit-ready pipelines for clients.

Collaborated with onshore claims, underwriting, and network teams to resolve source-system discrepancies and deliver accurate, executive-ready reporting.

Performed data profiling and validation on large-scale claims and EHR extracts; reduced duplicate/inconsistent records by 30% and enhanced downstream reporting reliability.

EDUCATION

Master’s in Computer Science, University of Central Missouri, Warrensburg, MO, USA Aug 2023 – May 2025

PROJECTS

Enterprise Retail Analytics Platform (Azure → Snowflake → dbt → Power BI)
Tech Stack: Azure Blob Storage, Azure Data Factory, Snowflake, dbt, Apache Airflow, Power BI, SQL
End-to-End Flow (step-by-step):

Ingested retail transaction, customer, product, and store files into Azure Blob Storage.

Built ADF parameterized pipelines with watermark-based incremental loading and dynamic file handling.
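The watermark pattern behind this step can be sketched in a few lines of plain Python: select only rows newer than the stored high-water mark, then advance it. Field names are hypothetical, and in ADF the same logic lives in a lookup + query parameter rather than application code:

```python
def incremental_batch(rows, watermark):
    """Select rows newer than the stored watermark, then advance it.

    Re-running with the same input yields no new rows, which is
    what makes watermark-based loads idempotent.
    """
    new_rows = [r for r in rows if r["updated_at"] > watermark]
    new_watermark = max((r["updated_at"] for r in new_rows), default=watermark)
    return new_rows, new_watermark
```

The watermark itself is typically persisted in a small control table so that each pipeline run picks up exactly where the previous one stopped.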

Loaded raw data into Snowflake RAW schemas using scalable copy patterns.

Implemented dbt Bronze/Silver/Gold models with incremental MERGE logic, deduplication, and SCD-2 snapshots.

Added dbt data quality tests, documentation, and exposures for BI readiness.

Orchestrated ingestion + transformation workflows using Apache Airflow DAGs.

Delivered Power BI dashboards for sales trends, customer KPIs, and store performance.

Airbnb Marketplace Analytics Data Warehouse (Cloud Storage → Snowflake → dbt → BI)
Tech Stack: AWS S3 / Azure Blob Storage, Snowflake, dbt, SQL, Power BI
End-to-End Flow (step-by-step):

Ingested listings, hosts, reviews, and availability data into cloud storage landing.

Loaded raw datasets into Snowflake RAW with standardized schemas.

Designed dimensional data models (facts & dimensions) using dbt incremental models.

Applied business transformations for pricing, occupancy, and host performance metrics.

Enforced data quality & freshness checks using dbt tests.
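The checks dbt runs here (unique keys, no nulls in required columns) are simple to state. A minimal Python illustration of the same idea, with hypothetical field names, makes the contract explicit:

```python
def run_checks(records, unique_key, not_null_cols):
    """Minimal data-quality checks in the spirit of dbt's
    built-in `unique` and `not_null` tests.

    Returns a list of human-readable error strings; an empty
    list means the batch passed.
    """
    errors = []
    seen = set()
    for r in records:
        k = r[unique_key]
        if k in seen:
            errors.append(f"duplicate {unique_key}: {k}")
        seen.add(k)
        for col in not_null_cols:
            if r.get(col) is None:
                errors.append(f"null {col} for {unique_key}={k}")
    return errors
```

In dbt these checks are declared in YAML next to the model and fail the build automatically, which is what makes the Gold layer trustworthy for BI consumers.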

Published curated Gold-layer marts for analytics consumption.

Built Power BI dashboards for geographic trends, seasonality, and revenue insights.

US Accidents Geospatial Analytics Pipeline (Public Data → Snowflake → dbt → Power BI)
Tech Stack: Azure Blob Storage, Snowflake, dbt, SQL, Power BI (Maps)
End-to-End Flow (step-by-step):

Ingested large-scale US accident datasets into Azure Blob Storage.

Loaded raw data into Snowflake RAW, handling mixed datatypes and invalid values.

Cleaned and standardized data in dbt staging models.

Implemented deduplication logic using SQL window functions.
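The SQL pattern here is ROW_NUMBER() OVER (PARTITION BY key ORDER BY ts DESC) with a filter on row number 1. A pandas equivalent, with hypothetical column names, shows the same "keep the latest record per key" semantics:

```python
import pandas as pd

def dedupe_latest(df: pd.DataFrame, key_cols, order_col) -> pd.DataFrame:
    """Keep the most recent record per key, mirroring
    ROW_NUMBER() OVER (PARTITION BY key ORDER BY order_col DESC)
    filtered to row_number = 1.
    """
    return (
        df.sort_values(order_col, ascending=False)  # newest first
          .drop_duplicates(subset=key_cols, keep="first")
          .sort_index()
    )
```

In the warehouse the window-function version is preferable because it runs where the data lives; the pandas version is useful for validating the logic on samples.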

Built analytical marts for severity analysis, time-based trends, and location hotspots.

Optimized queries for large volumes using Snowflake best practices.

Delivered map-based Power BI dashboards using latitude/longitude analytics.

Metadata-Driven Cloud Ingestion & Analytics Framework (Azure Blob → Snowflake → dbt → BI)
Tech Stack: Azure Data Factory, Azure Blob Storage, Snowflake, dbt, Power BI
End-to-End Flow (step-by-step):

Designed a metadata-driven ingestion framework using control tables.

Dynamically ingested multiple datasets using ADF Lookups, ForEach loops, and parameters.
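The Lookup-plus-ForEach pattern amounts to expanding a control table into one load task per enabled dataset. A minimal Python sketch of that expansion (control-table fields and the RAW target naming are hypothetical):

```python
def plan_loads(control_table):
    """Expand a control table into per-dataset load tasks,
    mirroring an ADF Lookup feeding a ForEach activity.

    Each control row carries the source path, target table name,
    and an enabled flag; disabled rows are skipped.
    """
    return [
        {"source": row["path"], "target": f'RAW.{row["table"].upper()}'}
        for row in control_table
        if row.get("enabled", True)
    ]
```

Because new datasets are onboarded by inserting a control-table row rather than cloning a pipeline, the framework scales without per-source pipeline rewrites.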

Applied incremental watermark logic for efficient, idempotent loads.

Loaded data into Snowflake and standardized schemas automatically.

Transformed data using dbt incremental models and reusable macros.

Enabled scalable analytics delivery to Power BI without pipeline rewrites.


