
Data Analyst Senior

Location:
Waxhaw, NC
Salary:
70000
Posted:
September 10, 2025


Resume:

Nikitha M

Email: *.***********@*****.***

Mobile: +1-945-***-****

Senior Data Analyst

PROFESSIONAL SUMMARY:

5+ years of experience in analytical thinking and problem-solving, delivering innovative solutions while maintaining attention to detail across diverse projects. Strong communication skills in presenting to both technical and non-technical audiences.

Engineered solutions with strong PL/SQL skills, writing and analyzing complex queries and stored procedures and connecting the dots across applications for an end-to-end view. Expertise with the Microsoft Office suite.

Built high-availability ETL workflows, demonstrating proficiency in connecting the dots across applications and business units to understand the end-to-end view. Works well with minimal supervision.

Created dynamic dashboards to visualize product trends and KPIs, enhancing cross-functional decision-making. Skilled at identifying priorities and managing multiple projects simultaneously as a team player.

Utilized SQL and Python for advanced data cleaning and feature engineering, supporting machine learning workflows. Willing to ask questions and seek assistance as needed in agile teams.

Led cloud integration projects unifying on-premises and cloud data via APIs, enabling data lake centralization. Ability to effectively communicate across the organization depending on the audience.

Developed star, snowflake, and normalized schemas that enhanced performance and reduced storage costs, enabling enterprise-wide reporting initiatives. Expertise with Oracle Exadata and Oracle 10g and above.

Built predictive models to forecast user churn and recommend upsells, embedding outputs into executive-level dashboards for rapid decision-making. Skilled in effort and financial estimation.

Automated infrastructure deployment through Azure DevOps and Terraform, implementing CI/CD pipelines for versioning and infrastructure consistency. Strong communication and presentation skills.

Enforced data governance using Azure Purview and GCP IAM to track lineage and catalog metadata, implementing secure, role-based access. Team player who can influence and guide the team to success.

Engineered healthcare data frameworks compliant with HL7 and FHIR standards, enabling analytics for population health and claims optimization. Works well in a team environment.

Designed real-time streaming pipelines using Azure Event Hub and GCP Pub/Sub to process log files and transactional events. Experienced in working with Agile/Scrum teams to prioritize work.

Translated functional business needs into ETL jobs and interactive dashboards, aligning technical deliverables to dynamic stakeholder expectations. Analytical thinking and problem-solving skills.

Mentored junior engineers via documentation and live workshops, fostering collaboration across distributed engineering teams. Ability to write and analyze complex queries, stored procedures.

Collaborated with QA teams to trace pipeline failures and implement automated checks, conducting RCA investigations to improve data quality. Strong PL/SQL skills and attention to detail.

Managed refresh cycles for executive dashboards by balancing compute resources with data availability, ensuring performance monitoring. Skilled at identifying priorities and managing multiple projects.

Maintained Git-based version control across data projects and implemented automated CI/CD pipelines through Azure DevOps. Willing to ask questions and seek assistance as needed.

Conducted statistical A/B testing using chi-square tests, t-tests, and confidence intervals to measure feature performance. Innovative thinking and expertise with the Microsoft Office suite.

TECHNICAL SKILLS:

Languages - SQL, T-SQL, Python, PySpark, DAX, PL/SQL

Cloud Platforms - Azure (ADF, Synapse, Databricks, ADLS, Event Hub), GCP (BigQuery, Cloud Composer, Dataflow, Looker Studio)

BI Tools - Power BI, Tableau, Looker

ETL & Pipelines - Azure Data Factory, Databricks, SSIS, Airflow

Databases - SQL Server, Oracle, MySQL, BigQuery, Oracle Exadata

Version Control & CI/CD - Git, Azure DevOps, Terraform

Other Tools - JIRA, Confluence, FHIR, HL7, JSON, XML, Parquet, Microsoft Office

Processes - Agile, Scrum

PROFESSIONAL EXPERIENCE:

WELLS FARGO April 2025 – Present

Senior Data Analyst

Responsibilities:

Led the design of scalable ETL pipelines using ADF and Databricks, improving reliability and accelerating financial reporting workflows across business, compliance, and operations departments, applying analytical thinking and strong PL/SQL skills to analyze complex queries.

Built real-time ingestion pipelines using Azure Event Hub and Synapse Streaming, supporting low-latency dashboards and proactive alerting for executives, showcasing innovative thinking and attention to detail while communicating effectively across the organization.

Created Power BI dashboards with KPIs, filters, and drilldowns, enhancing visibility for cross-functional teams and improving forecast accuracy, demonstrating strong communication and presentation skills to both technical and non-technical audiences.

Enhanced SQL processing in Synapse Analytics to support regulatory reporting, reducing runtime from hours to minutes and optimizing access to mission-critical compliance metrics, showcasing proficiency with query tools and strong PL/SQL skills.

Integrated ADF pipeline monitoring with logs, retries, and alerts, reducing issue resolution times and improving SLA adherence for data availability and system health, demonstrating problem-solving skills and attention to detail in a team environment.

Designed partitioned, compressed, and indexed warehouse structures, boosting performance for high-concurrency business intelligence workloads and improving end-user report access times, showcasing expertise with the Microsoft Office suite for documentation.

Authored full documentation including architecture diagrams, schema definitions, and lineage maps to streamline audits, improve onboarding, and support knowledge transfer across engineering and analytics teams, demonstrating attention to detail.

Conducted root-cause analysis and validation scripting for pipeline mismatches, deploying automated data quality checks to improve ingestion integrity and operational resilience, showcasing analytical thinking and problem-solving skills within Agile sprints.

Automated global aggregation workflows using BigQuery and Composer to unify KPIs across time zones, ensuring timely insights for multinational business units and executive dashboards, working well in a team environment with minimal supervision.

Managed version control and CI/CD pipelines via Git and Azure DevOps, maintaining deployment traceability, pull request integrity, and consistency across dev, QA, staging, and production environments, identifying priorities and managing multiple projects simultaneously.

UNITED HEALTH CARE INSURANCE September 2024 – March 2025

Data Analyst

Responsibilities:

Developed ADF pipelines to ingest structured claims and unstructured provider records, reducing latency and enabling timely analytics for operational, clinical, and compliance reporting teams, demonstrating analytical thinking and attention to detail.

Created dynamic Power BI dashboards with slicers, filters, and drilldowns to visualize claims status, provider metrics, and care quality indicators for compliance, performance tracking, and strategic healthcare decision-making, showcasing strong communication skills.

Transformed large-scale clinical data using PySpark in Databricks, cleansing and validating records weekly to support high-accuracy reporting for billing, risk, and quality improvement teams, demonstrating proficiency with query tools and PL/SQL skills.

Enforced HIPAA-compliant data flows through encryption, anonymization, and secure transmission, ensuring safe handling of PHI and aligning with healthcare data privacy and enterprise governance standards, showcasing attention to detail.

Built reusable ETL components—filters, templates, and lookup tables—reducing code redundancy and standardizing ingestion patterns across claims, eligibility, and provider datasets, demonstrating innovative thinking and problem-solving skills.

Integrated Azure Synapse, ADF, on-prem SQL, and ADLS Gen2 to consolidate enrollment, claims, and provider data into a unified reporting environment for actuarial and compliance stakeholders, connecting dots across various applications.

Delivered population health dashboards to monitor chronic condition management, care gaps, and resource utilization, enabling data-driven outreach and improved engagement within managed care programs, communicating effectively across the organization.

Generated structured Python datasets supporting cost modeling, risk scoring, and cohort stratification, helping actuarial teams optimize intervention targeting and improve healthcare service delivery efficiency, demonstrating analytical thinking.

Applied RBAC, Key Vault, and service principals to manage secrets and control access to PII/PHI data across ingestion, transformation, and visualization layers in secure Azure environments, working well in a team environment.

Actively participated in Agile sprints, standups, and retrospectives to coordinate backlog delivery, improve collaboration, and ensure consistent progress in a healthcare-focused data engineering team, identifying priorities and managing multiple projects.

HEWLETT PACKARD September 2021 – July 2023

Data Analyst

Responsibilities:

Built SSIS and Azure Data Factory pipelines to extract and integrate CRM, marketing, and operations data, improving automation, data quality, and consistency across global reporting and analytics platforms, demonstrating analytical thinking.

Developed Power BI dashboards for KPIs, campaign ROI, and conversion tracking, reducing manual reporting and enabling marketing leadership to monitor performance across channels in real time, showcasing strong communication skills.

Validated data between Oracle, Salesforce, and SQL Server to ensure source-to-target accuracy, resolving discrepancies and increasing trust in business intelligence outputs across global sales and marketing teams, demonstrating attention to detail.

Worked with cross-functional teams to define KPIs, standardize metrics, and wireframe dashboards aligned to strategic goals, improving adoption and accelerating data-driven decision-making at leadership levels, communicating effectively.

Created automated Python scripts for scraping competitor pricing data and generating structured intelligence reports, supporting product positioning and pricing strategy across enterprise business units, showcasing innovative thinking.

Delivered churn models by blending CRM, campaign exposure, and support interactions, generating predictive insights that reduced customer churn across targeted segments, demonstrating problem-solving skills and analytical thinking.

Optimized SQL queries using CTEs, indexes, and analytic functions to improve Power BI refresh rates and enhance the responsiveness of executive-facing dashboards and reports, showcasing proficiency with query tools.

Authored documentation for refresh schedules, schema changes, governance workflows, and approval processes, improving operational consistency and accelerating onboarding across the enterprise analytics team, demonstrating attention to detail.

Migrated legacy Excel-based reports to Azure reporting platforms, reducing refresh latency, increasing system scalability, and improving report availability across business lines, connecting dots across various applications.

Designed reusable SSIS packages and modular ETL components to reduce development redundancy and enable scalable data transformation processes across marketing and operations reporting domains, working well in a team environment.

S&P GLOBAL June 2019 – August 2021

Programmer Analyst

Responsibilities:

Developed SQL-based ETL pipelines for financial market data ingestion, reducing time-to-insight across dashboards, portfolio models, and market research platforms used by finance and analytics teams, demonstrating analytical thinking.

Translated reporting requirements into automated SQL workflows and ETL logic, delivering benchmark analytics and performance metrics to investment managers through scheduled data pipelines and Tableau dashboards, showcasing strong PL/SQL skills.

Engineered Python and T-SQL scripts to standardize security data, parse financial feeds, and automate Tableau dashboard refreshes for analysts, traders, and strategic finance stakeholders, demonstrating proficiency with query tools.

Built interactive Tableau dashboards visualizing macroeconomic indicators, equity performance, and fund analytics, reducing manual reporting workload across distributed investment teams, showcasing strong communication skills.

Participated in Agile stand-ups and sprints with product and data teams, delivering backlog items, refining reporting logic, and continuously improving pipeline reliability and business data accuracy, working well in a team environment.

Conducted peer code reviews and developed unit tests for Python scripts and SQL procedures, ensuring performance, clean deployment, and compliance with financial governance requirements, demonstrating attention to detail.

Tuned stored procedures and optimized SQL using indexing and refactoring, reducing dashboard query latency and improving responsiveness for concurrent finance report consumers, showcasing problem-solving skills and innovative thinking.

Documented source-to-target mappings, data definitions, and transformation logic; validated financial KPIs with QA teams to ensure reporting accuracy and regulatory compliance, demonstrating expertise with the Microsoft Office suite.

Migrated SSRS reports and flat-file ETL logic into Azure Data Factory pipelines using parameterized SQL templates, increasing automation, accuracy, and audit-readiness of enterprise reports, connecting dots across various applications.

Managed Git-based version control, led SQL code reviews, and maintained reusable data components, streamlining collaboration across global engineering teams and reducing risk in deployments, identifying priorities and managing multiple projects.

Educational Details:

Master's in Business Analytics and Artificial Intelligence - University of Texas at Dallas

Bachelor of Business Administration - M. S. Ramaiah University of Applied Sciences
