
Data Analyst Azure

Location:
Wylie, TX
Posted:
September 10, 2025


Nikhil Dasari

Sr. Data Analyst

225-***-**** ***************@*****.*** McKinney, Texas - 75071

PROFESSIONAL SUMMARY:

Senior Data Analyst with 6+ years of experience in data analytics, data visualization, and data governance, specializing in the healthcare, education, and insurance industries.

Proficient in leveraging Azure and AWS platforms for building scalable and efficient data solutions, integrating diverse datasets, and enhancing analytics workflows.

Expert in Python, SQL, R, and Spark for data transformations, validations, and analyses that drive actionable business insights.

Skilled in developing dynamic dashboards and visualizations using Tableau and Power BI to enable real-time, data-driven decision-making.

Strong expertise in designing ETL pipelines using Azure Data Factory (ADF), AWS Glue, and PySpark for processing structured and unstructured data.

Demonstrated ability to implement data governance frameworks, ensuring data quality, compliance, and security across multiple projects.

Proven success in data migrations from on-premises systems to Azure and AWS cloud ecosystems, ensuring seamless transitions and optimized performance.

Adept at real-time data integration and processing using tools like Azure Event Hubs, Kafka, and AWS Redshift.

Experienced in collaborating with business stakeholders, IT teams, and engineers to deliver tailored data solutions aligned with organizational goals.

Certified in Azure Fundamentals (AZ-900) and Azure Data Fundamentals (DP-900), with a strong commitment to continuous learning and staying abreast of emerging technologies.

Experienced Azure Data Engineer with a proven track record in delivering end-to-end data management solutions, including ETL processes using SQL Server and Azure data services.

Expertise in developing and optimizing Azure Data Factory (ADF) pipelines for efficient ETL workflows, automating Power BI reports, and managing complex data transformations to support business intelligence and data-driven decision-making.

Skilled at migrating SQL databases to Azure Data Lake, Azure SQL Database, and Azure Databricks to ensure seamless cloud integration.

Recognized for expertise in managing comprehensive data solutions, ensuring top-tier data quality, security, and compliance.

Proven success in cloud migration initiatives and implementing process improvements that drive operational efficiency.

Dedicated to delivering innovative, data-driven insights and fostering a culture of excellence in data engineering.

TECHNICAL SKILLS:

Cloud Platforms: AWS (Redshift, Glue, S3, Kinesis, EMR), Azure (Data Factory, Synapse Analytics, ADLS)

Data Analytics & Visualization: Tableau, Power BI, SQL, Python (Pandas, NumPy), R, Jupyter Notebook

ETL & Big Data Tools: PySpark, Hadoop, Kafka, SSIS, AWS Glue

Databases: SQL Server, MySQL, PostgreSQL, Redshift, Azure SQL Database

Scripting Languages: Python, R, SQL, Shell Scripting

Data Governance: Metadata Management, Data Lineage, RBAC, Data Quality Frameworks

Tools & Frameworks: JIRA, Confluence, Git, Airflow

Certifications: Azure Fundamentals AZ-900, Azure Data Fundamentals DP-900

CERTIFICATIONS:

Azure Fundamentals AZ-900 (2024)

Azure Data Fundamentals DP-900 (2024)

PROFESSIONAL EXPERIENCE:

Progate Technologies Inc, Sr. Data Analyst (June 2023 – Present)

Client: Benefit Focus

Developed an end-to-end data analytics framework for Benefit Focus to integrate internal and external data sources, providing advanced insights into patient care, operational efficiency, and compliance metrics. Led the transformation initiative to enhance business intelligence, analytics, and governance by utilizing Azure cloud services for seamless data integration and reporting.

Roles and Responsibilities:

Built and optimized ETL pipelines using Azure Data Factory (ADF) to transform and load healthcare data into Azure Data Lake and Synapse Analytics.

Designed and implemented Power BI dashboards to visualize patient care metrics, hospital utilization, and financial performance.

Conducted advanced data analysis using Python, R, and SQL to identify trends in healthcare delivery and improve patient outcomes.

Ensured data governance by implementing role-based access controls (RBAC) and metadata management across Azure resources.

Managed real-time data ingestion from medical devices and third-party APIs using Azure Event Hubs and Azure Stream Analytics.

Validated and monitored data quality, establishing automated checks to ensure accuracy and compliance with healthcare regulations (a minimal sketch of such a check follows this list).

Collaborated with stakeholders to define KPIs and align dashboards with strategic objectives.

Optimized data models for performance tuning, reducing query response times by 35%.

Conducted root cause analysis to resolve data discrepancies and improve reporting accuracy.

Participated in Agile workflows, managing sprints and tasks using JIRA and Confluence.
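
The automated data-quality checks mentioned above could be expressed along the lines of the following minimal Python/pandas sketch; the column names (claim_id, member_id, claim_amount, service_date) are hypothetical placeholders rather than actual project fields.

from datetime import datetime
import pandas as pd

def run_quality_checks(df: pd.DataFrame) -> dict:
    """Return a mapping of rule name to the count of offending rows."""
    return {
        # Required identifiers must be present.
        "missing_member_id": int(df["member_id"].isna().sum()),
        # Monetary values should never be negative.
        "negative_claim_amount": int((df["claim_amount"] < 0).sum()),
        # Service dates cannot be in the future.
        "future_service_date": int((pd.to_datetime(df["service_date"]) > datetime.now()).sum()),
        # Each claim should appear only once.
        "duplicate_claim_id": int(df.duplicated(subset="claim_id").sum()),
    }

if __name__ == "__main__":
    sample = pd.DataFrame({
        "claim_id": [101, 101, 102],
        "member_id": ["M1", "M1", None],
        "claim_amount": [120.0, 120.0, -5.0],
        "service_date": ["2025-01-10", "2025-01-10", "2030-01-01"],
    })
    print(run_quality_checks(sample))

In practice a check like this would run on a schedule and raise an alert whenever any rule count is non-zero.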

Louisiana State University, Data Analyst (August 2021 – May 2023)

Client: Louisiana State University

Migrated on-premises data systems to the AWS cloud, integrated disparate datasets into a centralized data lake, and implemented analytics workflows for real-time decision-making. Designed and implemented an AWS-based centralized data platform for managing student, academic, and administrative data, enabling advanced analytics and reporting.

Roles and Responsibilities:

Developed ETL pipelines using AWS Glue to extract, transform, and load student and administrative data into AWS Redshift and S3 (see the Glue job sketch after this list).

Designed dynamic dashboards using Tableau to visualize enrollment trends, academic performance, and financial analytics.

Conducted data migration from on-premises SQL databases to AWS cloud platforms, ensuring data integrity and consistency.

Implemented streaming data solutions using Kafka and AWS Kinesis to support real-time analytics.

Ensured compliance with university data governance policies by implementing data lineage and metadata tracking.

Leveraged AWS IAM policies for secure access management and compliance with FERPA standards.

Designed Python scripts to automate data validation processes, reducing manual effort by 40%.

Partnered with academic departments to identify analytics needs and deliver tailored solutions.

Monitored and troubleshot data pipeline performance, resolving connectivity and latency issues.

Conducted performance tuning for Tableau dashboards, optimizing data retrieval and visualization speeds.

Assisted in designing optimized data models to improve query performance and enhance the efficiency of data storage and retrieval.

Protected sensitive student and financial information in accordance with university data governance policies.

Collaborated with LSU’s IT team and academic departments to understand data requirements and provide technical solutions.

Provided ongoing support for the AWS-based data platform, troubleshooting issues related to data pipelines and reporting tools.
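
As a rough illustration of the Glue-based ETL work described above, a job script could read a cataloged table, trim it to the needed columns, and write curated Parquet back to S3. This is a sketch only, assuming the AWS Glue PySpark runtime; the catalog database, table name, column names, and S3 path are hypothetical placeholders.

import sys
from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read raw enrollment records registered in the Glue Data Catalog (placeholder names).
enrollments = glue_context.create_dynamic_frame.from_catalog(
    database="university_raw", table_name="enrollments"
)
# Keep only the columns the downstream model needs.
trimmed = enrollments.select_fields(["student_id", "term", "course_id", "credits"])
# Write curated Parquet back to S3 for loading into Redshift.
glue_context.write_dynamic_frame.from_options(
    frame=trimmed,
    connection_type="s3",
    connection_options={"path": "s3://university-curated/enrollments/"},
    format="parquet",
)
job.commit()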

Cognizant, Data Analyst (August 2018 – August 2021)

Clients: Selective Insurance, Prudential Insurance

Developed data pipelines and analytics frameworks to manage and optimize claims data, supporting various lines of insurance, including auto, home, and commercial policies. Contributed to insurance domain projects by designing scalable AWS-based data engineering solutions to improve claims processing and operational efficiency.

Duties:

Built and maintained ETL pipelines using AWS Glue, S3, and Redshift for seamless data integration.

Processed large datasets using PySpark on AWS EMR, reducing data processing time by 30%.

Designed Airflow workflows to orchestrate data processing tasks and automate data pipeline execution.

Developed Tableau dashboards to analyze claims trends, fraud detection metrics, and policy performance.

Ensured data quality by implementing validation scripts in Python and enforcing AWS Glue Data Catalog best practices.

Optimized database queries and storage in Redshift, enhancing reporting efficiency.

Collaborated with data scientists to support predictive modeling efforts using claims history data.

Ensured data security by integrating AWS KMS and IAM for encryption and access management.

Supported Agile workflows, coordinating with onshore and offshore teams to meet project milestones.

Conducted root cause analysis for data anomalies and implemented solutions to prevent recurrence.

Implemented and maintained a data pipeline that consumed data from a Kafka topic, using Airflow to schedule a Python script that processed the data in the cloud (a minimal sketch of this pattern follows this list).

Developed a PySpark program on AWS EMR to create and transform data frames.

Leveraged cloud-based big data processing frameworks such as Spark to handle large datasets efficiently.

Built cloud-hosted web services to interact with external data sources.

Worked closely with business analysts, product managers, and other stakeholders to gather data requirements.
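
A minimal sketch of the Kafka-plus-Airflow pattern referenced above is shown below. It assumes Airflow 2.x and the kafka-python client; the topic name, broker address, and schedule are hypothetical placeholders, and in the real pipeline the consumed records would be landed in S3 for downstream Glue/EMR processing.

from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def consume_claims_batch(**_context):
    """Drain a bounded batch of claim events from a Kafka topic."""
    from kafka import KafkaConsumer  # kafka-python client
    consumer = KafkaConsumer(
        "claims-events",                  # placeholder topic name
        bootstrap_servers="broker:9092",  # placeholder broker address
        auto_offset_reset="earliest",
        consumer_timeout_ms=10000,        # stop once the topic is drained for this run
    )
    records = [message.value for message in consumer]
    consumer.close()
    print(f"Consumed {len(records)} claim events")

with DAG(
    dag_id="claims_kafka_ingest",
    start_date=datetime(2021, 1, 1),
    schedule_interval="@hourly",
    catchup=False,
) as dag:
    PythonOperator(
        task_id="consume_claims_batch",
        python_callable=consume_claims_batch,
    )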

EDUCATION:

Louisiana State University (Aug 2021 – May 2023)

Master’s in Computer Science

Koneru Lakshmaiah Education Foundation (June 2015 – May 2019)

Bachelor of Technology in Electronics and Communication Engineering


