
Data Analyst Senior

Location:
Hyderabad, Telangana, India
Posted:
June 20, 2025


Resume:

SPANDANA BHONAGIRI

SENIOR DATA ANALYST

Email: ********@*****.***

LinkedIn: linkedin.com/in/spandanabhonagiri

PROFESSIONAL SUMMARY:

Results-driven Senior Data Analyst with 8+ years of expertise in designing and automating ETL processes using Azure Data Factory, AWS Glue, and Hadoop-based solutions for scalable data integration and analytics.

Proficient in developing interactive dashboards and visual reports using Power BI, Tableau, and Amazon QuickSight to enable data-driven decision-making across organizations.

Skilled in SQL query optimization, indexing, and stored procedure development for databases such as PostgreSQL, MySQL, Oracle, Redshift, and AWS RDS to enhance data retrieval efficiency.
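As a minimal sketch of the indexing work described above, the snippet below uses Python's built-in sqlite3 as a stand-in for PostgreSQL/MySQL; the `orders` schema and index name are hypothetical, but the scan-vs-seek effect shown in the query plan is the same mechanism that speeds up retrieval in the production databases.

```python
import sqlite3

# In-memory SQLite stands in for PostgreSQL/MySQL; schema and names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

query = "SELECT * FROM orders WHERE customer_id = 42"

# Without an index, the planner falls back to a full table scan.
scan_plan = conn.execute("EXPLAIN QUERY PLAN " + query).fetchone()[-1]

# After indexing the filter column, the planner can seek directly to matching rows.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
seek_plan = conn.execute("EXPLAIN QUERY PLAN " + query).fetchone()[-1]
```

On real engines the same check is done with `EXPLAIN` / `EXPLAIN ANALYZE` before and after adding the index.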

Hands-on experience in statistical modeling, regression analysis, clustering, and predictive analytics using R, Python, and SAS to drive business insights and strategic planning.
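The regression work above can be illustrated with a minimal ordinary-least-squares fit in pure Python; the spend/revenue figures are hypothetical, and real analyses would use R, SAS, or a statistics library rather than hand-rolled sums.

```python
from statistics import mean

def ols_fit(x, y):
    """Ordinary least squares for a single predictor: returns (slope, intercept)."""
    mx, my = mean(x), mean(y)
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum((xi - mx) ** 2 for xi in x)
    return slope, my - slope * mx

# Hypothetical monthly ad spend vs. revenue figures.
spend = [10, 20, 30, 40, 50]
revenue = [25, 45, 61, 82, 105]
slope, intercept = ols_fit(spend, revenue)
predicted = slope * 60 + intercept  # extrapolate to a new spend level
```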

Experienced in enterprise HRIS environments, including PeopleSoft HRMS, with proven ability to analyze complex benefits and payroll datasets, automate reporting workflows using Alteryx and SSIS, and ensure HIPAA-compliant data handling in high-volume operational settings.

Expertise in Python scripting for data wrangling, transformation, and advanced visualization using Matplotlib, Seaborn, and NLP techniques to extract meaningful insights from textual data.

Strong experience in data visualization and storytelling, leveraging Power BI DAX calculations, Tableau Data Blending, and R Shiny applications for interactive reporting.

Proficient in cloud-based data storage and integration using Azure Data Lake Storage and AWS S3 for large-scale analytics and reporting.

Expert in SAS Enterprise Guide, SAS Studio, and IBM SPSS for advanced statistical analysis, hypothesis testing, and predictive modeling.

Expert in integrating multiple data sources, including Google Tag Manager, Google Analytics, and CRM platforms, to build comprehensive data ecosystems.

Experienced in designing and implementing role-based security policies and access controls in PostgreSQL, MySQL, and AWS RDS, ensuring compliance with data governance and privacy regulations.

Proficient in automating and integrating data workflows using AWS Step Functions, Azure Logic Apps, and Apache Airflow, enhancing data pipeline efficiency and reducing manual intervention.

Skilled in implementing Google Analytics tracking and A/B testing methodologies to optimize user experience, leading to data-driven improvements in conversion rates.
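The A/B-testing methodology above boils down to a two-proportion z-test on conversion counts; a stdlib-only sketch follows, with the visitor and conversion numbers invented for illustration.

```python
from math import erf, sqrt

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test comparing conversion rates of variants A and B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)              # pooled conversion rate
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))  # standard error under H0
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))   # two-sided normal tail
    return z, p_value

# Hypothetical experiment: 120/2400 conversions on A vs. 165/2400 on B.
z, p = two_proportion_z(120, 2400, 165, 2400)
significant = p < 0.05
```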

Expert in developing and maintaining scalable big data solutions with Hadoop, Apache Hive, Impala, and Apache Pig for processing and analyzing large, distributed datasets.

Adept at automating data cleansing, transformation, and preparation tasks using Jupyter Notebook and R (tidyverse) to maintain data consistency and accuracy.

Proficient in designing complex financial and statistical models in Excel, utilizing pivot tables, Power Query, VBA macros, and advanced formulas to enhance business reporting.
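The PivotTable-style summaries mentioned above are, underneath, grouped aggregations; this sketch rebuilds one (sum of amount by region and quarter) in plain Python with invented transaction data, to show the mechanics Excel performs.

```python
from collections import defaultdict

# Hypothetical transactions: (region, quarter, amount).
transactions = [
    ("North", "Q1", 100), ("North", "Q2", 150),
    ("South", "Q1", 80), ("South", "Q1", 20),
]

# Nested dict mirrors a PivotTable: rows = region, columns = quarter, values = sum.
pivot = defaultdict(lambda: defaultdict(float))
for region, quarter, amount in transactions:
    pivot[region][quarter] += amount
```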

Proficient in transforming business needs into scalable data solutions; skilled in building automated reports, performing root cause analysis, and producing performance dashboards that support strategic decision-making in benefits administration and healthcare analytics.

Hands-on experience in developing automated ETL pipelines that integrate structured and unstructured data sources, supporting both real-time and batch processing for business intelligence applications.

Strong expertise in optimizing MySQL and PostgreSQL queries, significantly reducing execution time and improving database efficiency for enterprise-level reporting.

Collaborative team player with a track record of defining business KPIs and developing analytical solutions using SQL, Python, and cloud-based BI tools.

Strong communicator and problem solver, adept at bridging technical and non-technical teams, writing clear documentation, and independently managing complex data tasks with accuracy, discretion, and adherence to evolving industry standards and legislative requirements.

Experienced in implementing advanced NLP and machine learning techniques in Python and R to extract insights from unstructured data, driving data-driven decision-making.

Experienced in database administration, backup strategies, and disaster recovery planning for SQL Server, MySQL, and Oracle to ensure data availability and security.

TECHNICAL SKILLS:

Programming Languages

Python, R, SAS, SQL, PL/SQL

ETL & Data Integration

Azure Data Factory, AWS Glue, Apache Airflow, Apache Sqoop, AWS Step Functions, Azure Logic Apps, Oracle PL/SQL, AWS Data Pipeline

Big Data & Processing

Hadoop, MapReduce, HDFS, YARN, Apache Hive, Apache Impala, Apache Pig, Apache Flume, Cloudera, AWS EMR

Data Visualization

Power BI (DAX), Tableau (Data Blending), Amazon QuickSight, R Shiny, Excel (PivotTables, Power Query, VBA macros)

Databases & SQL

PostgreSQL, MySQL, Oracle, Amazon Redshift, AWS RDS, SQL Server, Teradata, Snowflake, NoSQL (MongoDB, Cassandra, HBase)

Cloud Platforms

Microsoft Azure, AWS

Data Governance & Security

Role-based Access Control (RBAC), PostgreSQL & MySQL Security, AWS IAM Policies, Data Privacy Compliance

Performance Optimization

SQL Query Optimization, Indexing, Partitioning, Stored Procedures, Database Tuning (PostgreSQL, MySQL, Oracle)

Business Intelligence & Reporting

KPI Definition, Data Storytelling, Interactive Reporting (Power BI, Tableau, Excel)

PROFESSIONAL EXPERIENCE:

Client: 7-Eleven, Irving, TX Feb 2024 – Present

Role: Sr. Data Analyst

Responsibilities:

Automated ETL processes leveraging Azure Data Factory, Azure Logic Apps, and Azure Functions to ensure real-time and batch data integration.

Configured and managed Azure Blob Storage and Azure Storage Explorer to efficiently store and retrieve structured and unstructured data.

Utilized advanced Microsoft Excel functions such as VLOOKUP, nested formulas, PivotTables, and Macros to perform complex data manipulation, reconciliation, and reporting tasks across various functional areas.

Designed, developed, and deployed interactive dashboards and insightful visualizations in Tableau to support executive decision-making, HR operations, and payroll trend analysis.

Configured and managed Azure Monitor and Azure Application Insights to track user interactions, monitor performance, and analyze system logs for insights.

Integrated Azure Synapse Analytics, Azure SQL Database, and Power BI to perform advanced data transformations and generate real-time business insights.

Integrated Hadoop with Apache Hive and Impala to enable SQL-based querying and data analysis on distributed datasets.

Conducted in-depth analysis of health care claims data to identify cost-saving opportunities, fraud indicators, and coverage utilization trends for internal stakeholders.

Supported payroll operations by validating pay period adjustments, retroactive payments, and deduction calculations within PeopleSoft and reporting discrepancies.

Developed role-based access controls and security policies in PostgreSQL to manage user privileges effectively.

Wrote complex MySQL queries, joins, and stored procedures to support business reporting and operational analytics.
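The reporting queries above follow a common JOIN + GROUP BY shape; the sketch below reproduces it against SQLite with a hypothetical employees/departments schema (real work targeted MySQL, where the same SQL would run unchanged).

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE departments (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE employees (id INTEGER PRIMARY KEY, name TEXT, dept_id INTEGER);
INSERT INTO departments VALUES (1, 'Payroll'), (2, 'Benefits');
INSERT INTO employees VALUES (1, 'Asha', 1), (2, 'Ravi', 2), (3, 'Mei', 1);
""")

# Headcount per department: LEFT JOIN keeps departments with no employees.
rows = conn.execute("""
    SELECT d.name, COUNT(e.id) AS headcount
    FROM departments d
    LEFT JOIN employees e ON e.dept_id = d.id
    GROUP BY d.name
    ORDER BY d.name
""").fetchall()
# rows -> [('Benefits', 1), ('Payroll', 2)]
```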

Performed data wrangling and transformation using Python and SQL to prepare clean datasets for business intelligence and analytics.

Cleaned, processed, and transformed large datasets using Jupyter Notebook, ensuring accuracy and consistency in data analysis workflows.

Implemented Hadoop-based big data solutions, processing and analyzing massive datasets using HDFS, MapReduce, and YARN for business intelligence.

Conducted hypothesis testing and regression analysis in R to identify key business trends and support strategic decision-making.

Conducted advanced statistical analysis, including regression, factor analysis, and clustering, using IBM SPSS to uncover business insights.

Provided SAS training and technical support to team members, ensuring best practices in data analysis, statistical modeling, and reporting.

Collaborated with cross-functional teams to gather, analyze, and translate complex business requirements into functional and technical specifications for PeopleSoft HRMS-based solutions.

Designed and delivered custom SSRS reports using SQL Server Reporting Services to support benefits administration, pension fund monitoring, and payroll processing audits.

Ensured data compliance with HIPAA guidelines and other data protection standards while handling sensitive healthcare and employee compensation data within HRMS systems.

Utilized SAS Enterprise Guide and SAS Studio for interactive data exploration, statistical analysis, and reporting.

Created and managed complex Excel models, pivot tables, and advanced formulas to analyze large datasets and generate insightful business reports.

Created interactive, reusable, and well-documented Jupyter Notebook reports to visualize and present data-driven insights effectively.

Designed interactive dashboards and visualizations in R Shiny to provide stakeholders with real-time analytical insights.

Automated repetitive tasks and data processing using Excel VBA macros, enhancing efficiency and reducing manual effort.

Utilized Python libraries Matplotlib and Seaborn to create advanced data visualizations for trend analysis and stakeholder presentations.

Extracted, transformed, and loaded large volumes of structured HR and financial data using SQL queries and SSIS packages within the Microsoft SQL Server BI Stack for reporting and analysis.

Created automated workflows in Alteryx for data cleansing, joining disparate datasets, and generating analytics-ready outputs to streamline reporting pipelines.

Maintained a high level of data accuracy and consistency by performing regular quality checks and reconciliation using Excel tools and SQL validation scripts.

Provided training and support on IBM SPSS functionalities, ensuring team members effectively utilize statistical tools for analysis.

Created and managed Power BI workspaces, content packs, and apps to facilitate seamless data sharing and collaboration across teams.

Environment: Azure Data Factory, Azure Monitor, Azure Synapse Analytics, Logic Apps, Functions, Blob Storage, Storage Explorer, R, Hadoop, HDFS, MapReduce, YARN, Power BI, Python, SQL, Jupyter Notebook, SAS Enterprise Guide, SAS Studio, Excel, PostgreSQL, IBM SPSS, Oracle SQL, PL/SQL, R Shiny, Excel VBA, MySQL, Matplotlib, Seaborn, Apache Hive, Impala.

Client: McKesson, Irving, Texas Aug 2022 – Jan 2024

Role: Sr. Data Analyst

Responsibilities:

Designed and implemented scalable data pipelines using AWS Glue, AWS Lambda, and Amazon S3 to process and analyze large datasets, ensuring high availability and cost efficiency.

Automated data ingestion and transformation workflows using AWS Step Functions and AWS Data Pipeline to streamline data processing across multiple sources.

Developed and implemented statistical models using Python and R to analyze large datasets and derive actionable insights for business decision-making.

Designed and optimized ER diagrams using Amazon RDS to structure relational databases efficiently and ensure seamless data relationships for analytics and reporting.

Leveraged Apache Sqoop to ingest structured data from relational databases into Hadoop clusters for scalable analytics and storage.

Demonstrated critical thinking by independently resolving complex data issues and anomalies through root cause analysis and corrective SQL logic updates.

Supported legislative and regulatory reporting by preparing statutory reports and benefit summaries aligned with organizational and governmental policies.

Served as a key liaison between IT, HR, and Finance teams to ensure consistent data interpretation, reporting alignment, and technical issue resolution.

Managed large-scale Oracle databases, ensuring high availability, performance tuning, and data integrity.

Implemented predictive analytics and machine learning models in Jupyter Notebook using Scikit-learn and TensorFlow for business forecasting and trend analysis.

Supported organizational compliance by ensuring secure handling, transmission, and storage of confidential employee and benefits information.

Wrote and maintained documentation for technical processes, including report specifications, data flow diagrams, and workflow logs for audit and support purposes.

Designed interactive dashboards and visual reports in Excel to support executive decision-making and operational performance tracking.

Created DAX measures and calculated columns to enhance Power BI reporting and enable advanced data analysis.

Integrated Python with SQL databases to perform advanced querying, data manipulation, and reporting for operational analytics.

Conducted performance tuning of MySQL queries, reducing execution time and enhancing overall database efficiency.

Utilized Apache Pig and MapReduce programming to transform raw data into actionable insights, enabling data-driven decision-making and improving reporting accuracy for business intelligence teams.

Conducted advanced statistical analyses, hypothesis testing, and regression modeling using SAS to derive actionable insights and optimize business operations.

Designed and optimized SQL queries, stored procedures, and functions in Oracle Database to support business intelligence and reporting needs.

Conducted A/B testing and funnel analysis using Google Analytics to enhance user experience and improve conversion rates.

Integrated IBM SPSS with external data sources such as SQL databases, Excel, and survey tools to enhance data analysis capabilities.

Applied data governance practices while managing personal health information (PHI) and personally identifiable information (PII), ensuring confidentiality and integrity of HR datasets.

Implemented data pipelines using PostgreSQL for ETL processing and integration with analytics platforms.

Used business intelligence tools to translate raw operational data into meaningful KPIs and performance metrics aligned with departmental goals.

Coordinated with business users and leadership to gather reporting needs and delivered timely, accurate reports through Excel models, Tableau dashboards, and SSRS outputs.

Provided system testing, UAT support, and defect resolution for enhancements and integrations related to HRIS systems, specifically focusing on benefits and payroll modules.

Developed automated Power BI reports, enabling real-time monitoring of key performance indicators (KPIs).

Managed SSIS workflows to automate scheduled data loads, archival processes, and error-handling routines to minimize manual intervention and system downtime.

Automated data cleaning and transformation processes in R using tidyverse, improving data quality and consistency for reporting.

Environment: AWS Glue, Amazon RDS, Lambda, S3, Step Functions, Data Pipeline, Python, R, Apache Sqoop, Hadoop, Oracle, Jupyter Notebook, Scikit-learn, TensorFlow, Excel, DAX, Power BI, SQL, MySQL, Apache Pig, MapReduce, SAS, Oracle Database, Google Analytics, IBM SPSS, PostgreSQL, ETL.

Client: Fusion Microfinance, New Delhi, India Oct 2019 – Feb 2022

Role: Data Analyst

Responsibilities:

Designed and deployed Hadoop-based ETL pipelines, ensuring seamless ingestion, transformation, and storage of diverse data sources.

Extracted, transformed, and loaded large datasets from multiple sources into SAS environments for in-depth data analysis and predictive modeling.

Integrated Oracle databases with other enterprise systems and data sources to ensure seamless data flow across platforms.

Designed and optimized ETL pipelines using Azure Data Factory to integrate data from various sources into Azure Data Lake Storage.

Created backup and disaster recovery strategies for MySQL databases to ensure business continuity and data security.

Performed indexing and query optimization in PostgreSQL, reducing execution time and improving performance.

Integrated R with SQL and NoSQL databases to extract, manipulate, and analyze large datasets efficiently.

Conducted statistical modeling and hypothesis testing within Jupyter Notebook to drive data-driven decision-making.

Leveraged Python-based NLP techniques to analyze textual data, extract meaningful insights, and support sentiment analysis projects.
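As a toy version of the sentiment-analysis work above: a lexicon-based scorer in stdlib Python. The word lists are hypothetical stand-ins; production pipelines would use a trained model or a full lexicon, but the tokenize-count-score mechanics are the same.

```python
import re
from collections import Counter

# Tiny hypothetical sentiment lexicon.
POSITIVE = {"great", "helpful", "fast", "love"}
NEGATIVE = {"slow", "broken", "confusing", "hate"}

def sentiment_score(text):
    """Return (score, token_counts): +1 per positive token, -1 per negative."""
    tokens = re.findall(r"[a-z']+", text.lower())
    counts = Counter(tokens)
    score = sum(counts[w] for w in POSITIVE) - sum(counts[w] for w in NEGATIVE)
    return score, counts

score, _ = sentiment_score("Love the new dashboard, but the export is slow.")
# score -> 0  (one positive token, one negative token)
```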

Utilized IBM SPSS Syntax to automate data preparation, statistical analysis, and reporting processes for improved efficiency.

Conducted financial, statistical, and trend analyses using Excel functions such as VLOOKUP, INDEX-MATCH, SUMIF, and conditional formatting.

Monitored data processing and analytics workflows using Azure Monitor, Azure Log Analytics, and Application Insights to detect anomalies and improve performance.

Identified trends, patterns, and user segmentation insights using Google Analytics to support targeted marketing strategies.

Developed interactive dashboards and reports in Tableau to visualize key business metrics, enabling data-driven decision-making.

Designed and implemented Tableau workbooks using Data Blending techniques to integrate and analyze data from multiple sources, ensuring comprehensive insights.

Environment: Tableau Desktop, R, SQL, NoSQL, Hadoop, ETL, PostgreSQL, Azure Data Factory, Data Lake Storage, MySQL, IBM SPSS Syntax, Jupyter Notebook, Excel, Google Analytics, Azure Monitor, Log Analytics, Application Insights, Python, NLP, Oracle, SAS.

Client: Tata AIA Life Insurance, Mumbai, India Jul 2017 – Sep 2019

Role: Data Analyst

Responsibilities:

Developed and optimized Python scripts to automate data extraction, transformation, and loading (ETL) from multiple sources, improving efficiency and data accuracy.
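A compressed sketch of such an ETL script follows: extract from a CSV feed, drop records with missing values and cast types, then load into a relational target. The `policies` schema, column names, and sample rows are hypothetical, and SQLite stands in for the production database.

```python
import csv
import io
import sqlite3

# Extract: a CSV stand-in for one of the source feeds (columns are hypothetical).
raw = io.StringIO("policy_id,premium\nP001,1200\nP002,\nP003,950\n")

# Transform: parse rows, drop records with missing premiums, cast types.
records = [
    (row["policy_id"], float(row["premium"]))
    for row in csv.DictReader(raw)
    if row["premium"].strip()
]

# Load: write the cleaned rows into a relational target.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE policies (policy_id TEXT PRIMARY KEY, premium REAL)")
conn.executemany("INSERT INTO policies VALUES (?, ?)", records)
total = conn.execute("SELECT COUNT(*), SUM(premium) FROM policies").fetchone()
# total -> (2, 2150.0): the row with a missing premium was filtered out
```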

Developed and optimized complex SQL queries on Amazon Redshift and AWS RDS to extract, transform, and load data for business intelligence reporting and data visualization.

Implemented indexing, partitioning, and performance optimization techniques in Oracle databases to improve query execution times.

Developed user access controls and role-based security mechanisms in MySQL, ensuring compliance with data privacy regulations.

Managed Hadoop cluster configurations, resource allocation, and job scheduling to maximize system performance and efficiency.

Applied machine learning algorithms in R, including clustering, classification, and anomaly detection, to derive actionable insights from business data.

Designed and implemented SAS macros, stored processes, and automated scripts to streamline data processing and reporting workflows.

Configured Amazon QuickSight dashboards and visualizations by integrating data from AWS Redshift, AWS S3, and Athena, providing actionable insights to stakeholders.

Developed and maintained Tableau dashboards with interactive filters, drill-downs, and dynamic visualizations to provide actionable insights.

Provided Excel training and documentation to team members, ensuring proficiency in data analysis, visualization, and automation techniques.

Environment: AWS, SQL, Redshift, RDS, Tableau, Python, ETL, R, Machine Learning, Amazon QuickSight, S3, Athena, MySQL, Role-Based Security, Oracle, Hadoop, Resource Allocation, Excel, SAS, Macros.

EDUCATION:

Jawaharlal Nehru Technological University, Hyderabad, India

Bachelor's in Computer Science June 2013 - May 2017


