
Data Analyst - Power BI, Tableau, SQL, Python Expert

Location:
United States
Posted:
December 09, 2025


Resume:

Data Analyst

Name: V Sai Harsha

Email: **********@*****.***

Phone No: +1-469-***-****

LinkedIn: www.linkedin.com/in/sharshv02

Professional Summary:

Results-driven Data Analyst with around 5 years of experience in designing, developing, and optimizing interactive Power BI and Tableau dashboards, reports, and data models to drive business decisions.

Collaborate with business stakeholders to gather, analyze, and document detailed business and technical data requirements.

Skilled in DAX, Power Query (M language), and advanced data modeling (Star & Snowflake schema) for building scalable, high-performance analytics solutions.

Experienced Data Analyst with a strong foundation in financial analytics, banking operations, and internal systems, applying computer science and mathematical principles to drive business insight and operational efficiency.

Skilled in performing exploratory data analysis (EDA) to identify trends, patterns, and anomalies, enabling data-driven decision-making across financial and operational domains.
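A minimal pandas sketch of the kind of EDA described in the bullet above; the dataset, column names, and the 2-standard-deviation anomaly threshold are illustrative assumptions, not taken from any actual engagement.

```python
import numpy as np
import pandas as pd

# Small synthetic "transactions" dataset standing in for a financial feed.
df = pd.DataFrame({
    "month": pd.period_range("2024-01", periods=6, freq="M").astype(str).tolist() * 2,
    "branch": ["East"] * 6 + ["West"] * 6,
    "amount": [100, 110, 105, 500, 115, 120, 90, 95, 92, 88, 94, 91],
})

# Trend: average amount per branch per month.
trend = df.groupby(["branch", "month"])["amount"].mean().reset_index()

# Anomaly flag: amounts more than 2 standard deviations from the branch mean.
stats = df.groupby("branch")["amount"].agg(["mean", "std"])
df = df.join(stats, on="branch")
df["is_anomaly"] = (df["amount"] - df["mean"]).abs() > 2 * df["std"]

print(df.loc[df["is_anomaly"], ["branch", "month", "amount"]])
```

The same profile-then-flag pattern scales to real financial datasets by swapping the synthetic frame for a SQL or data-lake extract.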

Highly motivated and detail-oriented professional with excellent verbal communication skills, known for initiating process improvements and executing thorough analytical reviews in fast-paced financial environments.

Hands-on expertise in Microsoft Fabric, Azure Synapse, and Azure Data Lake for cloud-based data analytics, and in Direct Lake and DirectQuery modes for real-time reporting.

Advanced skills in Microsoft Excel (Pivot Tables, VLOOKUP, XLOOKUP, VBA) for ad-hoc data validation, reconciliation, and analytics.

Proficient in Microsoft Word and PowerPoint for documentation, executive presentations, and analytical storytelling.

Create and maintain source-to-target mapping documents, data flow diagrams, and technical specification documents.

Experienced in Alteryx for data preparation, workflow automation, and blending large datasets from multiple sources.

Strong proficiency in SQL, Python, and R for data analysis, automation, and predictive modeling.

Experienced in integrating Power BI with Power Apps, Power Automate, Databricks, Informatica and Snowflake to deliver end-to-end business intelligence solutions.

Adept at leveraging AI-powered visuals (Key Influencers, Decomposition Tree, Smart Narratives, Anomaly Detection) to uncover insights and trends.

Proven ability to work in Agile, Scrum environments with strong collaboration, requirement gathering, and stakeholder management skills.

Adept at automating workflows with Power Automate and PowerShell, integrating APIs, and ensuring scalability and performance in BI solutions.

Solid experience implementing Row-Level Security (RLS) and security groups to enforce data access restrictions in Power BI.

Perform data profiling, reconciliation, and root cause analysis to ensure data accuracy and completeness.

Worked on defining transformation logic in ELT pipelines using Azure Data Factory, SSIS, and Informatica.

Experienced in writing and testing SQL statements, stored procedures, functions, triggers, and packages; proficient in Snowflake, Teradata, SQL Server, and Oracle.

Proficient in building Analysis Services reporting models and developing visual reports, KPI scorecards, and dashboards using Power BI Desktop.

Skilled in workflow automation using Power Automate and PowerShell, integrating APIs, and optimizing enterprise BI solutions.

Experienced in writing complex SQL queries using stored procedures, common table expressions (CTEs), and temporary tables to support Power BI.
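For illustration, the CTE pattern mentioned above can be sketched with an in-memory SQLite database standing in for SQL Server or Snowflake; the table and column names are invented for the example.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, amount REAL);
    INSERT INTO sales VALUES ('East', 100), ('East', 200), ('West', 50);
""")

# A CTE that pre-aggregates by region -- the shape of query often used
# to feed a Power BI dataset.
query = """
WITH region_totals AS (
    SELECT region, SUM(amount) AS total
    FROM sales
    GROUP BY region
)
SELECT region, total
FROM region_totals
ORDER BY total DESC;
"""
rows = conn.execute(query).fetchall()
print(rows)  # [('East', 300.0), ('West', 50.0)]
```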

Provide hands-on training and documentation for business users during UAT and post-go-live phases.

Used Collibra (or Collibra-style data governance tools) to maintain a centralized repository of data definitions, lineage, and ownership information.

Assisted in data curation workflows, ensuring critical data elements (CDEs) were properly defined, reviewed, approved, and published.

Proficient in BI ecosystems including Power BI, Tableau, and SSRS, with expertise in KPI dashboards, scorecards, and advanced data visualization.

Knowledge of HIPAA and GDPR compliance standards in handling sensitive healthcare and financial data.

Created reports in the Power BI preview portal utilizing SSAS Tabular via the Analysis Services connector.

Proficient in building report visualizations using custom visuals such as bar charts, pie charts, line charts, cards, slicers, and maps, and in applying transformations inside the Power Query Editor to clean up the data.

Technical Skills:

Business Intelligence & Visualization: Power BI (Desktop, Service, Dataflows, DAX, Power Query M), Tableau, Power Pivot, Power View, Power Map, Excel (Pivot Tables, XLOOKUP, VBA), Power Automate

ETL / Data Integration: Alteryx, SSIS, Informatica, Azure Data Factory, Azure Synapse, Snowflake, SQL Server, Oracle, Teradata

Databases & Query Languages: SQL, T-SQL, PL/SQL, MDX, DAX

Programming & Scripting: Python (Pandas, NumPy, Matplotlib), PowerShell, VBA

Cloud & Data Platforms: Microsoft Fabric, Azure (Logic Apps, Functions, Storage, Service Bus Queues), AWS (Basics), GCP (Basics)

Data Modeling & Governance: Star/Snowflake Schema Design, Data Mapping, Row-Level Security (RLS), Data Quality Management, Data Migration, GDPR & HIPAA Compliance

Project Management & Collaboration: Agile, Jira, Confluence, Microsoft Visio, Word, PowerPoint

Professional Experience:

Client: Fifth Third Bank, Ohio, USA

July 2024 - Present

Role: Sr. Data Analyst

Responsibilities:

Involved in sprint planning sessions and sizing the user stories in Agile environment.

Used DAX (Data Analysis Expressions) & MDX functions for the creation of calculations and measures in the Tabular Mode & multi-dimensional Cubes.

Worked with Google Cloud Platform (GCP) services such as BigQuery, Google Cloud Storage (GCS), and Compute Engine to support data analytics and reporting workflows.

Configured Power Automate workflows for scheduled dataset refreshes, minimizing manual intervention and ensuring up-to-date reports.

Coordinated with external data vendors and internal data providers to resolve data quality issues.

Partnered with Data Stewards and SMEs to identify data risks, classify sensitive data, and ensure regulatory compliance (GDPR, HIPAA, SOX).

Expertise in creating Business Requirements Documents (BRD), Functional Specifications (FSD), and Technical Specifications (TSD).

Designed and maintained golden records by consolidating data from multiple source systems and resolving duplicates using data matching and survivorship rules.

Led Master Data Management (MDM) initiatives to standardize and govern critical entities such as Customer, Product, Account, and Location data across multiple enterprise systems.

Hands-on experience with RSA Archer GRC platform navigation, configuration, and module customization for risk and compliance initiatives.

Improved governance efficiency by streamlining metadata approval processes and reducing manual effort in data documentation tasks.

Strong hands-on experience with source-to-target mapping (STM) documents.

Partnered with business units to translate vague stakeholder questions into structured problem statements and testable hypotheses aligned with strategic KPIs.

Conducted comprehensive exploratory data analysis (EDA) on large-scale financial datasets to uncover key business insights, detect anomalies, and inform strategy.

Integrated data from GCS and BigQuery into Power BI for enterprise reporting and advanced analytics.

Translated complex data findings into clear, actionable recommendations for stakeholders in financial services and banking operations through interactive dashboards and reports.

Utilized internal systems and data pipelines to streamline reporting processes, significantly improving accuracy and reducing turnaround time.

Created and maintained metadata documentation, including data definitions, business rules, source systems, and transformation logic.

Contributed to building and maintaining an enterprise data catalog to document data assets, business definitions, ownership, and usage details.

Collaborated across departments to support cross-functional initiatives aimed at improving financial forecasting and risk management.

Configured Archer use cases, custom fields, workflows, and questionnaires to support enterprise risk management processes.

Provided user training sessions on new dashboards, data models, and reporting tools.

Performed data validation, cleansing, and optimization on GCP-hosted datasets using Python and SQL.

Applied mathematical models and statistical techniques to evaluate investment performance, lending activity, and customer behavior trends.

Strong understanding of GRC frameworks (IRM, ERM, Compliance, Audit & Controls) in regulated environments.

Designed controlled experiments and A/B tests to validate product, process, and marketing hypotheses with statistically significant lift measurement.

Partnered with IT teams to optimize data extraction processes from legacy systems, enhancing the reliability and scalability of analytical workflows.

Supported risk assessments, control testing, and compliance tracking initiatives for enterprise systems.

Maintained a high level of thoroughness and consistency in data validation and documentation to ensure regulatory and compliance standards were met.

Regularly communicated findings with both technical and non-technical stakeholders, facilitating data-informed decisions through clear verbal communication and visual storytelling.

Experience in Data Analysis, Data Migration, Data Cleansing, Transformation, Integration, and ETL using SSIS and Alteryx workflows.


Implemented row-level security and Star/Snowflake schemas, optimizing data models for scalability and performance.

Created basic interactive dashboards using Power BI to visualize data from Excel, SQL Server, and Microsoft Fabric sources.

Converted functional requirements into technical specifications and performed required data analysis, data mapping, functional testing, unit testing, and test case preparation using required tools based on project needs.

Implemented data standardization, de-duplication, and validation rules to improve master data accuracy, consistency, and reliability.

Used Power BI, Power Pivot to develop data analysis prototypes, and used Power View and Power Map to visualize reports.

Configured Power Automate workflows and Power Apps solutions for user-driven report input forms, automating data collection and feedback loops.

Integrated Power BI dashboards with Azure Data Lake Storage Gen2, optimizing access to raw and curated datasets for near-real-time insights.

Designed and automated GRC workflows for risk lifecycle management including intake, review, approval, and resolution.

Implemented scheduled data refreshes using Power Automate and Data Lake triggers for continuous data ingestion.

Collaborated with product teams to embed Power BI visuals within Power Apps interfaces for operational monitoring.

Developed multi-stage workflows with conditional logic, user assignments, and escalation rules.

Developed predictive models such as linear/logistic regression, random forest, and gradient boosting to forecast customer behavior, financial outcomes, and process performance.
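A hedged scikit-learn sketch of the modeling workflow named above (logistic regression and random forest); synthetic data from make_classification replaces any real customer or financial features.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic binary-classification data standing in for customer-behavior features.
X, y = make_classification(n_samples=500, n_features=8, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Fit both model families and compare holdout accuracy.
accs = {}
for model in (LogisticRegression(max_iter=1000),
              RandomForestClassifier(n_estimators=100, random_state=42)):
    model.fit(X_train, y_train)
    accs[type(model).__name__] = model.score(X_test, y_test)

print(accs)
```

Gradient boosting slots into the same loop via sklearn's GradientBoostingClassifier.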

Leveraged Excel (Pivot Tables, VLOOKUP, XLOOKUP, VBA) for preliminary analysis, QA validation, and reconciliation of Power BI datasets.

Designed modular, version-controlled SQL transformation layers in DBT (staging, fact, dimension, and mart models) to support repeatable analytics.

Implemented DBT tests for schema consistency, unique keys, freshness checks, and referential integrity to ensure high data quality.

Developed Informatica mappings for ETL transformations before loading into Azure Synapse and Power BI.

Implemented row-level security (RLS) for user-specific data access and ensured compliance with HIPAA and GDPR regulations.

Spearheaded migration of legacy SSRS reports to Power BI, improving visualization capabilities & reducing maintenance overhead.

Integrated large datasets from Snowflake, Azure SQL, and Oracle into Power BI for unified analytics and reporting.

Used Power BI to develop data analysis prototypes and used Power View and Power Map to visualize reports.

Analyzed COVID-19 datasets, built visualizations, and recommended insights for the business using Python, SQL, and Power BI.

Implemented Azure Logic Apps, Azure Functions, Azure Storage, and Service Bus Queues for enterprise-level systems.

Performed GAP analysis between current risk processes and desired GRC framework standards.

Developed Power BI reports using time-intelligence calculations such as year-to-date (YTD), month-to-date (MTD), and same period last year.
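The YTD logic in the time-intelligence bullet above can be sketched in pandas (this is not the DAX itself, just the same running-total-within-year idea on an invented sales frame):

```python
import pandas as pd

df = pd.DataFrame({
    "date": pd.to_datetime(["2024-01-31", "2024-02-29", "2024-03-31",
                            "2025-01-31", "2025-02-28"]),
    "sales": [10, 20, 30, 5, 15],
})

# YTD = cumulative sum within each calendar year, resetting at year boundaries,
# analogous to DAX's TOTALYTD over a date table.
df["ytd_sales"] = df.groupby(df["date"].dt.year)["sales"].cumsum()
print(df)
```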

Documented requirements, user guides, and sprint deliverables using Microsoft Word, presented analysis insights to leadership through PowerPoint decks.

Participated in sprint planning and backlog refinement sessions, tracking development tasks and user stories in Jira.

Used Jira dashboards and reports to monitor progress, identify blockers, and align with stakeholders on project milestones.

Identified system process gaps and recommended mitigation strategies and control improvements.

Ensured reporting and data processes aligned with SOX, GDPR, HIPAA, and internal governance requirements.

Environment: Power BI Desktop, Power BI Service, Oracle, Alteryx, Informatica, Azure Synapse, Microsoft Fabric, SQL Server, Snowflake, Excel (Pivot Tables, VBA), Power Automate, Python, Tableau, Azure Logic Apps, Azure Functions, Service Bus, Microsoft Word, PowerPoint, Jira

Client: The Hartford, Connecticut, USA

April 2023-June 2024

Role: Data Analyst

Responsibilities:

Worked with Business users to gather requirement specifications for new dashboards.

Communicating with the users about their requirements, converting the requirements into functional specifications and developing advanced visualizations.

Supported user access control, role-based permissions, and data security settings in RSA Archer.

Performed data preprocessing, feature engineering, and model validation using cross-validation, confusion matrix, ROC/AUC, and lift analysis.

Defined analytical scopes, success metrics, constraints, and assumptions before solution development to ensure measurable business outcomes.

Designed and enforced data governance workflows for master data lifecycle management, including onboarding, change requests, approvals, version control, and retirement of master records.


Built and maintained relationship mappings between business processes, data domains, and technical assets, enabling impact analysis for data changes and regulatory reporting.

Partnered with business and data engineering teams to define MDM policies, ownership models, and data stewardship procedures.

Established end-to-end data lineage from source systems to BI and reporting layers, improving data transparency and traceability.

Created and maintained Archer dashboards and reports to visualize risk posture, compliance status, and control effectiveness.

Implemented basic security and access controls (IAM roles) to ensure secure access to GCP datasets.

Assisted in designing cloud-based data pipelines on GCP using scheduled loads and automated data ingestion processes.

Designed and published Tableau dashboards alongside Power BI reports for executive stakeholders.

Used Python extensively for data wrangling, feature engineering, and automating report validations.

Migrated legacy ETL logic from SSIS to Informatica and Alteryx to streamline data integration.

Wrote DAX expressions to create new measures, calculations per business requirements.

Expert-level experience in MS Excel including Power Query, Power Pivot, Lookups, and Pivot Tables, and strong expertise in writing SQL queries and BI reporting.

Built structured workflows to support policy management, issue management, and exception handling processes.

Published the developed dashboards and reports to the Power BI Service so that end users can view the data.

Participated in the configuration of an on-premises Power BI gateway to refresh datasets of Power BI reports and dashboards.

Monitored and improved master data quality using profiling, exception reporting, and automated remediation workflows.

Conducted hypothesis testing with p-values, significance thresholds, and confidence intervals to ensure reliable decision-making.
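A minimal SciPy sketch of the hypothesis-testing workflow described above; the two samples are synthetic stand-ins (e.g., a control and treatment metric), and the effect size and 0.05 threshold are illustrative assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
control = rng.normal(loc=100, scale=10, size=200)
treatment = rng.normal(loc=105, scale=10, size=200)

# Two-sample t-test for a difference in means.
t_stat, p_value = stats.ttest_ind(treatment, control)

# 95% confidence interval for the mean difference (normal approximation).
diff = treatment.mean() - control.mean()
se = np.sqrt(treatment.var(ddof=1) / 200 + control.var(ddof=1) / 200)
ci_95 = (diff - 1.96 * se, diff + 1.96 * se)

print(f"t={t_stat:.2f}, p={p_value:.4f}, 95% CI={ci_95}")
```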

Assisted in identifying and managing operational, compliance, and third-party risks.

Built Excel-based tools using Pivot Tables, VLOOKUP, XLOOKUP, and VBA macros for quick insight generation.

Wrote and optimized complex T-SQL queries to extract, transform, and load (ETL) data from diverse sources into Power BI for analysis and visualization.

Enhanced visibility by linking workflows with dashboard reporting and audit trails.

Built multifunction readmission reports using Python pandas and the Django framework.

Wrote standard & complex SQL Queries to perform data validation and graph validation to make sure test results matched back to expected results based on business requirements.

Wrote complex SQL queries in Microsoft Access and created various joins (inner and outer) to fetch the desired output for data analysis.

Developed integrated Power BI and Power Apps dashboards, allowing business users to trigger workflows directly from reports.

Assisted in Archer platform enhancements, data imports/exports, and system optimization for improved performance and usability.

Supported data discovery and data literacy initiatives by enabling easier access to certified and trusted datasets.

Performed data validation, cleansing, and optimization on GCP-hosted datasets using Python and SQL.

Conducted impact analysis for regulatory and policy changes affecting data and reporting systems.

Calculated sample size, power, confidence intervals, and test duration to ensure valid experiment design.
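The sample-size calculation mentioned above can be sketched with statsmodels' power analysis; the small effect size (Cohen's d = 0.2), alpha, and target power are conventional illustrative values, not figures from any actual experiment.

```python
from statsmodels.stats.power import TTestIndPower

# Solve for the per-group sample size of a two-sided, two-sample t-test
# given an assumed effect size, significance level, and target power.
n_per_group = TTestIndPower().solve_power(
    effect_size=0.2,  # Cohen's d (assumed small effect)
    alpha=0.05,       # significance threshold
    power=0.8,        # desired probability of detecting the effect
)
print(round(n_per_group))
```

Halving the effect size roughly quadruples the required sample, which is why pinning down the minimum detectable effect up front matters for test duration.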

Pulled and processed structured, unstructured data from Azure Data Lake and Snowflake for downstream analytics.

Published dashboards and managed refresh schedules in Power BI Service using Data Lake as backend storage.

Created data storytelling presentations in PowerPoint for executive reporting and decision-making sessions.

Ensured compliance with HIPAA and GDPR data standards in all reporting and user-access management processes.

Automated routine administrative tasks using PowerShell scripts to improve deployment efficiency.

Used various sources to pull data into Power BI, such as SQL Server, Excel, and cloud sources including Azure SQL.

Used DAX (Data Analysis Expressions) functions for the creation of calculations and measures in the Tabular Models.

Created effective reports using visualizations such as Bar chart, Clustered Column Chart, Waterfall Chart, Gauge, Pie Chart, Tree map, KPI in Power BI.

Strong experience on connecting various Data Sources in Power BI.

Created various data models in Power BI, linking tables across multiple dimensions.

Reverse engineered SSRS reports and converted them into Power BI reports.

Leveraged a broad stack of technologies including Python, Docker, AWS, Airflow, and Spark to reveal insights hidden within huge volumes of numeric and textual data.

Logged, tracked, and resolved issues in Jira to ensure smooth project delivery.

Collaborated with QA teams using Jira workflows to manage bug tracking, testing, and resolution.

Developed and maintained multiple Power BI dashboards, reports and content packs.

Configured automatic and scheduled refresh in the Power BI Service.

Environment: Power BI Desktop, Power BI Service, Tableau, Alteryx, Informatica, SQL Server, Snowflake, Excel (Pivot Tables, XLOOKUP, VBA), Python, PowerShell, SSRS, SSIS, Azure Cloud, Microsoft Word, PowerPoint, Jira

Client: Bafna Pharmaceuticals Ltd, Chennai, India

March 2021 - Dec 2022

Role: BI Developer (Power BI, Tableau)

Responsibilities:

Involved in designing physical and logical data models using Erwin Data modeling tools.

Designed the relational data model for the operational data store and staging areas; designed dimension and fact tables for data marts.

Conducted root-cause analysis using data segmentation, trend evaluation, and scenario modeling to explain what is happening, why it is occurring, and recommended actions.

Performed data analysis and data profiling using complex SQL on various source systems including Oracle and Teradata.

Developed complex SSAS cubes with multiple fact measures and multiple dimension hierarchies based on the OLAP reporting needs.

Analyzed experiment output using statistical tests such as t-tests, chi-square, ANOVA, and Bayesian evaluation to quantify impact.

Built Tableau dashboards for pharmaceutical sales trends and compliance reporting.

Leveraged Informatica workflows for ETL processes across Oracle and Teradata environments.

Wrote automation scripts in Python to validate data quality and generate ad-hoc reports.

Developed, deployed, and monitored SSIS Packages including upgrading DTS to SSIS.

Responsible for identifying and defining Data Source and Data Source views.

Extensively utilized ETL to transform data from source files (Flat files and DB2) and load the data into the target database.

Built analytical frameworks to quantify effect size, variance, causality, and sensitivity across business processes and product performance.

Built and executed BigQuery SQL queries for data extraction, transformation, aggregation, and validation.

Planned and created integrity constraints, database triggers, and indexes to maintain data integrity and improve performance.

Used Advanced Queuing for exchanging messages and communicating between different modules.

Performed system analysis and design for enhancements; tested forms, reports, and user interaction.

Deployed ML pipelines in Python using libraries such as Pandas, Scikit-learn, and Statsmodels for repeatable and scalable analysis.

Utilized Google BigQuery to run large-scale analytical queries on structured and semi-structured datasets for reporting and trend analysis.

Integrated DBT with data warehouse environments (e.g., Snowflake, Synapse) to streamline ELT pipelines and reduce manual transformation effort.

Automated documentation and lineage tracking using DBT to provide transparency and visibility across data models and dependencies.

Environment: Power BI, Tableau, Alteryx, Informatica, Oracle, Teradata, SQL Server, Excel (Pivot Tables, VBA), Python, SSIS, SSAS, PL/SQL, Erwin, TOAD, Microsoft Word, PowerPoint

Education:

Master of Science in Computer and Information Science

Southern Arkansas University, Magnolia, Arkansas


