
Data Analyst

Flower Mound, TX

Bhanu Prakash

945-***-****

****************@*****.***

PROFESSIONAL SUMMARY

Highly motivated Business Intelligence Developer / Data Engineer / Data Analyst with over 4 years of experience designing and delivering end-to-end data and analytics solutions in cloud and on-premises environments. Strong expertise in building scalable data pipelines, data visualizations, and analytics platforms using modern tools such as Python, SQL, Alteryx, Databricks, Tableau, Power BI, AWS, and GCP.

Involved in all phases of the Software Development Life Cycle (SDLC), including analysis, design, testing, implementation, and maintenance.

Extensive experience with Tableau, Google Analytics, Looker Studio, and Power BI for building interactive dashboards, executive reports, and data stories using LOD expressions, calculated fields, parameters, sets, groups, and filters.

Experience working with Snowflake, MongoDB, Power BI workspaces, and Power BI paginated reports, as well as embedding Power BI and Tableau reports into user interfaces.

Managed Tableau Server including user access, site management, content governance, and performance optimization.

Performed data validation, data blending, and action filters using SQL queries, and created empty extracts to shift the performance load onto Tableau Server rather than Tableau Desktop.

Expertise in data profiling, extraction, transformation, loading, and visualization using Tableau.

Administered Power BI Service, overseeing security, workspace management, usage monitoring, and tenant-level configuration.

Performed advanced analytics using DAX, Power Query (M), and complex data modeling.

Developed scalable ETL/ELT pipelines using Databricks (PySpark), AWS Glue, and GCP Dataflow.

Built cloud-based data lakes and warehousing solutions using AWS Redshift, S3, Athena, and GCP BigQuery.

Proficient in implementing data workflows using BigQuery, Dataflow, and Cloud Composer, enabling scalable and cost-effective cloud-native data solutions.
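
A minimal sketch of one building block of such a workflow, querying BigQuery from Python; the project, dataset, and table names are placeholders, and default application credentials are assumed:

    # Illustrative BigQuery query from Python; names below are hypothetical.
    from google.cloud import bigquery

    client = bigquery.Client()  # assumes default application credentials

    sql = """
        SELECT region, COUNT(*) AS n
        FROM `example_project.analytics.events`
        GROUP BY region
    """
    for row in client.query(sql).result():
        print(row["region"], row["n"])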

Built data integration, workflow, and ETL solutions for both cloud and on-premises data warehouses using DataStage, Data Manager, CP4D, SSIS, Databricks, Tableau Prep, and Alteryx, integrating structured and semi-structured data.

Extensive work with data from SQL Server, Snowflake, Oracle, Teradata, MongoDB, Hadoop, JSON, Excel, and flat files.

Excellent knowledge of RDBMS concepts and constructs, along with creating database objects such as tables, user-defined data types, indexes, stored procedures, views, user-defined functions, cursors, and triggers.

Experience in writing common table expressions (CTEs), user-defined functions, indexed views, and DML/DDL triggers for data consistency and data manipulation.
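
A small illustration of a CTE executed from Python; SQLite is only a stand-in engine for the sketch, and the orders table and its columns are made up:

    # Run a CTE from Python against an in-memory SQLite database.
    import sqlite3
    import pandas as pd

    conn = sqlite3.connect(":memory:")
    pd.DataFrame({"customer_id": [1, 1, 2],
                  "amount": [10.0, 5.0, 7.5]}).to_sql("orders", conn, index=False)

    query = """
        WITH customer_totals AS (
            SELECT customer_id, SUM(amount) AS total_amount
            FROM orders
            GROUP BY customer_id
        )
        SELECT * FROM customer_totals WHERE total_amount > 10;
    """
    print(pd.read_sql_query(query, conn))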

Expertise in performance tuning and query optimization, fine-tuning SQL queries for improved performance.

Built scalable ETL workflows using Databricks and Python, processing large datasets and optimizing performance using PySpark and Delta Lake for data storage.
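
A hedged sketch of this pattern: a PySpark batch job aggregating raw data and writing it to Delta Lake. Paths, table, and column names are illustrative, and a Databricks runtime with Delta support is assumed:

    # Aggregate raw orders and persist the result as a partitioned Delta table.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("etl_sketch").getOrCreate()

    orders = spark.read.parquet("s3://example-bucket/raw/orders/")  # hypothetical path
    daily = (orders
             .withColumn("order_date", F.to_date("order_ts"))
             .groupBy("order_date", "region")
             .agg(F.sum("amount").alias("total_amount"),
                  F.countDistinct("customer_id").alias("customers")))

    (daily.write.format("delta")
          .mode("overwrite")
          .partitionBy("order_date")
          .save("s3://example-bucket/curated/daily_orders/"))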

Developed end-to-end data pipelines from extraction to transformation and loading into cloud-based data warehouses using Python for automation and Databricks for distributed processing.

Utilized Apache Spark within Databricks to process big data efficiently, writing PySpark code for complex transformations and aggregations.

Built real-time data processing applications in Databricks using Spark Structured Streaming with PySpark, leveraging Databricks for seamless handling of streaming data and real-time analytics.
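
A minimal Structured Streaming sketch in the same spirit; the Kafka broker, topic, schema, and paths are assumptions, and the Spark-Kafka connector package must be available on the cluster:

    # Read JSON events from Kafka and append them to a Delta table.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import from_json, col
    from pyspark.sql.types import StructType, StringType, DoubleType

    spark = SparkSession.builder.appName("stream_sketch").getOrCreate()
    schema = StructType().add("event_id", StringType()).add("value", DoubleType())

    events = (spark.readStream.format("kafka")
              .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
              .option("subscribe", "events")                     # hypothetical topic
              .load()
              .select(from_json(col("value").cast("string"), schema).alias("e"))
              .select("e.*"))

    (events.writeStream.format("delta")
           .option("checkpointLocation", "/tmp/checkpoints/events")
           .outputMode("append")
           .start("/tmp/delta/events"))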

Designed and implemented cloud-based data architecture using Databricks, AWS, and Python for processing, storing, and analyzing large datasets in the cloud.

Developed machine learning models within Databricks using Python libraries and integrated them into production data pipelines for automated predictions and decision-making.
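
A small scikit-learn sketch of the kind of model training this involves; the feature file, columns, and churn label are illustrative only:

    # Train a classifier on a prepared feature table and report AUC.
    import pandas as pd
    from sklearn.model_selection import train_test_split
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import roc_auc_score

    df = pd.read_parquet("features.parquet")  # hypothetical feature set
    X, y = df.drop(columns=["churned"]), df["churned"]
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=42)

    model = RandomForestClassifier(n_estimators=200, random_state=42)
    model.fit(X_train, y_train)
    print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))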

CERTIFICATIONS

Tableau Desktop Specialist

AWS Certified Solutions Architect – Associate

EDUCATION

Master’s in Computer Science – Campbellsville University, 2022 – 2024

Bachelor’s in Information Technology – VVIT, 2018 – 2022

TECHNICAL SKILLS

BI & Visualization Tools

Tableau, Power BI, Looker Studio, Google Analytics, LookML, SAS Visual Analytics

ETL Tools

Alteryx Designer, IBM DataStage, IBM Cloud Pak for Data (CP4D), Tableau Prep, Apache Spark (PySpark), Databricks, GCP Dataflow, AWS Glue, Informatica

Database

SQL Server, Oracle, Snowflake, Teradata, MongoDB, Hadoop/Hive, BigQuery, SAP HANA, MariaDB

Cloud

AWS (S3, Glue, Redshift, Athena, Lambda), GCP (BigQuery, Pub/Sub, Dataflow, Cloud Storage, Cloud Composer), Azure, Databricks, Snowflake

Programming

Python (Pandas, NumPy, scikit-learn), T-SQL, PL/SQL, DAX, Power Query (M), PySpark, Shell/Unix scripting

Other Tools

Git, JIRA, MS Office (Excel, Access, Visio), Power BI Admin Tools, SQL Server Management Studio (SSMS), Tableau Admin Views, Airflow, CloudWatch, Postman, Visual Studio Code

CORE COMPETENCIES

Data Warehouse Implementation

BI/ETL Solutions Design & Delivery

Report & Dashboard Creation

Analysis, Testing, & Defect Fixing

Requirement Gathering

Modeling & Scripting

PROFESSIONAL EXPERIENCE

CROSS SENSE ANALYTICS

Client: AT&T, TX Nov 2023 – Present

Role: Data Analyst

Responsibilities:

Designed and developed advanced Tableau and Power BI dashboards to support analytics initiatives for the Enterprise Protection Plan, enabling business users to access real-time insights.

Collaborated with business stakeholders to gather and document requirements, translating legacy reporting systems into modern cloud-native BI solutions.

Led efforts in data migration and testing, transitioning data pipelines and reports from Teradata to Snowflake, ensuring accuracy and performance.

Partnered with clients to create Proofs of Concept (POCs) for analytical use cases, aligning business questions with technical solutions.

Actively contributed throughout the SDLC, working with ETL developers and DBAs to design new data models and schemas on cloud platforms.

Designed and maintained Tableau dashboards and reports using data stored in Snowflake, hosted initially on Azure cloud infrastructure.

Collaborated with cloud architects and data engineers to plan and execute the migration of Tableau Server and Snowflake workloads from Azure to AWS with minimal downtime.

Participated in the reconfiguration of data connections, authentication methods (OAuth, SAML), and extract refresh schedules during the cloud migration from Azure to AWS.

Updated Tableau data sources and workbooks to reflect changes in Snowflake endpoints and schemas post-migration.

Conducted performance benchmarking and optimization of Tableau dashboards pre- and post-migration to ensure consistent performance in AWS.

Performed extensive data validation across source and target systems using tools such as Jama, MOVEit, and DBeaver to ensure data integrity and compliance.

Utilized JIRA and Mural for agile sprint planning and backlog tracking, ensuring timely delivery of analytics features.

Conducted performance tuning of data models and dashboards using industry best practices to optimize query efficiency.

Built and maintained complex SQL and Python scripts to perform data cleansing, enrichment, and aggregation from diverse sources such as Oracle, Teradata and Excel.
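
A sketch of the kind of pandas cleansing and aggregation described here; the file names and columns are placeholders, and reading Excel assumes openpyxl is installed:

    # Clean a raw extract and aggregate usage by plan type.
    import pandas as pd

    raw = pd.read_excel("usage_extract.xlsx")  # hypothetical source file
    clean = (raw.dropna(subset=["account_id"])
                .assign(usage_gb=lambda d: pd.to_numeric(d["usage_gb"],
                                                         errors="coerce"))
                .query("usage_gb >= 0"))

    summary = clean.groupby("plan_type", as_index=False)["usage_gb"].sum()
    summary.to_csv("usage_by_plan.csv", index=False)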

Created and orchestrated ETL pipelines using AWS Glue and Databricks, supporting ingestion from on-premises and cloud sources.

Automated daily data processing tasks and ensured secure data movement using AWS S3, Azure Blob Storage, and secure transfer tools.

Migrated legacy ETL logic from SSIS to modern cloud-based processing workflows using PySpark and Airflow on AWS Lambda.
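
A minimal Airflow DAG sketch of the migrated orchestration layer; the DAG id, schedule, and task body are illustrative, not the actual SSIS replacement:

    # Daily DAG with a single Python task standing in for the PySpark job.
    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def run_transform():
        print("submit the PySpark job here")  # placeholder for the real step

    with DAG(dag_id="legacy_etl_replacement",
             start_date=datetime(2024, 1, 1),
             schedule_interval="@daily",
             catchup=False) as dag:
        transform = PythonOperator(task_id="transform",
                                   python_callable=run_transform)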

Built reusable components and performed schema optimization in Redshift and Snowflake, enabling high-performance querying.

Developed and published Power BI dashboards with row-level security and dynamic filters, ensuring proper governance and controlled data access.

Supported senior leadership and executives with ad hoc data requests, delivering fast insights through curated Tableau dashboards and automated datasets.

Participated in data warehousing and OLAP initiatives, including building fact/dimension tables, star/snowflake schemas, and setting up aggregates.

Administered Power BI workspaces, managed refresh schedules, configured gateways, and set up report subscriptions and user permissions.

Translated business KPIs such as call volume, network usage, and customer plans into digestible reports and visualizations to guide product decisions.

ENVIRONMENT: Tableau, Power BI, Snowflake, AWS Glue, AWS S3, AWS Lambda, Redshift, Azure, Databricks, Oracle, Teradata, SQL, Python, PySpark, HiveQL, SparkSQL, Jama, DBeaver, MOVEit, JIRA, Mural.

Client: MD Anderson Cancer Center, TX Jan 2023 – Nov 2023

Role: Data Analyst

Responsibilities:

Worked with data related to programs and provider- and patient-driven kidney care solutions.

Spearheaded the end-to-end migration of the data environment from legacy platforms to Databricks, ensuring seamless transition and performance optimization.

Recreated and modernized existing data pipelines and tables using PySpark and Python in Databricks, importing datasets via CSV files, federated queries, and advanced scripting techniques.

Re-engineered legacy ETL workflows by replacing outdated processes with Alteryx workflows for robust data preparation and transformation.

Developed and deployed interactive Tableau dashboards and Power BI reports to support provider and patient-driven kidney care solutions.

Developed AI/ML-powered endpoints for care navigation by designing models and integrating with front-end services to automate routine queries using prompt engineering and MLOps workflows.

Ensured compliance with all HIPAA guidelines, maintaining data security and integrity for sensitive patient and provider information.

Collaborated with healthcare providers such as Cigna, HNE, Molina, Cambia, FLB, and WellCare to track critical patient-doctor interactions and deliver analytics-based insights.

Performed extensive data validation, unit testing, and quality checks to meet business and clinical data requirements using GCP BigQuery, Databricks, and SQL.
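
A simple reconciliation sketch of the sort of source-versus-target check this validation involves; file names and columns are assumptions:

    # Compare per-day row counts between source and target extracts.
    import pandas as pd

    src = pd.read_csv("source_counts.csv")  # columns: load_date, row_count
    tgt = pd.read_csv("target_counts.csv")

    merged = src.merge(tgt, on="load_date", suffixes=("_src", "_tgt"))
    mismatches = merged.query("row_count_src != row_count_tgt")
    print("mismatched days:", len(mismatches))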

Utilized Python scripting to automate Tableau data sources, refreshes, and execute stored procedures for back-end data management.
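
A hedged sketch of triggering a Tableau extract refresh with the tableauserverclient library; the server URL, token, site, and data source name are placeholders:

    # Sign in with a personal access token and refresh one data source.
    import tableauserverclient as TSC

    auth = TSC.PersonalAccessTokenAuth("token-name", "token-value",  # hypothetical
                                       site_id="analytics")
    server = TSC.Server("https://tableau.example.com", use_server_version=True)

    with server.auth.sign_in(auth):
        datasources, _ = server.datasources.get()
        target = next(d for d in datasources if d.name == "Patient Metrics")
        server.datasources.refresh(target)  # queues an extract refresh job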

Conducted performance tuning, optimization, and ad-hoc troubleshooting on complex SQL queries and views to enhance reporting efficiency.

Created and maintained database objects (tables, procedures, views, indexes) in GCP BigQuery and Databricks SQL for analytics enablement.

Participated in daily Agile sprints, contributing to sprint planning, retrospectives, and user story refinement using JIRA.

Delivered over 120 custom Tableau reports and enhanced user experience through dynamic parameters, calculated fields, and scheduled extract refreshes.

Acted as Tableau Server Administrator, managing user roles, permissions, data security, and content deployment across projects.

Continuously engaged in data issue root cause analysis by writing complex SQL queries and working closely with cross-functional teams.

Supported system integration testing (SIT) and user acceptance testing (UAT), ensuring accuracy and completeness of healthcare data models.

ENVIRONMENT: Tableau, Power BI, Alteryx Designer, Databricks, GCP BigQuery, GCP Cloud Storage, GCP Dataflow, Python, PySpark, SQL, Jupyter Notebooks, DBeaver, JIRA, CSV/Flat Files, Git, HIPAA-Compliant Environments, Visio.

Client: US Pharmacopeia, India Feb 2021 – June 2022

Role: Data Analyst

Responsibilities:

Designed and developed Tableau & Looker dashboards for visualizing pharmacological data, ensuring adherence to regulatory standards.

Built dynamic dashboards in Looker Studio, modeling LookML data to create insightful reports on corporate and leisure traveler patterns.

Integrated guest survey data with Tableau and Looker, allowing management to analyze customer feedback, identify pain points, and enhance guest experiences.

Led the migration of pharmaceutical data dashboards from Looker to Tableau, ensuring a seamless transition while optimizing report performance.

Crafted interactive visualizations to effectively communicate complex pharmaceutical information to stakeholders.

Utilized Python scripts to automate data preparation processes, improving efficiency and accuracy in data handling for Tableau reports.

Performed data cleaning and transformation using Python libraries (Pandas, NumPy) to ensure high-quality datasets for Tableau visualizations.

Created advanced calculations and custom metrics in Tableau, leveraging Python for statistical analysis and predictive modeling.

Implemented Python-based solutions to integrate external data sources via APIs into Tableau, expanding data accessibility and insights.
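
One way such an API integration can look, sketched with the requests library; the endpoint and fields are hypothetical, and the resulting CSV is what Tableau would then consume:

    # Pull records from a REST API and write a CSV for Tableau.
    import csv
    import requests

    resp = requests.get("https://api.example.com/v1/compounds", timeout=30)
    resp.raise_for_status()
    rows = resp.json()

    with open("compounds.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["id", "name", "status"])
        writer.writeheader()
        for r in rows:
            writer.writerow({k: r.get(k) for k in ("id", "name", "status")})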

Optimized Tableau workbooks and data sources using Python to enhance load times and improve overall dashboard performance.

Created and maintained data pipelines to support enterprise analytics and reporting.

Designed backend solutions to process and analyze marketing and budget data efficiently.

Developed test automation suites to ensure data integrity and system reliability.

Reduced technical debt by refactoring and optimizing backend code.

Participated in code reviews and provided technical documentation for system improvements.

Built data validation frameworks to ensure accuracy and consistency across datasets.

Designed and implemented scalable data warehouses to support business intelligence needs.

Automated ETL processes to reduce manual intervention and improve efficiency.

Collaborated with stakeholders to define and implement data governance policies.

Monitored and resolved system issues, ensuring high availability and reliability of backend services.

Utilized Tableau Server as a front-end BI tool, designing and developing workbooks, dashboards, global filter pages, and complex parameter-based calculations with Oracle, DB2, BigQuery, and Microsoft SQL Server as back-end databases.

Designed an interactive Tableau dashboard to monitor drug efficacy trends over time.

Provided stakeholders with real-time insights, contributing to informed decision-making in drug development.

Scheduled and facilitated all aspects of the Scrum framework, including sprint planning sessions, backlog grooming, daily scrums, product demos, and sprint retrospectives.

Maintained the Scrum team’s capacity plan, Scrum board, sprint backlog, velocity charts, and burn-down charts.

Created groups for PHI and non-PHI access and implemented report-level security to protect PHI data in accordance with HIPAA standards.

ENVIRONMENT: Tableau Desktop, Tableau Prep, Looker Studio, LookML, SharePoint, Excel, Teradata, Snowflake, Tableau Server, Microsoft Azure SQL Data Warehouse, Alteryx, Python, Databricks.


