
Data Analyst Power Bi

Location:
Cedar Park, TX
Posted:
April 24, 2024


Navya Y

Ph: 224-***-****

Email ID: ad4808@r.postjobfree.com

LinkedIn Tableau

SUMMARY:

Qualified BI Developer / Data Scientist / Data Analyst with around 6 years of experience in data science and analytics, and domain expertise in the Telecom, IT Consulting, and Healthcare industries.

Extensively worked with business intelligence tools such as Tableau, Power BI, Domo, and AWS QuickSight to build interactive dashboards and generate reports.

Expertise in Python and SQL for developing robust prototypes and data-driven solutions.

Skilled in collaborating on open-ended projects, thriving in ambiguous situations, and uncovering hidden insights.

Demonstrated proficiency in AWS, specifically Amazon RDS, to deploy, manage, and optimize relational databases, enhancing Tableau connectivity and overall system performance.

Demonstrated proficiency in Microsoft Power BI for data visualization, leveraging its robust features to create insightful and visually compelling reports and dashboards.

Expertise in automating recurring reports in SQL and Python and turning them into interactive reports and dashboards on BI platforms such as Tableau and Power BI.
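
A minimal sketch of what such a recurring-report automation can look like in Python with SQL, using an in-memory SQLite database as a stand-in for the real source; the orders table, column names, and figures are illustrative, not from any client system.

```python
import sqlite3
import pandas as pd

def weekly_sales_report(conn):
    """Pull totals per region; table and column names are illustrative."""
    query = """
        SELECT region, SUM(amount) AS total_sales, COUNT(*) AS n_orders
        FROM orders
        GROUP BY region
        ORDER BY total_sales DESC
    """
    return pd.read_sql_query(query, conn)

# Demo on an in-memory database with sample rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("East", 120.0), ("East", 80.0), ("West", 150.0)])
report = weekly_sales_report(conn)
print(report)
```

In practice the resulting frame would be written out on a schedule (e.g. to Excel or a BI extract) rather than printed.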

Conducted thorough analysis of user interaction data, identifying patterns and trends for strategic decision-making.

Leveraged strong Power BI and Tableau development skills to create visually impactful, interactive dashboards, meeting and exceeding business requirements for data visualization.

Proficient in implementing best practices for data visualization design, encompassing layout, color theory, and interactivity. Skilled in creating visually compelling and intuitive data representations that enhance comprehension and insight.

Developed and maintained databases and data systems, reorganizing data into a user-friendly format for enhanced accessibility.

Developed comprehensive data flow diagrams to visualize the flow of information within complex systems, facilitating clear communication and understanding of data processes among cross-functional teams.

Highlighted proficiency in SAS programming for data manipulation, statistical analysis, and report generation, showcasing how SAS skills enhanced the efficiency and depth of the analysis process.

Provided code snippets or examples illustrating the use of SAS procedures and functions alongside Python code to demonstrate the synergy between the two platforms in addressing the project objectives.

Played a pivotal role in systematizing data processes by meticulously crafting data flow diagrams, which significantly contributed to the clarity and coherence of research projects undertaken within the academic department.

Engineered Python-based applications to update and extend local databases, ensuring data integrity and scalability while optimizing data management processes.

Utilized expertise in PostgreSQL for extract, transform, and load (ETL) processes, ensuring seamless integration of diverse data sets into Tableau for comprehensive analysis.

Strong skills in statistical methodologies such as A/B testing, ANOVA, and hypothesis testing.
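
As an illustration of the hypothesis-testing side, a Welch's two-sample t-test with SciPy; the samples below are made-up values, not project data.

```python
from scipy import stats

# Illustrative per-user metric (e.g. minutes) for two variants; values are invented.
control   = [10, 11, 9, 10, 12, 10, 11, 9, 10, 11]
treatment = [13, 14, 12, 15, 13, 14, 12, 13, 14, 13]

# Welch's t-test (no equal-variance assumption) for a difference in means.
t_stat, p_value = stats.ttest_ind(treatment, control, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.6f}")
```

A small p-value here would indicate the treatment mean differs from control beyond what chance would explain.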

Skilled in performing data cleaning, preprocessing, data visualization, and feature engineering in Python and R.

Developed comprehensive reports utilizing Microsoft SQL Server Reporting Services (SSRS) to provide actionable insights to key stakeholders.

Designed and implemented complex SQL queries to extract and manipulate data from relational databases for report generation.

Expertise in Python and data analysis libraries such as pandas, NumPy, Matplotlib, Seaborn, BeautifulSoup, and scikit-learn.

Developed extensive data queries and scripts to generate data dumps to Excel spreadsheets in support of a data modeling effort.

TECHNICAL SKILLS

Programming Languages: SQL, Python (pandas, Matplotlib, NumPy, SciPy, BeautifulSoup, SQLAlchemy, requests), R (rpart, ggplot2, tidyverse, dplyr, caret, plotly, xgboost), Java, HTML, CSS, C++, Bootstrap, JavaScript, AngularJS, Node.js, VBA, Linux, PL/SQL

ML Frameworks: scikit-learn, EDA, A/B testing, model deployment, monitoring, and optimization.

Cloud Platforms: Amazon Web Services (AWS), Azure, Google Cloud

ETL Tools: Informatica, Alteryx

Databases & Servers: BigQuery, Redshift, Vertica, MS SQL Server, Azure SQL, Oracle, DB2, MySQL, PostgreSQL

IDE/BI Tools: Tableau, Power BI, Alteryx, Snowflake, AWS Athena, Visual Studio, SQL (MySQL, PostgreSQL, SQLite), Adobe Analytics, Redshift, JIRA, Airtable

Microsoft Tools: MS SQL Server, Advanced Microsoft Excel functions, MS Visual Studio, MS Access, MS Word, MS PowerPoint, Microsoft SharePoint, Microsoft Power Automate

EDUCATION and CERTIFICATIONS:

Bachelor of Technology in Computer Science & Engineering, Amrita Vishwa Vidyapeetham, Bengaluru – 2018

Master’s in Management Information Systems, University of Illinois at Chicago – Jan 2021 – Dec 2022

Certified AWS Cloud Practitioner

Certified Tableau Desktop Specialist

PROFESSIONAL EXPERIENCE:

Client – United Health Group Oct 2023 – Present

Business Intelligence Developer Responsibilities:

Identified and rectified bottlenecks in claims processing workflows through detailed analysis and optimization strategies. This initiative led to a notable 15% increase in processing efficiency, streamlining operations and reducing turnaround times.

Developed comprehensive Tableau dashboards and reports tailored for claims processing optimization. Using advanced analytics and visualization techniques, these tools provided actionable insights, resulting in a significant 30% improvement in claims resolution accuracy. The intuitive presentation of data facilitated informed decision-making and enhanced overall performance.

Leveraged expertise in backend SQL scripting to streamline healthcare insurance data management processes. Utilizing SQL, extracted, transformed, and preprocessed data from complex relational databases, ensuring data integrity and consistency throughout the processing pipeline.

Designed and implemented optimized SQL queries and stored procedures to retrieve relevant data for dashboard and report development. By fine-tuning database interactions, ensured efficient data retrieval and minimized processing overhead, contributing to the responsiveness and effectiveness of analytical tools.

Conducted basic research to support data analysis activities.

Identified and resolved data errors through structured analysis.

Developed basic data models to organize data for analysis.

Optimized long running Stored Procedures and Queries for effective data retrieval.

Client – Sling TV Dish Networks, Denver, CO Jan 2023 – Aug 2023

Data Scientist Responsibilities:

Built an interactive dashboard in Tableau that showcases the usage and metrics of our newly launched feature, leveraging advanced calculations to provide actionable insights into user behavior.

Conducted exploratory data analysis on clickstream data using Python and built an interactive dashboard to identify patterns, trends, anomalies, and the flow of usage, leading to actionable insights such as the number of visits to each tab, sign-ups, and the number of playbacks initiated.
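
A pandas sketch of the kind of clickstream rollup described above; the event names and schema here are hypothetical, not the actual Sling TV data model.

```python
import pandas as pd

# Toy clickstream: one row per event; schema is illustrative.
events = pd.DataFrame({
    "user_id": [1, 1, 2, 2, 2, 3, 3],
    "event":   ["tab_visit", "playback_start", "tab_visit", "sign_up",
                "playback_start", "tab_visit", "tab_visit"],
})

# Event counts overall and unique users per event -- the kind of
# rollup behind "visits per tab / sign-ups / playbacks" metrics.
summary = (events.groupby("event")
                 .agg(n_events=("user_id", "size"),
                      n_users=("user_id", "nunique"))
                 .sort_values("n_events", ascending=False))
print(summary)
```

A dashboard layer (Tableau, Power BI) would then sit on top of aggregates like this rather than on raw events.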

Built and presented an interactive dashboard that showcases the number of users who signed up for each package plan and their viewership across the view period, to understand the performance of each package launched.

Engineered a model for comprehensive clickstream analysis, identifying user behavior patterns, trends, and anomalies. Improved anomaly detection accuracy by 10%, providing valuable insights for optimizing the user experience.

Spearheaded the development and maintenance of comprehensive data mappings and transformations, focusing on ensuring data accuracy and integrity across multiple databases and data assets.

Implemented robust data transformation pipelines using ETL (Extract, Transform, Load) tools and scripting languages such as Python and SQL, enabling seamless data integration and alignment with business objectives.

Through this integrated approach to data management, achieved a notable 15% improvement in data consistency and reliability, enhancing the overall quality and usability of data assets for analytics and reporting.

Delivered an 8% improvement in click-through rates and a 5% increase in user satisfaction, demonstrating proficiency in Python for data-driven experimentation and analysis.

Utilized Tableau to develop and maintain dynamic dashboards, showcasing key metrics such as the number of returning customers and top assets being watched, facilitating insights-driven decision-making for retention strategies.

Developed a dashboard in Adobe Analytics to showcase the number of returning customers and their top-viewed assets, facilitating the understanding and formulation of a retention strategy.

Implemented a predictive model within Adobe Analytics, leading to a 2% increase in adoption of the Picture in Picture Mode feature within three months of deployment, aligning marketing efforts with user preferences.

Enhanced the clickstream analysis model in Adobe Analytics, resulting in a 5% improvement in anomaly detection accuracy, enabling prompt issue resolution and optimization of the user experience.

Integrated machine learning algorithms into interactive Adobe Analytics dashboards, enabling real-time updates and improving insight accuracy by 10%, thereby enhancing decision-making processes.

Presented modeling results in Adobe Analytics in an accessible manner, contributing to a 12% increase in cross-functional collaboration and understanding of data-driven insights among team members.

Utilized Adobe Analytics alongside SQL and AWS for ad hoc analysis to segment customers based on device and behavior, producing insightful Power BI dashboards for visualization by the marketing team.

Collaborated with the programming team to establish a Tableau-based A/B testing reporting suite within Adobe Analytics, enabling measurement of paid subscription impact on testing channels.

Presented insights and findings from ad-hoc analysis to the team, contributing to data-driven decision-making.

Created ad-hoc reports as requested, in a timely and accurate manner, to provide actionable data to senior management and above, both internal and external to the department.

Leveraged Machine Learning techniques to perform exploratory data analysis on clickstream data from AWS Athena. Uncovered usage patterns, trends, and anomalies, enhancing understanding of user behavior and flow to facilitate understanding of tab-visits, sign-ups, and playback initiations.

Employed Macros to automate reporting processes, reducing report generation time by 30% and enhancing the efficiency of data analysis.

Applied programming languages such as SQL and VBA to extract, transform, and analyze data from relational databases, resulting in the optimization of data workflows and a marked enhancement in data accuracy.

Expertly managed ETL processes, scripting complex SQL queries for ad-hoc data retrieval and implemented data cleaning procedures to ensure data quality.

Pre-aggregated dashboard calculations at SQL layer to boost the dashboard performance by more than 40% supporting 4k+ users across the world.

Led cross-functional collaboration with development teams to implement a Tableau-based A/B testing reporting system, quantifying the impact of paid subscriptions on testing channels.

Cleaned and validated data using SQL.

Assisted with basic data analysis tasks such as descriptive statistics and data profiling.

Client – United Health Group, Minneapolis, MN Aug 2021 – Dec 2022

Data Analyst Responsibilities:

Assisted the Lead Business Analyst in documenting User Stories, Business Requirement Document, Technical requirements & Non-Functional requirements.

Spearheaded exploratory data analysis (EDA) using Python to analyze healthcare insurance datasets, aiming to uncover insights into insurance demographics, coverage amounts, and prevalent ailments.

Utilized statistical techniques and data visualization libraries such as pandas, NumPy, matplotlib, and seaborn to identify trends, patterns, and anomalies within the datasets. Conducted in-depth analysis to extract actionable insights and inform decision-making processes.

Conducted analysis to determine the total number of insurance policies taken within the dataset. Utilized pandas to aggregate and count unique insurance policies, providing insights into the overall volume of insurance coverage.
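
A sketch of that unique-policy count with pandas, on made-up data; the policy_id column and the de-duplication rule are assumptions for illustration.

```python
import pandas as pd

# Hypothetical slice of a policies table; IDs repeat when a policy
# appears on multiple rows (e.g. one row per rider or claim).
df = pd.DataFrame({
    "policy_id":  ["P001", "P001", "P002", "P003", "P003", "P004"],
    "holder_age": [34, 34, 51, 28, 28, 65],
})

# nunique() counts each policy once regardless of how many rows it spans.
total_policies = df["policy_id"].nunique()
print(total_policies)
```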

Demonstrated proficiency in data manipulation and transformation using SAS procedures to ensure data quality and consistency before analysis.

Employed pandas to segment customers by age group and analyze their insurance coverage. Utilized matplotlib and seaborn to visualize the distribution of insured individuals across different age brackets, facilitating insights into the demographics of insured individuals.
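
The age-bracket segmentation can be sketched with pd.cut; the ages and bracket edges below are illustrative, and the resulting counts are what a bar or pie chart of insured individuals by age group would be built from.

```python
import pandas as pd

ages = pd.DataFrame({"holder_age": [22, 34, 41, 55, 67, 29, 48, 73]})

# Bucket policyholders into age brackets, then count per bracket.
brackets = pd.cut(ages["holder_age"],
                  bins=[18, 30, 45, 60, 100],
                  labels=["18-30", "31-45", "46-60", "60+"])
by_bracket = ages.groupby(brackets, observed=False).size()
print(by_bracket)
```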

Leveraged pandas to categorize insured individuals based on the ailments they were dealing with. Employed Seaborn to create categorical plots and analyze the prevalence of different ailments among insured individuals, identifying common health issues and their impact on insurance coverage.

Developed a comprehensive dashboard to analyze insurance demographics and recommend tailored insurance plans and marketing strategies targeting specific age demographics based on their insurance needs and preferences.

Utilized matplotlib and seaborn to create visualizations illustrating the distribution of insured individuals across different age groups.

Included pie charts or bar graphs to display the proportion of insured individuals within each age bracket, providing a clear understanding of the demographic composition.

The dashboard facilitated data-driven decision-making processes, enabling stakeholders to customize insurance plans and marketing strategies based on actionable insights derived from demographic analysis.

Resulted in improved customer engagement, enhanced satisfaction, and increased sales conversion rates by aligning insurance offerings with the specific needs and preferences of different age demographics.

Spearheaded initiatives to improve data accuracy by analyzing historical integration patterns and identifying areas for enhancement.

Implemented data cleansing and enrichment strategies, resulting in a measurable increase in the accuracy and completeness of integrated healthcare datasets.

Collaborated closely with business analysts and stakeholders to gain a comprehensive understanding of data integration requirements, ensuring alignment with organizational goals and compliance with industry standards.

Implemented stored procedures for automated data validation, significantly reducing manual intervention and improving data accuracy while transitioning from legacy systems to a modernized data integration platform.

Designed and developed Power BI dashboards that effectively communicated complex data sets, ensuring alignment with business objectives.

Utilized PostgreSQL for data extraction, transformation, and loading, ensuring the availability of diverse data sources for Tableau analytics.

Integrated Tableau with AWS RDS, facilitating efficient and scalable access to cloud-hosted databases for enhanced data analysis.

Demonstrated proficiency in testing and validating dashboards, including Tableau, ensuring the seamless integration of analytics insights for end-users.

Leveraged SQL and Python to develop and execute comprehensive test cases, ensuring the integrity of data pipelines and analytics workflows, resulting in a 20% improvement in overall product reliability.

Conducted thorough testing of analytics dashboards, including Tableau, identifying and addressing issues to improve overall user satisfaction.

Engaged with a culturally diverse and easygoing team, fostering a collaborative environment that positively impacted the quality of deliverables.

Created and maintained Tableau reports, visualizations, charts, and dashboards to convey insights effectively.

Performed data validation, comparing data from different tables, and resolved data discrepancies.

Utilized Jira to receive, track, and prioritize request tickets from business users, ensuring timely resolution.

Collaborated with cross-functional teams to understand data requirements and deliver solutions that met business objectives.

Worked with SQL Server databases to extract and transform data for reporting and analysis.

Demonstrated strong communication skills in explaining technical solutions to non-technical stakeholders.

Analyzed, investigated, and recommended efficient strategies for data shrinking and archiving in large databases, reducing storage costs.

Assisted other developers in resolving technical issues by providing expert guidance and training, promoting knowledge sharing and collaboration.

Created and updated interactive dashboards and reports using Tableau, providing stakeholders with real-time data visualizations and feature progress.

Client – University of Illinois, Chicago Jan 2021 – May 2021

Graduate Assistant Data Analyst Responsibilities:

Orchestrated Power Automate workflows to automate the generation and distribution of offer letters for Graduate Assistant roles.

Collaborated with faculty members to develop comprehensive data flow diagrams aimed at visualizing the intricate flow of information within complex systems.

These diagrams served as invaluable tools for enhancing communication and fostering a deeper understanding of data processes among interdisciplinary teams.

Provided hands-on guidance and support to students in understanding and interpreting data flow diagrams, empowering them to grasp the intricacies of complex systems and methodologies employed in academic research.

Leveraged backend SQL scripting to seamlessly integrate student information, ensuring accuracy and efficiency in the process.

Utilized Tableau and SQL to design interactive dashboards and reports for monitoring real-time ticketing traffic.

Implemented backend SQL scripting to extract and analyze ticketing data, tracking the volume of incoming tickets and their resolution status. This initiative facilitated proactive decision-making and resource allocation, resulting in optimized ticket resolution processes.

Played a pivotal role in crafting visualizations and dashboards using Tableau and SQL, effectively communicating data-driven insights to stakeholders. Received commendable feedback from faculty members, contributing to a notable 25% surge in dashboard utilization. These efforts fostered a culture of data-driven decision-making within the department.

Executed surveys and data collection activities for academic research projects, adhering to ethical standards and employing SQL for efficient data management. Achieved a noteworthy 20% increase in survey response rates through streamlined SQL-backed processes, ensuring timely and accurate data collection for research initiatives.

Delivered comprehensive training sessions on data analysis techniques and tools, with a focus on SQL, to faculty and fellow graduate students. This initiative resulted in a remarkable 30% improvement in data literacy within the department, empowering stakeholders to leverage data effectively for informed decision-making.

Client – FedEx, Pune/Hyderabad, IN Jan 2020 – Dec 2020

Data Analyst Responsibilities:

Collaborated with warehouse management team to identify key performance indicators (KPIs) and data sources for dashboard development.

Designed and implemented Power BI reports and dashboards to track warehouse metrics including inventory levels, order fulfillment rates, and shipping accuracy.

Utilized Power Query to extract, transform, and load (ETL) data from multiple sources including ERP systems, warehouse management systems (WMS), and Excel spreadsheets.

Implemented data refresh schedules to ensure dashboards provided up-to-date information to stakeholders.

Conducted regular performance tuning and optimization of Power BI reports for enhanced responsiveness and user experience.

Improved delivery time accuracy by 15% through real-time monitoring and analysis of route performance using Power BI dashboards.

Implemented SQL queries and stored procedures to aggregate and summarize delivery data, calculate distances, and identify deviations from optimal routes.

Integrated SQL Server Integration Services (SSIS) packages to automate data extraction, transformation, and loading (ETL) processes from multiple data sources into Power BI.

Provided training and support to warehouse staff on accessing and interpreting Power BI dashboards to facilitate data-driven decision-making at all levels.

Enhanced order fulfillment process by identifying bottlenecks and inefficiencies through Power BI dashboards, resulting in a 15% increase in order processing speed.

Improved shipping accuracy by 20% through the identification of common errors and proactive corrective actions based on insights from Power BI reports.

Extracted and transformed historical data from various sources including ERP systems, inventory databases, and transaction logs.

Led a Route Optimization Analysis initiative, leveraging Power BI for comprehensive visualization of delivery routes and pinpointing inefficiencies. Employed backend SQL scripting to extract, preprocess, and analyze transportation data, resulting in a notable 20% decrease in transportation costs.

Cleaned and standardized data to ensure consistency and accuracy for analysis purposes.

Conducted exploratory data analysis (EDA) using statistical techniques and data visualization tools such as Python (pandas, matplotlib) and R (ggplot2) to identify trends, seasonality, and anomalies in inventory levels and transactional data.

Applied time series analysis and forecasting models to predict future inventory demand and consumption patterns.
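
One simple forecasting approach in this family is single exponential smoothing; the sketch below is a generic implementation with made-up monthly demand figures and an arbitrary smoothing factor, not the model actually deployed.

```python
# Minimal single-exponential-smoothing forecast for monthly demand.
def ses_forecast(series, alpha=0.3):
    """Return the one-step-ahead forecast after smoothing the whole series.

    alpha close to 1 tracks recent demand; close to 0 averages history.
    """
    level = series[0]
    for value in series[1:]:
        level = alpha * value + (1 - alpha) * level
    return level

demand = [100, 102, 98, 105, 110, 108]  # illustrative units per month
print(round(ses_forecast(demand, alpha=0.5), 2))
```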

Leveraged clustering and segmentation techniques to classify inventory items based on demand variability, lead times, and criticality.
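
Classifying inventory items this way is often done with an ABC analysis on consumption value; the sketch below uses invented SKUs and values, and the 70%/90% cumulative cut-offs are a common convention assumed here, not figures from the project.

```python
# ABC-style segmentation: rank SKUs by annual consumption value and
# classify by cumulative share (A = top ~70%, B = next ~20%, C = rest).
items = {"SKU1": 50000, "SKU2": 20000, "SKU3": 15000,
         "SKU4": 8000, "SKU5": 5000, "SKU6": 2000}

total = sum(items.values())
classes, cumulative = {}, 0.0
for sku, value in sorted(items.items(), key=lambda kv: kv[1], reverse=True):
    cumulative += value / total
    classes[sku] = "A" if cumulative <= 0.7 else ("B" if cumulative <= 0.9 else "C")
print(classes)
```

A-class items would then get the tightest safety-stock and review policies.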

Collaborated with cross-functional teams including supply chain, procurement, and sales to incorporate business insights into inventory management strategies.

Developed actionable recommendations for optimizing inventory levels, safety stock, and reorder points based on data-driven insights and business objectives.

Implemented performance monitoring and tracking mechanisms to evaluate the effectiveness of inventory management strategies over time.

Reduced inventory holding costs by 15% through the implementation of dynamic inventory replenishment policies based on demand forecasts and lead time variability.

Minimized stockouts and backorders by 25% through the proactive identification of demand patterns and adjustment of safety stock levels accordingly.

Streamlined procurement processes by optimizing order quantities and supplier lead times based on historical consumption data and demand projections.

Enhanced customer satisfaction by ensuring product availability and on-time delivery through proactive inventory management practices.

Client – Mu Sigma, Bengaluru, IN Aug 2018 – Dec 2019

Business Solutions Decision Scientist (Intern/Part-time role, later converted to Full-time) Responsibilities:

Conducted data wrangling and cleaning using tools such as Excel, Python, and R on the client’s data containing 25,000+ records.

Applied NLP techniques with Python (BeautifulSoup and Selenium) to scrape data from four websites, amassing a database of 9,000+ patient records. Analyzed post-procedure issues and built a word cloud of healthcare symptoms using scraped data, facilitating insights into patient experiences.
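
A scraping-and-counting sketch in the same spirit, parsing a static HTML snippet with BeautifulSoup instead of a live site (a real run would fetch pages via requests or Selenium); the markup and symptom terms are invented.

```python
from collections import Counter
from bs4 import BeautifulSoup

# Static HTML stands in for a scraped patient-review page.
html = """
<ul class="reviews">
  <li>Severe headache and nausea after the procedure</li>
  <li>Mild headache, resolved in two days</li>
  <li>Persistent fatigue and nausea</li>
</ul>
"""
soup = BeautifulSoup(html, "html.parser")
words = Counter()
for review in soup.select("ul.reviews li"):
    words.update(w.strip(".,").lower() for w in review.get_text().split())

# Term frequencies like these are what a symptom word cloud is built from.
print(words["headache"], words["nausea"], words["fatigue"])
```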

Created and maintained a Tableau dashboard highlighting regional performance, customer counts, and other metrics for the client, lifting their ROI by 4 basis points and helping them prioritize office regions and operating costs.

Scraped data using the Python packages BeautifulSoup and Selenium (for rendering dynamically loaded content) from four websites and collated a database of more than 9,000 patients and their issues to identify the key issues they faced.

Orchestrated a data migration project involving the extraction of patient records from disparate sources, transformation to a standardized format, and loading into a new Electronic Health Record (EHR) system.

Demonstrated expertise in medical coding by leveraging knowledge of CPT, ICD-10, and HCPCS coding systems to analyze patient records and identify key healthcare issues.

Utilized CPT, ICD-10, and HCPCS codes to categorize and group patient data, enabling deeper analysis of medical procedures, diagnoses, and service levels.

Produced databases, queries, reports for analyzing, summarizing, and imputation of missing data using Excel, R and Alteryx. Implemented ETL process in Alteryx to extract data from multiple sources (SQL Server, Excel, CSV) and schedule workflows.
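
Median imputation is one simple missing-data strategy of the kind described; the sketch below applies it with pandas on a toy frame, and the column names are hypothetical.

```python
import numpy as np
import pandas as pd

# Toy frame with gaps; per-column median imputation is comparable to
# what Excel/R/Alteryx imputation workflows often apply.
df = pd.DataFrame({
    "age":    [34, np.nan, 51, 28, np.nan],
    "charge": [120.0, 95.0, np.nan, 80.0, 110.0],
})

# median() skips NaN by default; fillna substitutes each column's median.
imputed = df.fillna(df.median(numeric_only=True))
print(imputed)
```

More careful pipelines would impute within groups (e.g. by ailment) or flag imputed cells for downstream analysis.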

Conducted exploratory data analysis (EDA) in Python to uncover patterns, trends, and outliers in datasets. Applied sentiment analysis techniques to extract valuable insights from textual data, aiding in understanding customer sentiments and feedback.

Analyzed source data to identify data quality issues and developed strategies to address them, resulting in a 25% reduction in data errors and inconsistencies.

Developed and maintained ETL processes to support data warehouse and reporting requirements, resulting in a 30% improvement in data processing time and a 10% increase in data throughput.

Collaborated effectively with international teams to develop new survey concepts and designs.

Created documentation/training manual for new closing procedure for consolidated statements.

Collaborated with cross-functional teams to define project objectives, scope, and deliverables, ensuring alignment with business goals.

Designed, developed, assessed, and maintained Tableau functional reports based on user requirements.

Assisted in designing and management of SQL database schemas and tables.


