Data Analyst

Location: Charlotte, NC
Salary: 80-90k
Posted: February 13, 2024

Resume:

Location: Overland Park, KS Email: ad3lxv@r.postjobfree.com Phone: 913-***-****

SUMMARY:

Around 4 years of in-depth experience in Data Analysis, primarily in the design, development, and implementation of Data Warehouse applications. Specialized in data modeling, ETL processes, and data visualization, with a strong emphasis on data-driven decision-making and business intelligence.

Adept at analyzing data with Python's libraries, including Matplotlib for data visualization, NumPy for numerical computation, and Pandas for data manipulation, and at using R's statistical features for in-depth modeling and analysis. Skilled in using Java to create data-driven applications that support dynamic and interactive data solutions.

Proficient in Microsoft 365 applications, using Excel for sophisticated data processing and Power BI for building dynamic dashboards and reports.

Comprehensive understanding of SQL, including MySQL, for retrieving meaningful insights from relational databases; also proficient with NoSQL databases, particularly MongoDB, enabling effective data extraction, transformation, and analysis across a variety of data environments.

Skilled in designing and executing complex data pipelines, ensuring smooth integration of diverse data sources for thorough analysis. Knowledgeable about project management methodologies, including Scrum, Agile, and the Waterfall SDLC, bringing strategic planning and flexibility to collaborative data projects.

TECHNICAL SKILLS:

Languages: SQL, Python & R.

Visualization tools: Tableau, Power BI.

Databases: MySQL, MongoDB, SQL Server.

Python Libraries/Packages: NumPy, Matplotlib, Scikit-learn, PyTorch.

Cloud Technologies: AWS (RDS, EC2, S3).

Methodologies: Agile (Scrum), SDLC & Waterfall.

Data Analytics Techniques: Data Warehousing, Data Cleaning, Data Manipulation, and Data Visualization.

WORK EXPERIENCES:

Henry Schein, KS Jun 2023 - Current

Data Analyst

Implemented classification models using scikit-learn, achieving an 85% accuracy rate, which improved decision-making processes in the healthcare sector.
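
For illustration, a minimal sketch of a scikit-learn classification workflow of this kind; the dataset path, column names, and model choice are hypothetical and not taken from the project:

import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("claims.csv")                    # hypothetical healthcare dataset
X = df.drop(columns=["readmitted"])               # hypothetical numeric feature columns
y = df["readmitted"]                              # hypothetical binary target

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))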

Leveraged Python to conduct advanced data analysis, graphical data exploration, and outlier detection, contributing pivotal insights for strategic decision-making.

Led the development of federal reporting standards, emphasizing Medicare Advantage data analysis. Played a critical role in managing data and supporting the transition from traditional software development to Lean/Agile and Scrum approaches, thereby improving data analysis workflows for increased productivity.

Managed and optimized AWS EC2 instances, enhancing infrastructure efficiency and performance by 20%, resulting in more robust and scalable application deployment.

Engineered dynamic data tools in Microsoft Access and crafted visually compelling presentations in Tableau; delivered data visualizations through PowerPoint integrated with Tableau to communicate key insights effectively.

Applied knowledge of SQL, DML, and DDL to write complex queries and test scripts for data validation against relational databases, including data warehouse systems.

Applied text analytics, statistical machine learning, and data mining to a range of business problems, and generated data visualizations using R and Python.

Presented QA test results and analysis to stakeholders and users, and documented modifications and requirements.

Uber, MO Jan 2023 - May 2023

Data Analyst Intern

Streamlined data cleaning and preprocessing with Pandas and NumPy, reducing data processing time by 20% in finance projects.
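
A rough sketch of this kind of Pandas/NumPy cleaning and preprocessing pass; the file name and columns are assumed for illustration:

import numpy as np
import pandas as pd

df = pd.read_csv("transactions.csv", parse_dates=["posted_date"])    # hypothetical finance data
df = df.drop_duplicates()                                            # remove exact duplicates
df["amount"] = pd.to_numeric(df["amount"], errors="coerce")          # coerce bad values to NaN
df["amount"] = df["amount"].fillna(df["amount"].median())            # impute missing amounts
df["log_amount"] = np.log1p(df["amount"].clip(lower=0))              # tame a skewed distribution
df = df[df["posted_date"].notna()]                                   # keep rows with valid dates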

Administered AWS RDS for relational database solutions, achieving a 30% improvement in database performance and reliability, while ensuring streamlined data management.

Utilized MS SQL Server and MySQL platforms for effective data storage, transformation, and querying, streamlining data analysis and reporting processes.

Created data quality scripts in SQL to validate successful data loads and data quality, and created various data visualizations using Python and Power BI.

Used Power BI to create dynamic, visually engaging data visualizations that improved stakeholders' ability to understand complex information.

Enhanced forecast accuracy by 32% through analytics in R; developed statistical machine learning solutions for diverse business challenges and generated data visualizations in R to support informed decision-making.

Designed A/B tests and defined metrics to validate new user-interface features, calculating sample sizes and assessing statistical assumptions to ensure reliable test outcomes.
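
A back-of-the-envelope sample-size calculation for a two-proportion A/B test of the kind described above; the baseline and target rates are hypothetical:

from scipy.stats import norm

def sample_size_per_arm(p1, p2, alpha=0.05, power=0.80):
    # n per variant for a two-sided z-test detecting a shift from p1 to p2
    z_alpha = norm.ppf(1 - alpha / 2)
    z_beta = norm.ppf(power)
    pooled_var = p1 * (1 - p1) + p2 * (1 - p2)
    return int(((z_alpha + z_beta) ** 2 * pooled_var) / (p1 - p2) ** 2) + 1

print(sample_size_per_arm(0.10, 0.12))   # e.g. baseline 10% conversion, target 12%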

Adani, India Feb 2019 - Nov 2021

Data Analyst

Created reporting dashboards and conducted data mining and product analytics to gain insights into customer purchase behavior, resulting in a 15% increase in sales.

Developed real-time dashboards in Tableau to visualize and monitor key metrics, leading to a 20% improvement in data-driven decision-making.

Leveraged AWS S3 for effective data storage and retrieval, enhancing data accessibility and backup solutions by 25%, leading to more efficient data management practices.

Performed data cleaning and processing for third-party spending data, utilizing Excel macros and Python libraries (NumPy, Pandas, and Matplotlib) to ensure data accuracy and efficiency, reducing processing time by 25%.

Executed data cleansing and staging of operational sources using ETL processes, resulting in improved data quality and streamlined data pipelines, reducing data errors by 30%.

Designed MySQL table schemas and implemented stored procedures for customer purchase and session data, leading to a 20% improvement in data retrieval speed.

Conducted exploratory data analysis using Matplotlib and Seaborn, resulting in the identification of key insights and actionable recommendations that improved click-through rates by 15%.
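
A minimal sketch of an exploratory pass with Matplotlib and Seaborn; the dataset and column names are assumptions for illustration:

import matplotlib.pyplot as plt
import pandas as pd
import seaborn as sns

df = pd.read_csv("sessions.csv")                          # hypothetical clickstream data

fig, axes = plt.subplots(1, 2, figsize=(10, 4))
sns.histplot(df["session_length"], bins=30, ax=axes[0])   # distribution of a key metric
sns.heatmap(df.corr(numeric_only=True), cmap="coolwarm", ax=axes[1])  # pairwise correlations
plt.tight_layout()
plt.show()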

Applied advanced statistical analysis in R for hypothesis testing and regression analysis, leading to a 10% increase in click-through rates and a 12% boost in sales.

Managed NoSQL databases such as MongoDB, leveraging flexible schema design and efficient retrieval to handle large volumes of structured and unstructured data, improving data storage and retrieval efficiency by 25%.
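
A minimal PyMongo sketch of this flexible-schema storage and retrieval pattern; the connection string, database, and fields are hypothetical and assume a locally running MongoDB instance:

from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")         # assumes a local MongoDB instance
events = client["analytics"]["purchase_events"]

events.insert_many([
    {"user_id": 1, "sku": "A-100", "amount": 25.0, "tags": ["promo"]},
    {"user_id": 2, "sku": "B-220", "amount": 60.0},       # schema can vary per document
])
events.create_index("user_id")                            # speed up frequent lookups
for doc in events.find({"amount": {"$gt": 30}}):
    print(doc)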

Designed and implemented interactive dashboards and storyboards in Tableau to explore credit portfolio data, providing valuable insights to management.

Conducted thorough EDA, identifying significant data patterns, and translating them into actionable business rules for informed decision-making.

Developed sophisticated SQL queries with multiple joins, subqueries, and common table expressions to aggregate data, ensuring accuracy in reporting.
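
To illustrate the aggregation pattern (a common table expression joined and rolled up), a small self-contained example run from Python against an in-memory SQLite database; the table and column names are hypothetical:

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders    (order_id INTEGER, customer_id INTEGER, amount REAL);
    CREATE TABLE customers (customer_id INTEGER, region TEXT);
    INSERT INTO orders    VALUES (1, 10, 120.0), (2, 10, 80.0), (3, 11, 50.0);
    INSERT INTO customers VALUES (10, 'East'), (11, 'West');
""")

query = """
WITH order_totals AS (
    SELECT customer_id, SUM(amount) AS total_spend
    FROM orders
    GROUP BY customer_id
)
SELECT c.region, SUM(t.total_spend) AS region_spend
FROM order_totals t
JOIN customers c ON c.customer_id = t.customer_id
GROUP BY c.region;
"""
for row in conn.execute(query):
    print(row)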

EDUCATION:

Master of Science in Computer Science, 2023

University of Central Missouri, Warrensburg, MO

Bachelor of Technology in Computer Science, 2019

Jawaharlal Nehru Technological University (JNTUH), Hyderabad, India

PROJECTS:

Clubhouse Data Visualization

Managed a large dataset by partitioning it into clusters based on each user's follower count.

Determined the number of new users created each day and presented it as a line graph built with the Plotly graphing library.
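
A sketch of how such a daily-new-users line graph could be produced with Plotly; the file name and column are assumptions, not details from the project:

import pandas as pd
import plotly.express as px

users = pd.read_csv("clubhouse_users.csv", parse_dates=["time_created"])   # hypothetical export
daily = (
    users.groupby(users["time_created"].dt.date)
         .size()
         .reset_index(name="new_users")
         .rename(columns={"time_created": "date"})
)
fig = px.line(daily, x="date", y="new_users", title="New users created per day")
fig.show()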

Used the Plotly library to visualize data such as the top users who invited others, followers vs. following counts, and users with active Twitter and Instagram accounts.

Telecom

Predicted Telecom's customer churn using Pandas, Matplotlib, and Seaborn in Python, so the company could target special offers at customers likely to leave.

Identified the top causes of churn and proposed solutions to reduce churn by 10%.

Tools used: Python, Pandas, Matplotlib, Seaborn, Jupyter Notebook, Tableau.

Data Visualization: Power BI Interactive Dashboard for Airline Delays

Worked with an airline dataset covering airline delays, carrier delay, weather delay, and late aircraft delay.

Created various visualization models for data analysis and data insights using R Studio


