
Data Analyst Power Bi

Location:
Cleveland, OH
Posted:
October 01, 2024


Resume:

SRAVYA HANUMANULA

Location: OH Phone: 216-***-**** Email: ******.*******@*****.*** LinkedIn

SUMMARY

Experienced Data Analyst with over 3 years of expertise in structured and unstructured data, encompassing data mining, acquisition, validation, predictive modeling, and visualization. Proficient in Python libraries such as Pandas, NumPy, Seaborn, and Matplotlib for data cleaning, visualization, and model construction. Skilled in SQL for temporal data analysis and trend identification, and in building impactful visualizations with Tableau, Power BI, and SSRS. Expertise extends to ETL processes via SSIS, along with strong database management across MySQL, MongoDB, PostgreSQL, and MS SQL Server. Proven track record of optimizing data processing, developing dashboards, and improving operational efficiency. Holds a Master's in Computer Information Science and the AWS Certified Solutions Architect - Associate certification.

TECHNICAL SKILLS

Programming Languages

Python, SQL, R

Libraries

Pandas, NumPy, Matplotlib, Seaborn, MapReduce

Database

MySQL, MongoDB, PostgreSQL, MS SQL Server

Methodologies

Agile (SCRUM)

Analytical Skills

Data Mining, Cleansing, Statistical Analysis, Visualization, Text Mining, ETL, SSIS/SSRS

Visualization Tools

Tableau, Power BI, MS Excel

Cloud Technologies

AWS (EC2, S3, Lambda, Quicksight), Snowflake

EDUCATION

Master of Science in Computer Information Science 05/2023

Cleveland State University, Ohio

Bachelor's in Computer Science & Engineering 05/2021

MRCET, India

CERTIFICATIONS

AWS Certified Solutions Architect - Associate (SAA-C03)

PROFESSIONAL EXPERIENCE

Junior Data Engineer Mar 2024 - Present

Molina Healthcare, OH, USA

Conceptualized and executed advanced data analysis projects using Python and SQL to extract insights from complex datasets, contributing to strategic decision-making processes.

Extracted and analyzed data from the data warehouse server using complex SQL statements, supporting report building and decision-making processes.

Partnered with healthcare providers to streamline Medicare and Medicaid reimbursement processes.

Managed MySQL databases, optimizing performance and ensuring data integrity for web-based applications and conducted regular backups and maintenance tasks to prevent data loss.

Created dynamic and interactive dashboards and reports in Tableau, employing advanced techniques like context filters, LOD Expressions, and data visualization best practices to deliver insightful analytics.

Integrated AWS Glue to automate data integration and ETL processes, preparing and loading data for analysis.

Utilized NumPy for efficient numerical computing, optimizing array & matrix operations through broadcasting & vectorization.
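The broadcasting-and-vectorization pattern mentioned above can be sketched as follows; the data is purely illustrative, not from any actual dataset:

```python
import numpy as np

# Vectorized column normalization: no Python loops. Broadcasting
# stretches the 1-D arrays of column means/stds across every row.
data = np.array([[1.0, 10.0],
                 [3.0, 20.0],
                 [5.0, 30.0]])

col_means = data.mean(axis=0)                # shape (2,)
col_stds = data.std(axis=0)                  # shape (2,)
normalized = (data - col_means) / col_stds   # (3, 2) op against (2,) arrays
```

After this, each column has mean 0 and standard deviation 1, without ever iterating over rows in Python.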

Processed third-party spending data into manageable deliverables in a specified format using Excel macros and Python libraries such as Matplotlib.

Adopted agile methodology, leading daily scrums and bi-weekly sprint planning and backlog meetings.

Utilized GCP services like BigQuery, Dataflow, and Cloud Storage for data engineering and workflow optimization.

Conducted comprehensive audits & introduced workflow improvements, enhancing efficiency & optimizing reimbursement rates.

Demonstrated proficiency in constructing workflows highlighting the functionality of AWS services including S3, SNS, SQS, and Lambda to achieve efficient system operations.

Designed and deployed AWS Glue jobs for automated data extraction, transformation, and loading (ETL), streamlining data pipelines and ensuring timely availability of insights for stakeholders.

Software Engineer Trainee Sep 2023 - Feb 2024

Epsilon, TX, USA

Employed web frameworks like Django to streamline backend development, leveraging pre-built components for routing, authentication, and database interactions, expediting the development process.

Utilized Scikit-learn's pipeline feature to streamline the process of fitting and transforming data.
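As a minimal sketch of the pipeline feature referenced above (toy data, assuming scikit-learn is installed): a Pipeline chains transformers with a final estimator so a single fit() call applies every step in order.

```python
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

# One fit() scales the features, then trains the classifier.
pipe = Pipeline([
    ("scale", StandardScaler()),
    ("clf", LogisticRegression()),
])

X = [[0.0], [1.0], [2.0], [3.0]]
y = [0, 0, 1, 1]
pipe.fit(X, y)
preds = pipe.predict([[0.0], [3.0]])  # scaling is re-applied automatically
```

The benefit is that the same preprocessing is guaranteed to run at both fit and predict time, avoiding train/serve skew.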

Spearheaded the development and implementation of a data analytics platform using Python and SQL, resulting in a 15% increase in operational efficiency.

Engineered robust RESTful APIs in Python that incorporate advanced features like token-based authentication and rate limiting, resulting in improved application security and reduced dependencies.

Conducted comprehensive data analysis that cut costs by 10% and boosted revenue by 5%.

Implemented MongoDB for NoSQL database solutions, handling large volumes of unstructured data and enabling flexible data modeling for scalable web applications.

Developed robust ETL processes using SQL Server Integration Services (SSIS), laying the foundation for sophisticated data warehousing and business intelligence solutions.

Utilized AWS Lambda for serverless computing, enabling the execution of code in response to events without provisioning or managing servers.
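A Lambda function is just a handler invoked once per event; the sketch below uses a hypothetical event shape and is runnable locally with no AWS infrastructure:

```python
import json

def lambda_handler(event, context):
    """Minimal AWS Lambda-style handler: returns a greeting built from
    the triggering event. The 'name' key is a hypothetical example."""
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"hello, {name}"}),
    }

# Locally, the handler can be exercised by calling it directly:
response = lambda_handler({"name": "analyst"}, None)
```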

Collaborated with cross-functional teams to design and develop data models and dashboards in Power BI, enhancing data visualization and reporting capabilities.

Utilized advanced statistical techniques to identify trends and patterns in large datasets, providing actionable insights.

Employed Jupyter Notebook for Python development and data analysis tasks, including writing code, exploring datasets, and creating visualizations.

Data Analyst Oct 2019 - Dec 2021

Dell Technologies, India

Utilized the Python MySQL connector and MySQLdb package to execute a wide range of MySQL database queries from Python, supporting analytical tasks and report generation.

Integrated MongoDB with data pipelines and analytics tools to support real-time data processing and analysis.

Formulated and optimized complex SQL queries with nested subqueries, aggregation functions, and window functions to enhance the efficiency of data retrieval and processing on large datasets, ensuring data quality and consistency.
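The window-function pattern described above can be sketched with the stdlib sqlite3 module and a toy schema (illustrative only; SQLite supports window functions from version 3.25):

```python
import sqlite3

# Rank each order within its region and compute a per-region total,
# both via window functions rather than self-joins or subqueries.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (region TEXT, amount REAL);
    INSERT INTO orders VALUES
        ('east', 100), ('east', 300), ('west', 200), ('west', 50);
""")

rows = conn.execute("""
    SELECT region, amount,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk,
           SUM(amount)  OVER (PARTITION BY region) AS region_total
    FROM orders
    ORDER BY region, rnk
""").fetchall()
conn.close()
```

Each row carries its rank within its partition and the partition total, without collapsing the detail rows the way GROUP BY would.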

Prepared reports in MS Excel using VLOOKUP, HLOOKUP, pivot tables, macros, and data points.

Designed and deployed interactive reports and dashboards using Power BI to visualize key performance indicators (KPIs) and metrics for various business units.

Integrated Power BI with SQL Server, Excel, and cloud services for real-time analytics.

Employed DAX (Data Analysis Expressions) to create complex measures and calculated columns in Power BI.

Utilized Pandas and NumPy for efficient data manipulation, ensuring accuracy and consistency.

Applied data pre-processing using Scikit-Learn, including imputation for missing values, scaling, logarithmic transformation, and one-hot encoding.
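Two of the pre-processing steps named above, sketched with scikit-learn on toy data (assuming scikit-learn is installed; column values are illustrative):

```python
import numpy as np
from sklearn.impute import SimpleImputer
from sklearn.preprocessing import OneHotEncoder

# Impute a missing numeric value with the column median.
nums = np.array([[1.0], [np.nan], [3.0]])
imputed = SimpleImputer(strategy="median").fit_transform(nums)

# One-hot encode a categorical column into indicator columns
# (categories are sorted, so columns come out as [blue, red]).
cats = np.array([["red"], ["blue"], ["red"]])
encoded = OneHotEncoder().fit_transform(cats).toarray()
```

Logarithmic transformation and scaling follow the same fit/transform idiom (e.g. np.log1p or StandardScaler), so all steps compose cleanly into a pipeline.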

Utilized Azure Data Factory for ETL, automating data integration and transformation.

Leveraged Azure Blob Storage for scalable and cost-effective object storage, storing large volumes of unstructured data.

Conducted statistical analysis on datasets using various statistical methods and tools to derive meaningful insights and make data-driven decisions.

Implemented text mining techniques to extract valuable insights from unstructured text data, including sentiment analysis, topic modeling, and entity recognition.
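The simplest form of the sentiment analysis mentioned above is lexicon-based scoring; a toy sketch (the lexicon here is illustrative, not a production resource):

```python
# Toy lexicon-based sentiment scorer: positive hits minus negative hits.
POSITIVE = {"good", "great", "excellent", "love"}
NEGATIVE = {"bad", "poor", "terrible", "hate"}

def sentiment_score(text: str) -> int:
    """Return (#positive - #negative) lexicon matches in the text."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

score = sentiment_score("Great product, but terrible support and bad docs")
```

Real pipelines layer tokenization, negation handling, and trained models on top of this idea, but the lexicon count is the baseline they are measured against.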

Applied Waterfall methodology for projects with clear requirements, ensuring a linear development process.

ACADEMIC PROJECTS

Data Engineer: Multidimensional OLAP Cube using the Adventure Works Data Warehouse

Built a Multidimensional OLAP Cube in Microsoft SQL Server Analysis Services (SSAS) using Visual Studio SSDT for efficient querying and analysis of data.

Analyzed Adventure Works database sales data by product, geography, and promotion type.

Used MDX queries to extract meaningful insights from the cube.

Building an ETL Pipeline for Analyzing Public Weather Data using AWS

Designed and developed the ETL pipeline using AWS Glue crawlers, transformers, and jobs.

Configured data extraction logic to automatically discover and schedule retrieval of new weather data files from the chosen public source (e.g., NOAA NCEI).

Configured Glue jobs to orchestrate the entire ETL process, including scheduling, data extraction, transformation, and loading of the transformed weather data into Amazon Redshift tables.

Engineering Assistant: Utilities Data Management

Responsible for implementing a streamlined process for invoice stamping, reducing manual effort by 70%. Received commendation for exceptional organizational skills and attention to detail. Improved data management practices, leading to increased efficiency in storing, accessing, and retrieving information.


