RAHUL AKULA

ad2z3t@r.postjobfree.com | 469-***-**** | Irving, TX 75038 | www.linkedin.com/in/rahul-akula-1116

Summary

Seasoned Data Analyst with over * years of hands-on experience specializing in data interpretation, statistical analysis, and deriving actionable insights from complex datasets. Proven track record of leveraging analytical skills to enhance decision-making processes. Proficient in data visualization tools and database management, and adept at uncovering valuable patterns within data. Demonstrated achievements include a 15% increase in reporting accuracy and a 20% improvement in data processing efficiency. Strong communication and collaboration skills contribute to a cohesive, data-driven work environment.

Professional Experience

Fabletics – AWS Data Analyst

El Segundo, CA

06/2023 - Present

Led the adoption of Amazon EMR (Elastic MapReduce) for big data processing, resulting in a 30% improvement in data processing speed and efficiency.

Implemented real-time data processing with Amazon Kinesis, resulting in a 20% reduction in data latency.
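As an illustration of the kind of real-time ingestion described above, a minimal boto3 sketch of a Kinesis consumer follows; the stream name, region, and record schema are assumptions for the example, not values from this role.

```python
# Minimal Kinesis polling loop; stream name, region, and record fields are placeholders.
import json
import time

import boto3

kinesis = boto3.client("kinesis", region_name="us-east-1")

def consume(stream_name: str = "clickstream-events") -> None:
    # Read from the first shard only to keep the sketch short.
    shard_id = kinesis.describe_stream(StreamName=stream_name)["StreamDescription"]["Shards"][0]["ShardId"]
    iterator = kinesis.get_shard_iterator(
        StreamName=stream_name,
        ShardId=shard_id,
        ShardIteratorType="LATEST",
    )["ShardIterator"]
    while True:
        resp = kinesis.get_records(ShardIterator=iterator, Limit=100)
        for record in resp["Records"]:
            event = json.loads(record["Data"])
            # Downstream handling (enrichment, writes to S3/Redshift) would go here.
            print(event)
        iterator = resp["NextShardIterator"]
        time.sleep(1)
```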

Implemented RESTful APIs with AWS Lambda for seamless data interaction and integration, complementing Apache Airflow's orchestration of ETL pipelines with PySpark transformations and improving data workflow efficiency by 20%.
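A hedged sketch of how such an Airflow-orchestrated pipeline with a PySpark transformation step might look; the DAG id, schedule, bucket paths, and column names are invented for illustration.

```python
# Hypothetical Airflow DAG with a PySpark transformation task.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def transform_orders(**context):
    # PySpark transformation step: read raw data, clean it, write curated output.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders_etl").getOrCreate()
    raw = spark.read.json("s3://example-bucket/raw/orders/")
    curated = (
        raw.dropDuplicates(["order_id"])
        .withColumn("order_date", F.to_date("order_ts"))
    )
    curated.write.mode("overwrite").parquet("s3://example-bucket/curated/orders/")

with DAG(
    dag_id="orders_etl",
    start_date=datetime(2023, 6, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="transform_orders", python_callable=transform_orders)
```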

Designed and implemented data workflows in AWS Glue, improving data integration efficiency by 20%.
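A minimal AWS Glue job script in the standard PySpark style, assuming a hypothetical catalog database, table, and S3 output path.

```python
# Sketch of a PySpark-based Glue job; database, table, and output path are placeholders.
import sys

from awsglue.context import GlueContext
from awsglue.dynamicframe import DynamicFrame
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read from the Glue Data Catalog, drop rows missing a key, and land Parquet in S3.
orders = glue_context.create_dynamic_frame.from_catalog(
    database="sales_db", table_name="raw_orders"
)
clean = orders.toDF().dropna(subset=["order_id"])
glue_context.write_dynamic_frame.from_options(
    frame=DynamicFrame.fromDF(clean, glue_context, "clean_orders"),
    connection_type="s3",
    connection_options={"path": "s3://example-bucket/curated/orders/"},
    format="parquet",
)
job.commit()
```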

Enhanced data storage and retrieval, reducing latency by 20% through efficient use of Amazon S3.

Optimized Amazon RDS (Relational Database Service) performance through query tuning and indexing, reducing query response time by 25%.
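An illustrative sketch of query tuning on a PostgreSQL-flavored RDS instance via psycopg2: add a composite index for the slow query's predicate and confirm the plan with EXPLAIN ANALYZE. Host, credentials, and table/column names are placeholders.

```python
# Index tuning sketch; connection details and schema are hypothetical.
import psycopg2

conn = psycopg2.connect(host="example-rds-host", dbname="sales", user="analyst", password="***")
with conn, conn.cursor() as cur:
    # Index the columns used in the slow query's WHERE and ORDER BY clauses.
    cur.execute(
        "CREATE INDEX IF NOT EXISTS idx_orders_customer_date "
        "ON orders (customer_id, order_date DESC);"
    )
    # Confirm the planner now uses an index scan instead of a sequential scan.
    cur.execute(
        "EXPLAIN ANALYZE "
        "SELECT order_id, total FROM orders "
        "WHERE customer_id = %s ORDER BY order_date DESC LIMIT 50;",
        (1234,),
    )
    for row in cur.fetchall():
        print(row[0])
```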

Implemented data warehouse solutions using Amazon Redshift, achieving a 20% improvement in data processing speed.

Leveraged Python for data manipulation, achieving a 25% improvement in coding efficiency.

Integrated machine learning algorithms into data engineering pipelines using Amazon SageMaker, enabling predictive analytics and automated, data-driven decision support.
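A small sketch of wiring a deployed SageMaker endpoint into a pipeline step with boto3; the endpoint name and feature payload are hypothetical.

```python
# Calling a deployed SageMaker endpoint from a pipeline step; names are placeholders.
import json

import boto3

runtime = boto3.client("sagemaker-runtime", region_name="us-east-1")

def score(features: dict) -> dict:
    response = runtime.invoke_endpoint(
        EndpointName="demand-forecast-endpoint",
        ContentType="application/json",
        Body=json.dumps(features),
    )
    return json.loads(response["Body"].read())

print(score({"units_last_week": 120, "price": 39.99, "promo": 1}))
```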

Created impactful Tableau Data Visualizations, enhancing stakeholder comprehension by 20%.

Utilized scikit-learn, the Python machine learning library, for model training, evaluation, and hyperparameter tuning.
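A compact scikit-learn example of the train / tune / evaluate loop on synthetic data; the estimator and parameter grid are illustrative choices.

```python
# Train, tune with cross-validated grid search, then evaluate on a held-out split.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import GridSearchCV, train_test_split

X, y = make_classification(n_samples=2000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

grid = GridSearchCV(
    RandomForestClassifier(random_state=42),
    param_grid={"n_estimators": [100, 300], "max_depth": [None, 10]},
    cv=5,
    scoring="f1",
)
grid.fit(X_train, y_train)

print("Best params:", grid.best_params_)
print(classification_report(y_test, grid.best_estimator_.predict(X_test)))
```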

Managed version control with Git and project tracking with Jira for efficient data engineering collaboration.

Graduate Assistant at SIUE – Data Analyst

Edwardsville, IL

05/2022 - 05/2023

Implemented INDEX and MATCH functions in Excel, improving data retrieval speed by 40% for enhanced responsiveness in data analysis.

Optimized catalog maintenance efficiency by 20% through strategic implementation of filters and sorting in Excel, streamlining updates and data maintenance processes.

Automated data extraction with Python scripts, leveraging Pandas and NumPy, reducing manual effort by 25% and enhancing data processing speed.
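A short sketch of the kind of Pandas/NumPy extraction-and-cleanup script described above; the file path and column names are assumptions.

```python
# Extract a CSV export, standardize fields, and derive columns for downstream reports.
import numpy as np
import pandas as pd

def extract(path: str = "exports/enrollment.csv") -> pd.DataFrame:
    df = pd.read_csv(path, parse_dates=["enrolled_on"])
    # Standardize text fields and replace sentinel values before analysis.
    df["program"] = df["program"].str.strip().str.title()
    df["credits"] = df["credits"].replace({-1: np.nan})
    # Derive a term column used by downstream reports.
    df["term"] = np.where(df["enrolled_on"].dt.month >= 8, "Fall", "Spring")
    return df.dropna(subset=["student_id"])

if __name__ == "__main__":
    print(extract().head())
```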

Reduced data manipulation time by 30% through efficient SAS data step programming.

Increased data cleanliness by 25% through the systematic use of Statistical Analysis functions and procedures.

Implemented data normalization techniques (1NF, 2NF, 3NF) and denormalization strategies, leveraging SQL and PySpark to optimize data querying and improving query performance by 40%.
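A minimal PySpark sketch of splitting a wide table into normalized (3NF-style) entities and rebuilding a denormalized view for reporting; the table and column names are invented.

```python
# Normalize a wide catalog table, then build a denormalized reporting view.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("catalog_normalization").getOrCreate()
wide = spark.read.parquet("s3://example-bucket/raw/course_catalog/")

# 3NF-style split: one row per department and per course.
departments = wide.select("dept_id", "dept_name").dropDuplicates(["dept_id"])
courses = wide.select("course_id", "course_name", "dept_id").dropDuplicates(["course_id"])

departments.createOrReplaceTempView("departments")
courses.createOrReplaceTempView("courses")

# Denormalized view for fast reporting queries.
report = spark.sql(
    """
    SELECT c.course_id, c.course_name, d.dept_name
    FROM courses c
    JOIN departments d ON c.dept_id = d.dept_id
    """
)
report.write.mode("overwrite").parquet("s3://example-bucket/curated/course_report/")
```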

Created dynamic Power BI dashboards, resulting in a 30% improvement in data accessibility and user engagement.

Implemented Key Performance Indicator (KPI) visualizations in Power BI, resulting in a 20% improvement in tracking and understanding critical metrics.

Tata Consultancy Services – Financial Analyst

Hyderabad, TS

12/2018 - 12/2021

Implemented advanced data analysis techniques, optimizing Python and SQL workflows for efficient data acquisition.

Engineered and deployed sophisticated fraud detection algorithms, resulting in a 15% increase in accuracy and strengthened risk management.

Developed and executed predictive models utilizing machine learning, reducing financial losses by 12% through improved issue resolution.

Executed intricate SQL queries to extract and integrate relevant data, improving data retrieval efficiency by 20% for faster claims insights.

Employed Excel's data cleaning functionalities, reducing errors by 15% and ensuring a more reliable foundation for analysis.

Conducted statistical exploratory data analysis, identifying key patterns and outliers, leading to an 8% improvement in claims processing efficiency.
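An illustrative Pandas version of those exploratory checks: summary statistics plus a simple 1.5 × IQR outlier flag; the claims file and columns are hypothetical.

```python
# Summary statistics and IQR-based outlier flagging on a hypothetical claims dataset.
import pandas as pd

claims = pd.read_csv("claims_sample.csv")

print(claims[["claim_amount", "days_to_settle"]].describe())

# Flag claim amounts outside 1.5 * IQR for manual review.
q1, q3 = claims["claim_amount"].quantile([0.25, 0.75])
iqr = q3 - q1
claims["is_outlier"] = ~claims["claim_amount"].between(q1 - 1.5 * iqr, q3 + 1.5 * iqr)
print(claims["is_outlier"].value_counts())
```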

Created interactive Power BI dashboards, improving project comprehension and decision-making across the organization.

Implemented Six Sigma methodologies to streamline claims verification processes, resulting in a 30% reduction in processing time.

Key Projects

Bridge Crack Detection and Segmentation Using UAV Images

Developed a deep learning approach using Python, TensorFlow, a U-Net architecture, and an NVIDIA GeForce GTX 1080 Ti GPU to detect and segment cracks in infrastructure surfaces.
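A compact Keras sketch of a U-Net-style encoder/decoder for binary crack segmentation, assuming 256×256 RGB tiles; the layer sizes are illustrative rather than the project's exact architecture.

```python
# Minimal U-Net-style model: two downsampling blocks, a bottleneck, and skip connections.
import tensorflow as tf
from tensorflow.keras import layers

def build_unet(input_shape=(256, 256, 3)) -> tf.keras.Model:
    inputs = tf.keras.Input(shape=input_shape)

    # Encoder.
    c1 = layers.Conv2D(32, 3, padding="same", activation="relu")(inputs)
    p1 = layers.MaxPooling2D()(c1)
    c2 = layers.Conv2D(64, 3, padding="same", activation="relu")(p1)
    p2 = layers.MaxPooling2D()(c2)

    # Bottleneck.
    b = layers.Conv2D(128, 3, padding="same", activation="relu")(p2)

    # Decoder: upsample and concatenate the encoder skip connections.
    u1 = layers.Concatenate()([layers.UpSampling2D()(b), c2])
    d1 = layers.Conv2D(64, 3, padding="same", activation="relu")(u1)
    u2 = layers.Concatenate()([layers.UpSampling2D()(d1), c1])
    d2 = layers.Conv2D(32, 3, padding="same", activation="relu")(u2)

    outputs = layers.Conv2D(1, 1, activation="sigmoid")(d2)  # per-pixel crack mask
    return tf.keras.Model(inputs, outputs)

model = build_unet()
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```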

Spotify Music Data Analysis Pipeline

Created an ETL pipeline for Spotify music data using the Spotify Web API to extract, transform, and load data into a warehouse. Performed thorough music trend analysis and visualization to inform content recommendations and user engagement strategies.

Oil Pipeline Leakage Detection in the Petroleum Industry

Conducted data gathering, preprocessing, and model training using PCA, label encoding, and normalization. Employed various machine learning models, including Random Forest, SVM, Decision Trees, and Logistic Regression, to optimize accuracy, execution time, and F1 score metrics.
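A hedged sketch of the model-comparison loop described for the leak-detection project: PCA preprocessing followed by several classifiers scored on accuracy, F1, and runtime, here on a synthetic imbalanced dataset standing in for the sensor data.

```python
# Compare classifiers on accuracy, F1, and wall-clock time with a shared PCA pipeline.
import time

from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, f1_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=3000, n_features=25, weights=[0.9, 0.1], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

models = {
    "RandomForest": RandomForestClassifier(random_state=0),
    "SVM": SVC(),
    "DecisionTree": DecisionTreeClassifier(random_state=0),
    "LogisticRegression": LogisticRegression(max_iter=1000),
}

for name, model in models.items():
    pipeline = make_pipeline(StandardScaler(), PCA(n_components=10), model)
    start = time.perf_counter()
    pipeline.fit(X_train, y_train)
    preds = pipeline.predict(X_test)
    elapsed = time.perf_counter() - start
    print(f"{name}: accuracy={accuracy_score(y_test, preds):.3f} "
          f"f1={f1_score(y_test, preds):.3f} time={elapsed:.2f}s")
```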



Education

05/2023
Southern Illinois University Edwardsville, Edwardsville, IL
Master of Science: Information Systems
3.3/4.0 GPA
Honors & Acknowledgements: Dean's Honors List, Engineering Dean's Master Scholarship, Master's Competitive Scholarship

05/2020
Jawaharlal Nehru Technological University, Hyderabad, India
Bachelor's: Information Technology
3.54/4.0 GPA
Gold Medal Recipient
Published a paper in an international journal (IJCRT): https://ijcrt.org/papers/IJCRT2002166.pdf

Skills

Programming languages: Python (NumPy, Scikit-learn, Pandas, TensorFlow, Keras, PyTorch), Java, Linux, C, HTML, CSS, JavaScript
Database tools: MS Access, MySQL, SQL Server, PostgreSQL, MongoDB
Visualization tools: Tableau, Power BI, Python (Matplotlib, Seaborn, Plotly), MS Excel
Environments: GitHub, Google Colab, Anaconda, GCP, Kubernetes, Snowflake, VS Code, Jupyter Notebook, RStudio, Microsoft Azure, Databricks
Key concepts: Machine Learning, CI/CD systems, Artificial Intelligence, NLP, Statistics, Azure Services, PySpark, ETL tools, Operating Systems, Apache Spark, Scala, Data Models, BigQuery, Apache Maven, Scripting, Datasets, Algorithms, Predictive Analytics, Database Management, Big Data Analytics, Jira (Agile), Six Sigma

Certifications

• Career Essentials in Data Analysis (Microsoft)
• AWS Certified Cloud Practitioner
• KPMG Data Analytics Virtual Internship
• Microsoft Power BI Internship (PwC)
• Data Engineering with AWS (LinkedIn)
• Programming, Data Structures and Algorithms using Python (IIT Madras)
• Python for Data Science (IBM)
• Advanced SQL for Data Science