Post Job Free

Information Technology Data Analytics

Location:
Katy, TX
Posted:
November 12, 2024


Resume:

VARIJA YANAMADALA

******************@*****.*** 623-***-**** https://www.linkedin.com/in/vy05/

PROFESSIONAL SUMMARY

Analytical Engineer and Research Developer with 5 years of experience in data workflow optimization and scalable platform development. Completed a Master's in Information Technology, specializing in data tools and technologies.

EDUCATION

Arizona State University, Ira A. Fulton Schools of Engineering, Mesa, AZ
Master's in Information Technology (IT)
Concentrations: Data Analytics, Cloud Technologies, Big Data, Data Science
GPA: 4.0 / 4.0

TECHNICAL SKILLS

Programming Languages: Python, Scala, C#, C, Java
Python Libraries: NumPy, Pandas, Matplotlib, Keras, Seaborn
Cloud: AWS, Google Cloud Platform, Snowflake
Data: SQL, PostgreSQL, MySQL, MongoDB, Databricks, Apache Spark
Vision Libraries: OpenCV, TensorFlow
Developer Tools: Git, Jupyter Notebook, Spyder, Visual Studio, AWS, GCP
Data Governance & Quality: Experienced in establishing data quality checks, governance standards, and data validation
Visualization Tools: MS Excel, Tableau, Power BI, Alteryx, QuickSight

PROFESSIONAL EXPERIENCE

Research Developer – Arizona State University, Tempe, AZ August 2024 – Present

• Spearheaded the development of key features for a streaming platform using Python and React, implementing OTP-based account migration and login activity logs to enhance security and reliability.

• Engineered data pipelines using PySpark and Airflow, targeting a 25% improvement in data processing efficiency as the project scales.

• Established robust data protection measures by applying CI/CD checks, IAM roles, and AWS KMS, laying a foundation for scalable, secure platform growth in preparation for industry presentation.

Analytical Engineer – Elevance Health, Indianapolis, USA May 2023 – May 2024

• Elevated demand planning accuracy by 20% and enhanced data integrity through real-time forecasting and automated anomaly detection, leveraging machine learning tools like Python, TensorFlow, and Databricks.

• Optimized data pipelines handling over 5TB daily using Apache Spark and Databricks, reducing query response times by 30% and securing reliable data delivery for stakeholders.

• Facilitated migration to Google Cloud Platform (GCP) and integrated Tableau dashboards with APIs, achieving a 20% boost in query speed, a 15% reduction in storage costs, and improved data visibility, enabling informed, timely decisions.

Business Analyst – Jeevamrut Foods, India June 2020 – July 2022

• Automated core business processes with Python (Pandas, Selenium), reducing manual work by 30% and enhancing operational efficiency.

• Enhanced backend performance through SQL optimization, achieving a 20% increase in system speed and a 25% boost in customer orders.

• Developed RESTful APIs and automated ETL workflows using Flask and Apache Airflow, improving platform integration and reducing food waste by 10%.

Data Analyst – Innomatics, India June 2019 – June 2020

• Streamlined data processing workflows with SQL, Python, and Tableau, increasing speed by 25% and reducing retrieval times by 30% through a centralized Snowflake data warehouse, improving operational efficiency and reporting precision.

• Enhanced demand forecasting accuracy through data analysis, enabling data-driven decision-making in supply chain processes, contributing to business growth.

PROJECTS

Heart Disease Prediction using Python and scikit-learn May 2024 – June 2024

• Designed a machine learning model for heart disease prediction with 95% accuracy, providing actionable healthcare insights. Utilized EDA and multiple algorithms (SVM, RF, KNN, DT), along with advanced data preprocessing techniques.

Uber Data Analytics and ETL Pipeline Development using GCP and BigQuery May 2023 – April 2023

• Utilized GCP technologies and Python to analyze Uber TLC Trip Record Data, employing tools such as GCP Storage, Compute Instance, BigQuery, Looker Studio, and Mage Data Pipeline Tool to enhance operational efficiencies.

• Optimized route planning and improved service delivery by generating deeper insights into pickup/drop-off patterns, fare structures, and passenger demands.


