
Data Python

Location:
West Lafayette, IN
Posted:
September 19, 2020


Resume:

SAI TARUN REDROUTHU

adf738@r.postjobfree.com | github.com/saitarun95 | +1 (314) 384-2577 | Lafayette, Indiana, 47906

SKILLS:

Languages: Python (Scripting, Pandas, NumPy, SciPy), SQL, R, HTML, CSS, Java, JavaScript, C, C++
Platforms and tools: AWS Cloud, Apache Spark, MapReduce, Tableau, QuickSight, Power BI, Jupyter Notebook, PostgreSQL, Hive
Machine Learning: Regression, Decision Tree, KNN, SVM, Clustering (Hierarchical, K-Means, DBSCAN), Recommendation Systems

EDUCATION:

University of Illinois at Springfield – Master of Science in Computer Science, Aug 2018 – Dec 2019
Jawaharlal Nehru Technological University – Bachelor's in Information Technology, Aug 2013 – May 2017

WORK EXPERIENCE:

Apex IT Systems – Lafayette, Indiana, April 2020 – Present
Data Analyst Engineer (AWS, Python, SQL, R, Tableau)

• Performed data integration (ETL) into the data warehouse using AWS Redshift SQL, the Amazon S3 data store, Athena, and the AWS Glue catalog.

• Created ETL data pipelines in Python to integrate data from AWS cloud APIs, XML/Excel files, and SQL Server, performing insert/update/delete operations on the database after data processing and saving up to 10 hours a week (see the illustrative sketch at the end of this role).

• Analyzing data from third-party sources and determining strategies to ingest and enrich the existing database (SQL, Data Modeling, Data Warehousing, ETL).

• Reduced the number of cases raised daily by associates by creating an AWS QuickSight dashboard aimed at cutting operational costs by 3%.

• Delivering insights by designing reports, dashboards, and scorecards that drive operational efficiency, decision quality, and revenue opportunities (Python, SQL, Tableau).

• Created complex, custom SQL queries for large data sets and used R to manage and automate the data.
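Below is a minimal Python sketch of the kind of ETL pipeline described in the bullets above, assuming a JSON API, an Excel export, and a SQL Server staging table; the endpoint, file path, column names, and stored procedure are hypothetical rather than the actual Apex IT Systems implementation.

```python
# Minimal ETL sketch: pull rows from an API and an Excel export, clean them,
# and hand them to a SQL Server staging table for insert/update/delete merging.
# All endpoints, paths, column names, and database object names are placeholders.
import pandas as pd
import requests
from sqlalchemy import create_engine, text

API_URL = "https://example.com/api/orders"      # hypothetical source API
EXCEL_PATH = "exports/orders.xlsx"              # hypothetical Excel export
engine = create_engine(
    "mssql+pyodbc://user:pass@server/db?driver=ODBC+Driver+17+for+SQL+Server"
)

def extract() -> pd.DataFrame:
    api_df = pd.DataFrame(requests.get(API_URL, timeout=30).json())
    xlsx_df = pd.read_excel(EXCEL_PATH)
    return pd.concat([api_df, xlsx_df], ignore_index=True)

def transform(df: pd.DataFrame) -> pd.DataFrame:
    df = df.drop_duplicates(subset="order_id")              # hypothetical key column
    df["order_date"] = pd.to_datetime(df["order_date"])
    return df.dropna(subset=["order_id", "order_date"])

def load(df: pd.DataFrame) -> None:
    # Stage the batch, then let a stored procedure apply insert/update/delete.
    df.to_sql("orders_staging", engine, if_exists="replace", index=False)
    with engine.begin() as conn:
        conn.execute(text("EXEC dbo.merge_orders_from_staging"))  # hypothetical proc

if __name__ == "__main__":
    load(transform(extract()))
```

Staging the batch and letting the database apply the merge keeps the Python side simple and safe to rerun.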

Illinois Emergency Management Agency – Springfield, Illinois, Feb 2019 – Dec 2019
Data Engineer (AWS, Python, SQL, R, Tableau)

• Performed load operations to migrate data from MS Excel to a cloud MySQL database, which sped up result generation by 46%, improving overall performance and shortening the project timeline (ETL).

• Built automated data pipelines for Data Extraction, Transformation and Loading (ETL) using Python and SQL scripting to migrate data from one subsystem to another.

• Performed data analysis (wrangling, refinement, and visualization) in Jupyter Notebook using Python on a dataset of Illinois county house sale prices from 2014 to 2018, containing over 2 million records, to surface insights (see the illustrative sketch at the end of this role).

• Developed a Python data streaming pipeline from S3 to the data warehouse.

• Programmed complex SQL queries using multiple joins, nested queries, and analytical functions to extract, manipulate, and analyze data based on project requirements (SQL).

• Built a KPI dashboard in Tableau to track houses across all Illinois counties affected by radon gas; presented results to senior leadership for review, accelerating business decisions by 33%.

• Developed advanced dashboards in RStudio (ggplot2) using R to investigate and conduct forecasting studies.
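A minimal sketch of the wrangling and visualization step on the house-sale dataset mentioned above, assuming a CSV extract with county, sale_price, and sale_date columns; the file name, column names, and example counties are assumptions, not the agency's actual schema.

```python
# Sketch: wrangle, refine, and visualize Illinois house-sale data (2014-2018).
# The CSV path and column names (county, sale_price, sale_date) are assumptions.
import pandas as pd
import matplotlib.pyplot as plt

sales = pd.read_csv("il_house_sales_2014_2018.csv", parse_dates=["sale_date"])

# Refinement: drop malformed rows and implausible prices.
sales = sales.dropna(subset=["county", "sale_price"])
sales = sales[sales["sale_price"].between(1_000, 5_000_000)]

# Wrangling: median sale price per county per year.
sales["year"] = sales["sale_date"].dt.year
medians = (
    sales.groupby(["county", "year"])["sale_price"]
         .median()
         .unstack("county")
)

# Visualization: trend lines for a few example counties.
medians[["Cook", "Sangamon", "Champaign"]].plot(marker="o")
plt.ylabel("Median sale price (USD)")
plt.title("Median house sale price by county, 2014-2018")
plt.show()
```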

Amazon – India, Jul 2017 – Aug 2018
Data Associate (Excel, SQL, Tableau)

• Worked alongside product managers to prepare and analyze density charts using advanced Excel features such as VLOOKUP and pivot tables to present projections for the next product launch.

• Worked on large data volumes of over 5 million records, extracting the required data and filtering out unwanted records using SQL queries (see the illustrative sketch at the end of this role).

• Created Tableau dashboards (scatter plots, pie charts, bar graphs) for daily and monthly reports summarizing Alexa's performance.
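A minimal Python sketch, under stated assumptions, of how a filtered slice might be pulled out of a multi-million-row table in chunks so the full 5 million+ rows never sit in memory at once; the connection string, table, and column names are placeholders rather than the actual Amazon schema.

```python
# Sketch: extract only the needed rows from a very large table, chunk by chunk.
# Connection string, table name, and columns are hypothetical placeholders.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql://user:pass@host/warehouse")  # hypothetical

query = """
    SELECT device_id, utterance_date, intent, response_latency_ms
    FROM alexa_interactions
    WHERE utterance_date >= '2018-01-01'
      AND intent IS NOT NULL
"""

chunks = []
for chunk in pd.read_sql(query, engine, chunksize=100_000):
    # Keep only the rows needed for the report; discard the rest immediately.
    chunks.append(chunk[chunk["response_latency_ms"] < 2_000])

filtered = pd.concat(chunks, ignore_index=True)
filtered.to_csv("alexa_report_extract.csv", index=False)
```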

PROJECTS:

• Loan Payment Prediction (Machine Learning): Built classifier models to predict whether a loan will be paid off; cleaned and normalized the data, applied KNN, Decision Tree, SVM, and Logistic Regression, and compared the resulting accuracy of each classifier (Python; see the illustrative sketch after this list).

• International Migration: Explored datasets on migration to Canada from 1980 to 2013 with pandas, performed data wrangling and filtering based on the criteria, visualized the data with Matplotlib in Jupyter Notebook, and compared the trends of the top 5 source countries (Python; see the illustrative sketch after this list).
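For the Loan Payment Prediction project, the following is a minimal scikit-learn sketch of the classifier comparison described above; the CSV path, feature columns, and target column are assumptions rather than the project's actual dataset.

```python
# Sketch of the classifier comparison for the loan payoff prediction task.
# The CSV path, feature columns, and target column are hypothetical.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

df = pd.read_csv("loan_data.csv")                      # hypothetical cleaned dataset
X = df[["principal", "terms", "age", "income"]]        # hypothetical features
y = df["paid_off"]                                     # hypothetical binary target

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Normalize features so distance/margin-based models (KNN, SVM) behave sensibly.
scaler = StandardScaler().fit(X_train)
X_train, X_test = scaler.transform(X_train), scaler.transform(X_test)

models = {
    "KNN": KNeighborsClassifier(n_neighbors=5),
    "Decision Tree": DecisionTreeClassifier(max_depth=4),
    "SVM": SVC(kernel="rbf"),
    "Logistic Regression": LogisticRegression(max_iter=1000),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    print(f"{name}: accuracy = {accuracy_score(y_test, model.predict(X_test)):.3f}")
```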
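For the International Migration analysis, the following is a minimal pandas/Matplotlib sketch under assumed inputs: a country-by-year table of arrivals to Canada from 1980 to 2013; the file name and column layout are hypothetical, not the exact dataset used.

```python
# Sketch: rank source countries by total arrivals to Canada and plot the top 5.
# Expected input: one row per country, one column per year (1980-2013); this
# file name and layout are assumptions, not the project's actual dataset.
import pandas as pd
import matplotlib.pyplot as plt

year_cols = [str(y) for y in range(1980, 2014)]

mig = pd.read_csv("canada_immigration_1980_2013.csv").set_index("country")

# Wrangling/filtering: rank countries by total arrivals and keep the top 5.
mig["total"] = mig[year_cols].sum(axis=1)
top5 = mig.nlargest(5, "total")

# Visualization: compare the trend lines of the top 5 source countries.
top5[year_cols].T.plot(figsize=(10, 5))
plt.xlabel("Year")
plt.ylabel("Immigrants to Canada")
plt.title("Top 5 source countries, 1980-2013")
plt.show()
```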


