Tampa firstname.lastname@example.org 813-***-**** LinkedIn GitHub
• Seeking full-time opportunities in Data Science, Data Analytics, Data Engineering, and Business Analytics.
• 4 years of experience in Data Analytics, SQL, and Networking.

EDUCATION
University of South Florida, Tampa, FL May 2020
Master of Science in Business Analytics and Information Systems
University of Mumbai, Mumbai, India June 2014
Bachelor of Engineering, Computer Science
TECHNICAL SKILLS
Programming Languages: Python, R, C#, Java, PHP
Big Data: Hadoop, Spark, Cassandra
Software Tools: SecureCRT, PuTTY, Salesforce, GitHub, VB, Excel, WordPress, GIS (ArcGIS)
Visualization Tools: Tableau, Power BI
Databases: PostgreSQL, Oracle, MongoDB, PL/SQL, Hive, Impala, SQL Developer

WORK EXPERIENCE
CareerSource Tampa Bay, Tampa, FL Aug 2019 - May 2020
Teaching Assistant
• Developed course material in Data Analytics, Python, and Excel, incorporating current technologies and their business implications.
• Assisted students with questions in one-on-one meetings and over Skype, and graded their assignments.
• Promoted the courses globally through social media marketing and advertising at social events.

Infosys Ltd., Pune, India Feb 2017 - May 2018
• Queried client data in PostgreSQL, produced per-client statistical reports, and drew conclusions from them.
• Acted as the intermediary between the Program Manager and the development team.
• Developed, implemented, and maintained enterprise business information systems.

Tata Communications Ltd., Pune, India Feb 2015 - Feb 2017
Network Analyst
• Troubleshot network issues such as DNS unreachability, provided solutions, and helped the team resolve issues 20% more efficiently.
• Worked primarily with routers, and ran SQL queries as needed to produce Excel reports tracking client expenditure.
• Worked on support tickets, maintenance tasks, and projects assigned at the discretion of the Reporting Coordinator.

PROJECTS
Data Science Programming (Retinal Image Classification in Diabetic Patients to Detect the Need for Surgery)
• Analyzed a Kaggle dataset to determine whether diabetic patients with retinopathy need surgery. The dataset consisted of around 4K training and 2K test images.
• Used Python to load the images into arrays and, after preprocessing, compared classifiers such as Gaussian Naive Bayes, SVC, and Decision Tree to find the best-performing model.
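A minimal sketch of this model comparison and tuning step, with random arrays standing in for the Kaggle retinal images (shapes and parameter grid are assumptions, so the printed scores are not meaningful):

```python
# Sketch of the classifier comparison described above; random arrays stand in
# for the flattened retinal images.
import numpy as np
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 64))        # stand-in for flattened image arrays
y = rng.integers(0, 2, size=200)      # stand-in labels: surgery needed or not
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Compare the three classifiers mentioned in the project.
for model in (GaussianNB(), SVC(), DecisionTreeClassifier(random_state=0)):
    print(type(model).__name__, model.fit(X_train, y_train).score(X_test, y_test))

# Hyperparameter tuning for the SVC; the actual grid used is not stated here.
grid = GridSearchCV(SVC(), {"C": [0.1, 1, 10], "kernel": ["linear", "rbf"]}, cv=3)
grid.fit(X_train, y_train)
print(grid.best_params_)
```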
• Performed hyperparameter tuning, with the SVC classifier achieving the best accuracy (65%).

Data Visualization (HART & Bull Runner Usage Over the Years), CUTR (Center for Urban Transportation Research)
• Visualized on-campus survey data in Power BI to answer key transport questions, helping the university's transportation research center address transit issues on campus.
• Used a range of Power BI charts, including funnel, donut, and bar/column charts, to derive results.

Independent Research (Removing Bias in Data Using the COMPAS Dataset)
• Analyzed the COMPAS recidivism dataset, distributed with IBM's AIF360 fairness toolkit, which is used to study recidivism predictions across demographic groups.
• Applied the Disparate Impact Remover algorithm and used Logistic Regression to model the transformed data and derive results.
• Achieved roughly 85% accuracy while successfully mitigating bias in the data.
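The bias metric at the core of this project can be sketched as follows; the data here is synthetic (the project used COMPAS), and the 0.8 cutoff mentioned in the comment is the common "four-fifths" rule of thumb, not something stated above:

```python
# Sketch of a disparate-impact check: compare positive-prediction rates
# between groups after fitting the Logistic Regression mentioned above.
# Synthetic data; group 0 is treated as unprivileged, group 1 as privileged.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
df = pd.DataFrame({
    "group": rng.integers(0, 2, 500),   # protected attribute
    "score": rng.normal(size=500),      # illustrative feature
})
df["label"] = ((df["score"] + 0.4 * df["group"]) > 0).astype(int)

model = LogisticRegression().fit(df[["group", "score"]], df["label"])
pred = model.predict(df[["group", "score"]])

# Disparate impact = P(pred = 1 | group 0) / P(pred = 1 | group 1);
# a ratio below ~0.8 is commonly read as evidence of disparate impact.
rates = pd.Series(pred).groupby(df["group"]).mean()
print(round(rates[0] / rates[1], 2))
```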
• Used Python packages including scikit-learn, matplotlib, NumPy, and pandas.

Data Analytics (Auto Insurance)
• Built regression models in R to understand how insurance companies estimate customer lifetime value from factors such as vehicle type and cost.
• Evaluated the models with hypothesis testing (ANOVA, Chi-Square, t-test) and identified the best model in the lot for prediction; used libraries such as ggplot2, randomForest, and dplyr.
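The project itself was done in R; a rough Python equivalent of this OLS/Logit modeling, on invented auto-insurance-style features and coefficients, might look like:

```python
# Python sketch of the OLS / Logit pairing used in this project (the original
# work was in R); the features, coefficients, and outcome are invented.
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

rng = np.random.default_rng(2)
n = 300
vehicle_cost = rng.uniform(5_000, 60_000, n)
vehicle_age = rng.integers(0, 15, n).astype(float)
X = np.column_stack([vehicle_cost, vehicle_age])

# OLS: customer lifetime value as a linear function of the features.
clv = 0.1 * vehicle_cost - 200.0 * vehicle_age + rng.normal(0, 500, n)
ols = LinearRegression().fit(X, clv)
print("R^2:", round(ols.score(X, clv), 3))

# Logit: a binary outcome (e.g. policy renewal) driven by the same features;
# features are scaled so the solver converges cleanly.
renews = (clv > np.median(clv)).astype(int)
Xs = X / X.std(axis=0)
logit = LogisticRegression(max_iter=5000).fit(Xs, renews)
print("accuracy:", round(logit.score(Xs, renews), 3))
```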
• Applied machine learning algorithms (OLS, Logit) to the models and produced results with 90% accuracy.

Data Visualization (Global Drug Consumption)
• Analyzed a UN dataset and built charts in Tableau depicting global drug flows.
• Used time-series graphs, along with maps and pie charts, to draw conclusions from the dataset.
• Followed an ETL process: formatted columns with pandas, regenerated the Excel sheet, and published the data to Tableau Server.
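The pandas formatting step in that ETL can be sketched roughly as follows; the UN-style column names are invented, and a CSV file stands in for the Excel sheet handed to Tableau:

```python
# Sketch of the pandas formatting step described above. The columns are
# invented, and CSV output stands in for the Excel / Tableau Server push.
import io
import pandas as pd

raw = io.StringIO(
    "country,year,seizures_kg\n"
    " india ,2019,1234\n"
    "France,2020,567\n"
)
df = pd.read_csv(raw)

# Standardize column names and string values before loading into Tableau.
df.columns = [c.strip().replace("_", " ").title() for c in df.columns]
df["Country"] = df["Country"].str.strip().str.title()

df.to_csv("clean_drug_data.csv", index=False)  # stand-in for the Excel sheet
print(df)
```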