Subramanian Shanmugavel Last Updated on **th January 2020
Master of Science in Data Science
Dec 2017 Bloomington, IN
Master in Information Technology
Apr 2011 Tamil Nadu, India
B.Tech in Information Technology
May 2009 Tamil Nadu, India
Applied Machine Learning
Bayesian Data Analysis
Exploratory Data Analysis
Statistics in Data Science
AWS/Hadoop in Big Data
Advanced Database Systems
Python • SAS • R • SQL • Teradata SQL •
Oracle (PL/SQL) • MySQL • MongoDB •
PostgreSQL • Snowflake •
Spark • Databricks • Hive • MS Excel •
Tableau • Google Analytics • AWS
Hypothesis Testing • ANOVA • T-Test •
Machine Learning • Logistic Regression •
Random Forest • Support Vector Machine •
KNN • Naive Bayes • PCA
Data Science Libraries:
Scikit-learn • pandas • NLTK • TensorFlow •
seaborn • ggplot2 • plotly • plyr • ggplot
Tableau Training For Data Science
Python and SQL for Business Analysis
Statistics for Data Science
Hands-On Hadoop - Tame your Big Data!
SENIOR DATA ANALYST CAPITAL ONE
Sept 2018 – Present Virginia
• Designed and implemented a daily Tableau dashboard on sales and acquisitions of Walmart credit cards, with Capital One and Walmart fiscal-calendar views across all commercial LOBs.
• Gathered metrics and performed data mapping on TSYS fields to produce aggregated acquisition, transaction, rewards, and store-level reports for executive presentations.
• Migrated data from Oracle, S3, Teradata, and Redshift into the target Snowflake warehouse using Databricks and Catapult.
• Applied expertise in data quality, data organization, metadata, and profiling.
• Responsible for the design, development, and administration of transactional and analytical data structures.
• Worked with very large data sets, typically in excess of hundreds of millions of records.
• Created reports and dashboards in Tableau and published them to the server.
• Environment: Snowflake SQL, R, Teradata, SAS, Databricks, Python, Tableau, Redshift, BTEQ scripts
SENIOR DATA ANALYST INSTRUCTURE (UITS)
Aug 2016 – Feb 2018 Bloomington
• Delivered insights into student performance to improve retention using SQL and Tableau.
• Moved data between production systems and across multiple platforms.
• Integrated data sources such as Oracle, MS SQL, Amazon Redshift, web services, and Box.com using Denodo.
• Responsible for data access and delivery technologies.
• Aided in website optimization by performing A/B Testing of the page variants.
• Environment: SQL, Python, R, Tableau, Hive, Spark, Denodo, Redshift
DATA ANALYST DYNAMIC SOLUTIONS
Sept 2014 – Apr 2016 Tamil Nadu, India
Applied Data Finance
• Job Description: The objective of the project was to build an algorithm to improve the bank's efficiency by reducing the default rate on loan offers.
• Implemented advanced SQL analytical functions: LEAD/LAG, OVER, PARTITION BY, DENSE_RANK, ROW_NUMBER, windowed aggregate functions, and FIRST_VALUE/LAST_VALUE.
• Collected modeling data with SQL by querying several tables; the extracted tables were then appended or merged in RStudio to create data sets for modeling.
• Presented progress at each stage of report development.
• Environment: SQL, Python, Tableau, Scikit-learn, Matplotlib, Pandas, Seaborn, SSIS, SQL Server, Statistics, AWS, Hadoop
• An Adaptive Approach to Tamil Character Recognition using Deep Learning
• Improving Restaurants on Yelp by Sentiment Analysis of Reviews
• MEDLINE Clustering Analysis using R
• Airbnb and Zillow Data Analysis using R and Tableau
• IMDb Score Analysis with Data Mining Algorithms using Python