PROFESSIONAL SUMMARY:
Around * years of experience as a professionally qualified Data Scientist/Data Analyst/Tableau Consultant, covering Machine Learning, Deep Learning, Data Mining, Statistical Analysis, and predictive model building. Worked across the end-to-end data science project life cycle, including data extraction, data cleaning, statistical modeling, and data visualization with large sets of structured and unstructured data.
Around 6 years as a professionally qualified Data Scientist working in Machine Learning, Data Mining, and Statistical Analysis for Artificial Intelligence solutions.
Expert-level knowledge of learning algorithms such as inverse reinforcement learning, deep reinforcement learning, and probabilistic inference for decision support systems and predictive model building.
Worked as an independent yet integral team member developing innovative AI/ML solutions for highly automated vehicles alongside world-leading autonomous-vehicle experts, dealing with telematics data and video.
Skilled in Advanced Regression Modelling, Correlation, Multivariate Analysis, Model Building, Business Intelligence tools and application of Statistical Concepts.
Worked with applications such as MATLAB and SPSS to develop neural networks and cluster analyses.
Extensive experience in Text Analytics, developing statistical machine learning and data mining solutions to various business problems and generating data visualizations using R and Python.
Demonstrated strong working knowledge of transfer learning.
Highly knowledgeable in implementing row-level security, ensuring the Tableau infrastructure conforms to security standards and that appropriate access controls such as permissions and authentication are in place.
Versatile in analyzing complex data sets to develop meaningful, business-driven visuals using Tableau or other data visualization tools to meet users' requirements.
Extensive experience using T-SQL/PL-SQL for developing complex and well-tested Stored Procedures, Triggers, Tables, Views, User-Defined Functions, Relational Database Models, Data Integrity, SQL Joins, and Indexing.
Experienced in developing and maintaining a formal description of the data and data structures, including data models (star schema, snowflake schema, and dimensional modeling), data flow diagrams, data dictionaries, and technical metadata.
Knowledgeable in Full Life Cycle Development of reporting projects, including requirements gathering/analysis, design, development, testing, and production rollover.
Automated recurring reports using SQL and Python and visualized them on BI platforms such as Tableau, creating line and scatter plots, bar charts, histograms, pie charts, dot charts, box plots, time series, error bars, multiple chart types, multiple axes, subplots, etc.
Performed A/B testing, including hypothesis testing, for eCommerce and B2B solutions to resolve visitor pain points (a hedged Python sketch of such a test follows this summary).
Updated Python scripts to match training data against our database stored in AWS CloudSearch so that each document could be assigned a response label for further classification.
Knowledge of C/C++, including thread synchronization, multithreading, multi-processing, concurrency, and TCP/IP socket programming.
Passionate about gleaning insightful information from massive data assets and developing a culture of sound, data-driven decision making. Proficient in tuning and performance improvement of Tableau dashboards, new dashboard development, troubleshooting existing dashboards, and performance management.
Taking responsibility for technical problem solving, creatively meeting product objectives and developing best practices.
Excellent verbal and written communication skills for working with clients and teams and for preparing and delivering effective presentations.
Ability to maintain a fun, casual, professional, and productive team atmosphere.
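Below is a minimal, hedged sketch of the kind of A/B hypothesis test referenced in this summary, assuming a simple two-variant conversion experiment; the variant counts and the 5% significance level are illustrative assumptions, not project data.

```python
# Hypothetical two-proportion z-test for an A/B conversion experiment.
# The counts below are illustrative, not real project data.
from statsmodels.stats.proportion import proportions_ztest

conversions = [310, 265]   # conversions observed for variant A and variant B
visitors = [5000, 5000]    # visitors exposed to each variant

# H0: both variants convert at the same rate; H1: the rates differ.
z_stat, p_value = proportions_ztest(count=conversions, nobs=visitors)
print(f"z = {z_stat:.3f}, p = {p_value:.4f}")

if p_value < 0.05:
    print("Reject H0: the variants convert at significantly different rates.")
else:
    print("Fail to reject H0: no significant difference detected.")
```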
EDUCATION
IASE Deemed University Mar 2004-June 2005
Master of Science – Computer Science Rajasthan, India
OSMANIA University Mar 2001-Aug 2002
PGDMISCA Hyderabad, Telangana, India
OSMANIA University Aug 1997-June 2000
Bachelor of Science – Math/Statistics/Computer Science Hyderabad, Telangana, India
TECHNICAL SKILLS:
Tableau Desktop 7/8/9/10/2018/2019/2020/2021
Tableau Server/Online 7/8/9/10/2018/2019/2020/2021
Microsoft SQL Server: MS SQL Server Integration Services (SSIS), MS SQL Server Reporting Services (SSRS), MS SQL Server Analysis Services (SSAS)
ETL – Alteryx, SSIS, SQL Stored Procedures, and Python
T-SQL, PL-SQL, OLAP, OLTP
Erwin Data Modeler, Microsoft Visual Studio, Microsoft Access, Microsoft Office (MS Word/Excel/PowerPoint), MS Project
Programming: JavaScript, HTML, CSS, XML
PROFESSIONAL EXPERIENCE:
Client: Vertex March 2021 – Present
Role: Tableau developer/Big data
Responsibilities:
Deeply involved in meeting with business stakeholders and SMEs during requirements gathering and during different phases of the projects.
Created dashboards/reports in Tableau to help answer questions surrounding KPIs and also help identify trends in data
Expertly employed various workarounds to meet functional and non-functional requirements
Collaborated with business users to analyze requirements, determine scope of projects, and address concerns of business questions
Effectively migrated dashboards from Power BI, MicroStrategy, and Cognos into Tableau
Expertly gathered and documented business requirements in a BRD, conducted analysis and recommended solution options
Structured dashboards with a consistent layout, placing visual charts at the top and corresponding crosstab data at the bottom
Built dashboards with floating objects, capitalizing on dashboard action features such as URL actions, images, and web integrations
Conceived visualizations, including dashboards, flowcharts, and graphs, to relay business concepts to stakeholders
Performed various data analytics in SQL and MS Excel by deploying statistical models or industry-accepted tools
Carried out a wide range of performance tuning processes in SQL and Tableau to make dashboard visualization faster
Created and maintained SQL queries, indexes, and complex queries for data analysis and extraction (a hedged sketch of the aggregation-first tuning approach follows this list)
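Below is a minimal, hedged sketch of the aggregation-first tuning approach referenced above, assuming a SQL Server source read through SQLAlchemy and pandas; the connection string, table, and column names are hypothetical placeholders, not the client's actual schema.

```python
# Hypothetical sketch: pre-aggregate in SQL so the Tableau data source stays small.
# Connection string, table, and column names are placeholders.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine(
    "mssql+pyodbc://user:password@dw_server/sales_dw"
    "?driver=ODBC+Driver+17+for+SQL+Server"
)

# Aggregate to the grain the dashboard actually displays (month x region)
# instead of pulling row-level transactions into Tableau.
query = """
    SELECT
        DATEFROMPARTS(YEAR(order_date), MONTH(order_date), 1) AS order_month,
        region,
        SUM(sales_amount)        AS total_sales,
        COUNT(DISTINCT order_id) AS order_count
    FROM dbo.fact_orders
    GROUP BY DATEFROMPARTS(YEAR(order_date), MONTH(order_date), 1), region
"""

summary = pd.read_sql(query, engine)
# Write a compact file that a scheduled Tableau extract refresh can pick up.
summary.to_csv("monthly_sales_summary.csv", index=False)
```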
Client: AT&T, Middletown, NJ Sep 2019 – March 2021
Role: Data Scientist/Machine learning
Responsibilities:
Built various recommendation engines, predictive models, real-time analytics, and price optimization solutions.
The data consisted mostly of one log entry per problem ticket; filtered unwanted characters out of the text ticketing data.
Used NLP techniques with NLTK, regular expressions, Keras, and TensorFlow for text preprocessing.
Built and applied an LSTM classifier after cleaning the data (a hedged sketch follows this section).
Analyzed performance with fine-tuning techniques and compared the LSTM model against RNN, GRU, and word-embedding approaches.
Performed data cleaning, feature scaling, and feature engineering using the pandas and NumPy packages in Python. Ensured the model had a low false positive rate, and performed text classification and sentiment analysis on unstructured and semi-structured data.
Designed, developed and tested applications for text processing, such as name or entity matching, text categorization/routing, named-entity extraction, sentiment analysis.
Applied artificial intelligence applications such as virtual agents, RPA, and text analytics.
Developed progress-focused statistical and quantitative reporting, supported long-standing initiatives, and devised dependable strategies for data operations.
Approach: Worked on employee data and applied a decision tree algorithm, treating the whole training set as the root. Built various recommendation engines, predictive models, real-time analytics, and price optimization using machine learning and deep learning techniques.
Environment: Python, R, Machine learning, Deep learning, RNN, NLP, Tableau, ETL, Spark, TensorFlow, SQL.
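Below is a minimal, hedged sketch of the LSTM text-classification step described in this section, assuming binary ticket labels after cleaning; the sample tickets, label scheme, vocabulary size, and hyperparameters are illustrative assumptions.

```python
# Hypothetical sketch of an LSTM ticket-text classifier in Keras.
# Sample tickets, labels, and hyperparameters are illustrative only.
import numpy as np
from tensorflow.keras.preprocessing.text import Tokenizer
from tensorflow.keras.preprocessing.sequence import pad_sequences
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, LSTM, Dense

tickets = ["printer not responding after update",
           "cannot log in to vpn from home office"]
labels = np.array([0, 1])  # e.g., 0 = hardware issue, 1 = access issue

# Tokenize the cleaned ticket text and pad to a fixed sequence length.
tokenizer = Tokenizer(num_words=10000, oov_token="<unk>")
tokenizer.fit_on_texts(tickets)
sequences = pad_sequences(tokenizer.texts_to_sequences(tickets), maxlen=50)

# Embedding -> LSTM -> sigmoid output for binary classification.
model = Sequential([
    Embedding(input_dim=10000, output_dim=64),
    LSTM(64),
    Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(sequences, labels, epochs=3, batch_size=32)
```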
Client: Automatic Data Processing (ADP.LLC), Parsippany, NJ May 2019 – Aug 2019
Role: Data Scientist
Description: ADP GlobalView Payroll System is an end-to-end global payroll solution that combines advanced technology with outstanding service to provide high visibility and control of distributed payroll and HR operations in every country.
Responsibilities:
Worked on portfolio risk modeling for a diversified system of mutual funds grouped together to provide an expected return with a corresponding amount of risk.
Assessed the risk of the funds invested in the chosen model payroll portfolio, including future contributions to be invested in that portfolio.
Applied distant supervision, a CNN model, and an attention mechanism to obtain a relation label for each company.
Performed Information Extraction using NLP algorithms coupled with Deep learning (ANN and CNN), Keras and TensorFlow.
Performed data mining, segmentation analysis, business forecasting, and association rule mining on large data sets with machine learning.
Supported the client by developing machine learning algorithms on Big Data using PySpark to analyze transaction fraud, perform cluster analysis, etc.
Approach: Built a named entity recognition model using NLP, consisting of a multilayer neural network that predicts categories such as name, organization, location, and miscellaneous fields (a hedged sketch follows this section). Analyzed the positive and negative sentiment of each given sentence and built a tree-based RNN language model to analyze attitudes in reviews and identify customers' likes and dislikes.
Environment: Python, R, TensorFlow, Machine learning Algorithms, Deep learning, ETL, NLP, Spark, Tableau, AWS, E-R studio, MS Excel.
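Below is a minimal, hedged sketch of the multilayer NER network described in the approach above, assuming token-level tags for name, organization, location, miscellaneous, and other; the tag set size, vocabulary size, and toy training arrays are illustrative assumptions.

```python
# Hypothetical sketch of a token-level NER tagger (NAME/ORG/LOC/MISC/O).
# Toy word-id sequences and tag ids are illustrative only.
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, Bidirectional, LSTM, TimeDistributed, Dense

VOCAB_SIZE, MAX_LEN, NUM_TAGS = 5000, 30, 5  # 5 tags: NAME, ORG, LOC, MISC, O

# Pre-padded word-id sequences and one tag id per token (toy data).
X = np.random.randint(1, VOCAB_SIZE, size=(8, MAX_LEN))
y = np.random.randint(0, NUM_TAGS, size=(8, MAX_LEN))

model = Sequential([
    Embedding(input_dim=VOCAB_SIZE, output_dim=64),
    Bidirectional(LSTM(64, return_sequences=True)),          # context from both directions
    TimeDistributed(Dense(NUM_TAGS, activation="softmax")),  # one tag prediction per token
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(X, y, epochs=2, batch_size=4)
```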
Client: Nationwide Mutual Insurance, Phoenix, AZ April 2017 – May 2019
Role: Data Scientist
Description: Nationwide Mutual Insurance Company and its affiliated companies form a group of large U.S. insurance and financial services companies based in Columbus, OH.
Responsibilities:
Utilized Spark, MLlib, PyTorch, and a broad variety of machine learning methods, including classification, regression, dimensionality reduction, and clustering, to identify volume using the scikit-learn package in Python; utilized the engine to increase user lifetime by 45% and triple user conversions for target categories.
Performed data profiling to learn about behavior across various features such as traffic pattern, location, date, and time.
Analyzed traffic patterns by calculating autocorrelation at different time lags (see the sketch following this section).
Communicated the results to the operations team to support the best decisions.
Approach: Provided a software solution by building a predictive model on cancer diagnosis data and delivered a front-end application. Later worked with software-based B2C companies, splitting the offering into a service-focused, software-based B2B model and a product-focused, software-based B2C model.
Environment: Python, Machine learning, NLP, HDFS, SQL, AWS, Spark, ETL, and Tableau Desktop.
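Below is a minimal, hedged sketch of the lagged autocorrelation analysis mentioned above, using a synthetic hourly traffic-volume series as an illustrative stand-in for the real data.

```python
# Hypothetical sketch: autocorrelation of a traffic-volume series at several lags.
# The synthetic hourly series below is illustrative only.
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
hours = pd.date_range("2018-01-01", periods=24 * 60, freq="H")
# Synthetic traffic with a daily (24-hour) cycle plus noise.
traffic = pd.Series(
    100 + 20 * np.sin(2 * np.pi * np.arange(len(hours)) / 24)
    + rng.normal(0, 5, len(hours)),
    index=hours,
)

# Autocorrelation at selected lags; a peak near lag 24 suggests a daily pattern.
for lag in (1, 6, 12, 24, 48):
    print(f"lag {lag:>2} h: autocorrelation = {traffic.autocorr(lag=lag):.3f}")
```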
Client: DXC Technology, Chennai, India Oct 2009– Jan 2011
Role: SQL Developer
Description: Responsible for verifying and implementing the detailed technical design solution to the problem as identified by the Project/Technical Manager.
Responsibilities:
Created executive dashboards using stacked bars, bar graphs, scatter plots, geographical maps, and Gantt charts, and published them to Tableau Server
Developed complex mappings and workflows in accordance with Business requirements
Employed strong SQL skills for analysis, validation, and verification of data.
Successfully worked with multiple sources; performed various LOD calculations, new dashboard development, troubleshooting of existing dashboards, and performance management
Developed SQL Queries to retrieve data from SQL Server Databases
Provided training and support to other dashboard developers
Successfully improved performance and tuning of Tableau data extracts and the SQL scripts used in those extracts (a hedged parameterized-query sketch follows this list)
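Below is a minimal, hedged sketch of the kind of parameterized retrieval query used to feed dashboard extracts; sqlite3 is used here only to keep the example self-contained (the actual environment was SQL Server with T-SQL), and the schema and values are hypothetical.

```python
# Hypothetical sketch of a parameterized dashboard query.
# sqlite3 keeps the example self-contained; production used SQL Server (T-SQL).
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Illustrative schema and rows standing in for a reporting table.
cur.execute("CREATE TABLE orders (region TEXT, order_date TEXT, amount REAL)")
cur.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [("East", "2010-05-01", 120.0),
     ("East", "2010-05-02", 80.0),
     ("West", "2010-05-01", 200.0)],
)

# Parameterized aggregation that a dashboard or extract can consume directly.
cur.execute(
    """
    SELECT region, SUM(amount) AS total_amount, COUNT(*) AS order_count
    FROM orders
    WHERE order_date BETWEEN ? AND ?
    GROUP BY region
    ORDER BY total_amount DESC
    """,
    ("2010-05-01", "2010-05-31"),
)
for region, total, count in cur.fetchall():
    print(region, total, count)
conn.close()
```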