Nalliboyina Sai Srinivasa Yadav
Data Analyst
Contact: saisrinivasayadav@gmail.com
OBJECTIVE: Diligent data analyst with 6+ years of experience and extensive knowledge of SPSS, SQL, Python, R, and Tableau. Facilitates business growth through data management optimization and SQL queries. Proven record of cost reduction, sales increases, and successful product launches. Collaborative leader adept at preparing and presenting findings to executives and key stakeholders.
PROFESSIONAL SUMMARY:
• 6+ years in data analysis and visualization; skilled in developing and implementing data collection systems, identifying business needs, and creating predictive models that improve forecasting accuracy.
• Built informative, easy-to-read, shareable, and fully customizable dashboards and reports in Google Looker Studio.
• Combined Python and Tableau to create and modify Tableau worksheets and dashboards, including table-level calculations.
• Experience in conducting Joint Application Development (JAD) sessions for requirements gathering, analysis, and design, and Rapid Application Development (RAD) sessions to converge early on a design acceptable to the customer and feasible for the developers, limiting a project's exposure to the forces of change.
• Extensively worked with Teradata utilities such as BTEQ, FastExport, FastLoad, and MultiLoad to export and load data to and from source systems including flat files. Proficient in performance analysis, monitoring, and SQL query tuning using EXPLAIN PLAN, COLLECT STATISTICS, hints, and SQL Trace in both Teradata and Oracle.
• Hands-on experience with query tools such as TOAD, SQL Developer, PL/SQL Developer, and Teradata SQL Assistant; experienced in coding PL/SQL procedures, triggers, and packages.
• Strong knowledge of Power BI, importing data from sources such as SQL Server, Azure SQL DB, SQL Server Analysis Services (Tabular Model), and MS Excel. Implemented optimization techniques for better performance on both the ETL side and the database side.
• Designed and developed Power BI graphical and visualization solutions from business requirement documents, including plans for interactive dashboards.
• Experience in designing reports with rich UI, including tabular forms, matrix (cross-tab) reports, drill-downs, charts, and sub-reports, using Power BI.
• Implemented DAX functions for fact calculations, enabling efficient data visualization in Power BI.
• Developed SSIS packages to extract, transform, and load (ETL) data from SQL Server into the data warehouse.
• Built help-desk and messenger chatbots for clients using Google Dialogflow (a minimal webhook sketch follows this summary).
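A minimal sketch of the kind of Dialogflow webhook fulfillment behind these chatbots, written as a Python/Flask handler; the route, intent name, and reply text are hypothetical placeholders, not production code.

from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route("/webhook", methods=["POST"])  # hypothetical fulfillment endpoint
def webhook():
    req = request.get_json(silent=True) or {}
    # Dialogflow ES sends the matched intent under queryResult.intent.displayName.
    intent = req.get("queryResult", {}).get("intent", {}).get("displayName", "")
    if intent == "helpdesk.ticket.status":  # hypothetical intent name
        reply = "Your ticket is in progress. An agent will follow up shortly."
    else:
        reply = "Sorry, I did not catch that. Could you rephrase?"
    # Dialogflow reads the response text from fulfillmentText.
    return jsonify({"fulfillmentText": reply})

if __name__ == "__main__":
    app.run(port=8080)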
TECHNICAL SKILLS:
• Big Data: Hadoop, Hive, HDFS, Kafka
• BI Tools: Power BI (2.x), Tableau, Domo, Sisense
• Methodologies: Waterfall, Agile-Scrum
• Business Management Tools: MS Visio, UML, Rational RequisitePro, RTM, RM, TeamForge
• Other Tools: SQL Navigator, TOAD, T-SQL, Informatica PowerCenter, Denodo
• Process/Technologies: QuickTest Pro, AEM, JIRA, Rally, Azure Boards, TFS, VersionOne, HP ALM, HP AGM, APIs, Java, MS Visio, PeopleSoft, MS Dynamics, DbVisualizer Pro, TOAD Data Point
• Reporting Tools: Tableau, Alteryx, Adobe Analytics, Qlik Sense
• Google Apps Script: automation of Google Sheets, Google Docs, and other applications
• Databases: MS Access, SQL Server, Oracle, DB2, Teradata, MySQL
• Operating Systems: Windows 95/98/2000/XP
• Software Suites: Microsoft Office
EDUCATION:
• Master's degree, Lindsey Wilson College, Virginia, USA
PROFESSIONAL EXPERIENCE:
Client: Capital One Financial Corporation, Tysons, Virginia, USA Sep 2023 - Present
Role: Senior Data Analyst
Description: Capital One is a financial services company specializing in credit cards, banking, auto loans, and savings accounts. Played a key role in developing advanced statistical and machine learning models, managing data collections, and creating insightful reports and visualizations to drive decision-making and enhance client outcomes.
Responsibilities:
• Coordinate data collection with external partners and manage datasets internally; proficiently use statistical programming software such as STATA, Python, SPSS, R, and SAS (STATA preferred) to create statistical, regression, and econometric models.
• Generate statistical models and advanced code for conducting student outcome analyses; create, restructure, and manage large datasets, files, and systems.
• Conduct qualitative and quantitative research and use mixed-method approaches to data analysis.
• Write technical memos that document data processing decisions and summarize the quality of data.
• Analyze and synthesize research findings into key insights for stakeholders; create research papers and policy briefs; assist in the design, implementation, and enforcement of data infrastructure, security, and system policies.
• Applied advanced classification algorithms such as Decision Tree, Random Forests, SVM, and Gradient Boosting to training data using the scikit-learn and NLTK packages in Python (see the sketch at the end of this role).
• Created Docker images based on Dockerfiles and pushed them to Docker Hub repositories.
• Create data cleanup routines using Google Apps Script and advanced Google Sheets formulas for use in Google Data Studio; write custom Google Sheets utilities to send HTML-formatted email based on sheet data.
• Work in Power BI to import data from various sources such as SQL Server, Azure SQL DB, SQL Server Analysis Services (Tabular Model), and MS Excel.
• Analyzed the client's customer data to uncover useful patterns for prediction and coded a machine learning solution with Python (Anaconda) to predict customer behavior.
• Built help-desk and messenger chatbots for clients using Google Dialogflow and AWS Lex.
• Guided clients' end users in using Power BI and Tableau: developing and publishing dashboards, sharing them with colleagues, and embedding them on their websites; created custom visualizations using R.
• Used Python with scikit-learn and TensorFlow to build a linear regression model to quantify the relationship between product attributes and sales.
Environment: MySQL, DbVisualizer Pro, Salesforce, APIs, Agile Scrum, Jira Board, ServiceNow, MS Office Tools, XML Notepad, UML, MS Visio.
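A minimal Python sketch of the classifier comparison described in this role, using a synthetic scikit-learn dataset in place of the real training data; the model settings are illustrative assumptions.

from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

# Synthetic stand-in for the real training data.
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

models = {
    "decision_tree": DecisionTreeClassifier(random_state=42),
    "random_forest": RandomForestClassifier(n_estimators=200, random_state=42),
    "svm": SVC(kernel="rbf"),
    "gradient_boosting": GradientBoostingClassifier(random_state=42),
}
for name, model in models.items():
    model.fit(X_train, y_train)  # fit on the training split only
    acc = accuracy_score(y_test, model.predict(X_test))
    print(f"{name}: test accuracy {acc:.3f}")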
Client: Markel Corporation, Richmond, Virginia, USA Apr 2022 - Aug 2023
Role: Data Analyst
Description: Markel Corporation is a global insurance company specializing in specialty insurance and reinsurance. Managed the development and deployment of predictive models and chatbots, leveraging data analysis and machine learning techniques to enhance customer insights and support.
Responsibilities:
• Predicted customer behavior: randomly split datasets into training and test data for cross-validation and to prevent overfitting, using Python with the scikit-learn package.
• Built machine learning models such as logistic regression for credit score prediction, attributing credit level from posterior probabilities in Python (sketched after this role).
• Applied advanced classification algorithms such as Decision Tree, Random Forests, SVM, and Gradient Boosting to training data using the scikit-learn and NLTK packages in Python.
• Used RMSE to evaluate how well models perform on these datasets in Python.
• Used Kubernetes (kubectl) to deploy the images to Google Cloud, describe pod information, and get deployment information.
• Coded a machine learning solution with Python (Anaconda) to predict customer behavior.
• Built help-desk and messenger chatbots for clients using Google Dialogflow and AWS Lex.
• Used Google Looker to design and implement impactful business visualizations, dashboards, and reports that deliver consumer insights.
• Created customer support chatbots using the Google Dialogflow Enterprise edition, with webhook fulfillment handled on Google Cloud Platform.
• Created chatbots with the help of Firebase that respond with photos and videos.
Environment: MySQL, DbVisualizer Pro, Salesforce, APIs, Agile Scrum, Jira Board, ServiceNow, MS Office Tools, XML Notepad, UML, MS Visio
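A minimal sketch, on synthetic data, of the logistic-regression credit-scoring step described above: a held-out test split guards against overfitting, and posterior probabilities from predict_proba drive the attributed credit level; the 0.5 cutoff is an illustrative assumption.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the real credit data.
X, y = make_classification(n_samples=2000, n_features=10, random_state=0)
# Random train/test split helps detect overfitting on unseen data.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
# Posterior probability of the positive ("good credit") class.
posterior = clf.predict_proba(X_test)[:, 1]
credit_level = np.where(posterior >= 0.5, "good", "risky")  # illustrative cutoff
print(f"test accuracy: {clf.score(X_test, y_test):.3f}")
print(credit_level[:5])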
Client: Unilever, Bangalore, India Dec 2019 - Nov 2021
Role: Data Analyst
Description: Unilever PLC is a British multinational fast-moving consumer goods company. Contributed to data profiling, ETL development, and data warehouse design, enhancing data management and reporting capabilities.
Responsibilities:
• Performed data profiling in the source systems required for the Dual Medicare Medicaid Data Mart.
• Documented the complete process flow describing program development, logic, testing, implementation, application integration, and coding.
• Responsible for ETL development with successful design, development, and integration of components within the Talend ETL Platform and Java Technology.
• Responsible for data management, removing duplicate records and storing data in the appropriate data warehouse using the Talend Platform for Data Management and MDM.
• Involved in defining the trumping rules applied by the Master Data Repository (MDR).
• Defined the list codes and code conversions between the source systems and the MDR.
• Worked with internal architects, assisting in the development of current- and target-state enterprise data architectures.
• Worked with project team representatives to ensure that logical and physical ER/Studio data models were developed in line with corporate standards and guidelines.
• Created data cleanup routines using Google Apps Script and advanced Google Sheets formulas for use in Google Data Studio.
• Wrote custom Google Sheets utilities to send HTML-formatted email based on sheet data.
• Created a basic Google Docs creator that converted Google Sheets data to a Google Doc and then to PDF for email distribution.
• Reverse-engineered all the source databases using Embarcadero.
• Involved in Data Warehouse and Data Mart design.
• Worked from existing legacy dashboards and new user requirements, replicating and enhancing existing dashboards in Google Looker.
• Experienced with various ETL and data warehousing tools and concepts.
• Performed data analysis and data validation by writing complex SQL queries using TOAD against the Oracle database (see the sketch below).
Environment: MS Excel, MS Access, Oracle 10g, UNIX, Windows XP, SQL, PL/SQL, PowerDesigner, Informatica, HDFS, Pig, Hive, MapReduce, Linux, HBase, Flume, Sqoop, MATLAB, SAS
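A minimal sketch of the SQL-based data validation described above, run from Python with the python-oracledb driver rather than TOAD; the connection details, table, and column names are hypothetical placeholders.

import oracledb  # assumes the python-oracledb driver is installed
import pandas as pd

# Hypothetical credentials and DSN; real values would come from a config store.
conn = oracledb.connect(user="analyst", password="...", dsn="dbhost/ORCLPDB1")
query = """
    SELECT member_id, COUNT(*) AS row_cnt
    FROM claims
    GROUP BY member_id
    HAVING COUNT(*) > 1
"""
duplicates = pd.read_sql(query, conn)  # members with duplicate claim rows
print(f"{len(duplicates)} members with duplicate rows")
conn.close()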
Client: Beckman Coulter Diagnostics, Bangalore, India Aug 2017 - Nov 2018
Role: Programmer Analyst
Description: Beckman Coulter Diagnostics is a leading provider of clinical diagnostic solutions that enhance healthcare outcomes. Involved in data processing, analysis, and model development to enhance reporting, automation, and customer support functionalities.
Responsibilities:
• Worked with scalable distributed data processing, data management, and data visualization tools including Accumulo, Hadoop, Kafka, and various graph databases.
• Interacted with SMEs to analyze data extracts from legacy systems (mainframes and COBOL files) and determine each element's source, format, and integrity within the system.
• Proficiently used statistical programming software such as STATA, Python, SPSS, R, and SAS (STATA preferred) to create statistical, regression, and econometric models, and generated statistical models and advanced code for conducting student outcome analyses.
• Applied advanced classification algorithms such as Decision Tree, Random Forests, SVM, and Gradient Boosting to training data using the scikit-learn and NLTK packages in Python.
• Conducted feature engineering, model diagnosis, and validation, adjusting parameters via cross-validation and grid search in Python (sketched below).
• Created data cleanup routines using Google Apps Script and advanced Google Sheets formulas for use in Google Data Studio.
• Wrote custom Google Sheets utilities to send HTML-formatted email based on sheet data.
• Provided quality assurance of imported data, working with quality assurance analysts when necessary, and handled commissioning and decommissioning of datasets.
• Processed confidential data and information according to guidelines; helped develop reports and analyses; managed and designed the reporting environment, including data sources, security, and metadata.
• Built help-desk and messenger chatbots for clients using Google Dialogflow and AWS Lex.
• Used SQL queries to integrate data from several datasets into one database to access streaming information.
• Knowledge of best practices for dashboard design and a strong understanding of source data optimization for Google Looker processing.
• Created customer support chatbots using the Google Dialogflow Enterprise edition, with webhook fulfillment handled on Google Cloud Platform.
Environment: MS Excel, MS Access, Oracle 10g, UNIX, Windows XP, SQL, PL/SQL
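A minimal sketch of the cross-validated grid search described in this role, on a synthetic dataset; the estimator choice and parameter grid are illustrative assumptions.

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

# Synthetic stand-in for the diagnostics data.
X, y = make_classification(n_samples=500, n_features=15, random_state=1)

param_grid = {"n_estimators": [100, 300], "max_depth": [5, 10, None]}
search = GridSearchCV(
    RandomForestClassifier(random_state=1),
    param_grid,
    cv=5,           # 5-fold cross-validation
    scoring="f1",
)
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))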