Yeshwanth Kumar Mutcherla
Senior Data Analyst
South Lyon, MI
Summary:
• Yeshwanth Mutcherla is a Senior Data Analyst with expertise in using Tableau for interactive dashboard design and Python for scripting and data mining.
• Designed interactive dashboards in Tableau for the Finance team covering Revenue & Profitability, Expense Analysis, Cash Flow Analysis, Budget vs. Actuals, Financial Forecasts, and KPIs.
• Queried and processed large datasets from Snowflake to generate insights for customer-facing reports on underwriting, claims, and policy management.
• Utilized data mining techniques to extract and refine data from various sources, including data warehouses and BigQuery on GCP, to automate data processing for commercial insurance carriers and brokers.
• Automated repetitive tasks, such as data validation, policy comparisons, and claims processing, using Python scripts.
• Created and modified T-SQL queries for data mining to validate/support business decisions.
• Yeshwanth is local to South Lyon, MI and can begin a new remote role within 2 weeks.
Experience:
CONVR, INC
Chicago
Senior Data Analyst
Nov 2023 – Present
• Designed interactive dashboards in Tableau for the Finance team covering Revenue & Profitability, Expense Analysis, Cash Flow Analysis, Budget vs. Actuals, Financial Forecasts, and KPIs.
• Collaborated with stakeholders to understand business requirements and translate them into analytical solutions.
• Optimized Tableau extracts and live connections to Snowflake for faster dashboard performance.
• Implemented performance-tuning techniques such as indexing, partitioning, and query tuning.
• Queried and processed large datasets from Snowflake to generate insights for customer-facing reports on underwriting, claims, and policy management.
• Ensured data integrity, accuracy, and completeness across various sources, including policy records, claims data, customer interactions, and external market trends.
• Performed exploratory data analysis (EDA) to identify trends in claims frequency, loss ratios, and customer behavior.
• Utilized Python (Pandas, NumPy, Scikit-learn) for statistical analysis, anomaly detection, and predictive modeling.
• Automated repetitive tasks, such as data validation, policy comparisons, and claims processing, using Python scripts (see the sketch after this section).
• Optimized SQL queries in Snowflake for improved data retrieval performance on large datasets.
• Collaborated with IT and engineering teams to enhance data warehousing solutions for faster insights generation.
• Wrote efficient Python scripts for data cleaning, automation, and advanced analytics.
• Developed an rMDM (Reverse MDM) approach to retrieve data from the data lake and organize it into meaningful insights.
• Optimized queries and data models to improve performance and scalability.
• Ensured data integrity and accuracy by implementing best practices in data governance.
• Supported ad-hoc data requests and provided actionable recommendations based on data findings.
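A minimal sketch of the kind of Python validation automation described above; the file names, columns, and rules below are hypothetical placeholders, not the actual production logic.

import pandas as pd

# Hypothetical validation rules for a policy extract; the column names
# and thresholds below are illustrative placeholders.
REQUIRED_COLUMNS = ["policy_id", "effective_date", "expiration_date", "premium"]

def validate_policies(df: pd.DataFrame) -> pd.DataFrame:
    """Return the rows that fail basic integrity checks."""
    # Check that every required column is present before validating rows.
    missing = [c for c in REQUIRED_COLUMNS if c not in df.columns]
    if missing:
        raise ValueError(f"Extract is missing required columns: {missing}")

    issues = []

    # Flag duplicate policy IDs.
    issues.append(df[df.duplicated("policy_id", keep=False)].assign(issue="duplicate policy_id"))

    # Flag policies whose expiration precedes their effective date.
    eff = pd.to_datetime(df["effective_date"], errors="coerce")
    exp = pd.to_datetime(df["expiration_date"], errors="coerce")
    issues.append(df[exp < eff].assign(issue="expiration before effective date"))

    # Flag non-positive premiums.
    issues.append(df[df["premium"] <= 0].assign(issue="non-positive premium"))

    return pd.concat(issues, ignore_index=True)

if __name__ == "__main__":
    policies = pd.read_csv("policy_extract.csv")  # placeholder input file
    failures = validate_policies(policies)
    failures.to_csv("validation_failures.csv", index=False)
    print(f"{len(failures)} rows flagged for review")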
GROUNDSPEED ANALYTICS, INC
Michigan
Data Analyst
March 2018 – Nov 2023
• Used data mining techniques to extract and refine data from various sources, including data warehouses and BigQuery on GCP, to automate data processing for commercial insurance carriers and brokers.
• Developed clean datasets for the machine learning team to build and train new and existing data models.
• Wrote Python scripts to clean data and help the solutions engineering team clean up the Rowan extracts for clients.
• Queried data from AWS Redshift across different data sources to gather corpus datasets.
• Cleaned raw data using NumPy and pandas in Python (see the first sketch after this section).
• Developed logic to clean driver names in the Rowan extracts for clients.
• Created Prodigy tasks and used them for high-level annotations.
• Maintained and improved our data reporting pipeline, primarily using SQL.
• Coordinated and facilitated extensive user interviews and workshops to gather requirements, then analyzed and translated them into features and user stories.
• Maintained and improved current Tableau dashboards based on user requests.
• Coordinated and prioritized sprint deliverables in an Agile methodology; led daily backlog refinement and sprint retrospective discussions in a fast-paced environment.
• Migrated data from Google BigQuery to Amazon Redshift.
• Developed Logical and Physical data flow models for ETL applications.
• Configured and administered VersionOne as the team’s Agile management and reporting tool.
• Created ad-hoc analysis/reports when needed, often under strict deadlines.
• Gathered requirements and planned sprint meetings for Claim Labeler, Claim Stamper, and Shred Workflow.
• Created clear documentation for new processes/metrics/reports.
• Developed visual reports, dashboards, and KPI scorecards using Power BI Desktop.
• Developed the model used for claim-level data extraction.
• Performed word processing and data analysis using MS Office applications; created Excel macros and Pareto charts/graphs to analyze and arrange safety data reports.
• Updated Python scripts to match training data against our database in AWS CloudSearch so that each document could be assigned a response label for further classification.
• Used a JSON schema to define table and column mapping from S3 data (see the second sketch after this section).
• Performed Data Mapping of user interface data requirements to enterprise services.
• Collaborated with the QA team to ensure adequate testing of software both before and after completion, maintained quality procedures, and ensured that appropriate documentation is in place.
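A minimal sketch of the kind of pandas/NumPy name-cleaning logic used on client extracts; the column name, placeholder values, and rules are illustrative assumptions, not the actual Rowan logic.

import numpy as np
import pandas as pd

def clean_driver_names(df: pd.DataFrame, name_col: str = "driver_name") -> pd.DataFrame:
    """Normalize free-text driver names; column name and rules are assumptions."""
    out = df.copy()
    names = (
        out[name_col]
        .astype("string")
        .str.replace(r"[^A-Za-z\s'\-]", "", regex=True)  # drop stray punctuation/digits
        .str.replace(r"\s+", " ", regex=True)            # collapse runs of whitespace
        .str.strip()
        .str.title()
    )
    # Treat empty strings and common placeholders as missing values.
    out[name_col] = names.replace({"": np.nan, "Unknown": np.nan})
    return out

# Example: cleans " jOHN   smith!! " to "John Smith".
cleaned = clean_driver_names(pd.DataFrame({"driver_name": [" jOHN   smith!! ", ""]}))
print(cleaned)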
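A second sketch, of how such a JSON-to-table mapping can work: Redshift's COPY command accepts a JSONPaths file that pairs fields in S3 JSON objects with table columns. Every name below (bucket, table, columns, IAM role, connection details) is a hypothetical placeholder.

import json
import psycopg2  # assumes a Redshift-reachable host and the psycopg2 driver

# Hypothetical JSONPaths mapping: each entry pairs a JSON field in the
# S3 objects with a column of the target table, in column order.
jsonpaths = {
    "jsonpaths": [
        "$.claim.id",
        "$.claim.loss_date",
        "$.claim.paid_amount",
    ]
}
with open("claims_jsonpaths.json", "w") as f:
    json.dump(jsonpaths, f, indent=2)

# After uploading claims_jsonpaths.json to S3, a COPY statement like this
# loads the raw JSON into Redshift.
copy_sql = """
    COPY analytics.claims (claim_id, loss_date, paid_amount)
    FROM 's3://example-bucket/claims/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftCopyRole'
    FORMAT AS JSON 's3://example-bucket/claims_jsonpaths.json';
"""

conn = psycopg2.connect(host="example-cluster.redshift.amazonaws.com",
                        port=5439, dbname="dev", user="analyst", password="...")
with conn, conn.cursor() as cur:
    cur.execute(copy_sql)
conn.close()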
UNIVERSITY OF MICHIGAN-DEARBORN
Michigan
Graduate Student Assistant (GSA)
Jan 2016 – Dec 2017
• Reviewed, analyzed, and ensured the quality of data loaded into the database system by utilizing T-SQL and Excel.
• Created and modified T-SQL queries for data mining to validate/support business decisions (see the sketch after this section).
• Generated and ran SQL queries to further pinpoint clients’ data issues and worked with third-party data vendors to obtain corrected files.
• Provided quality support for client data integration.
• Monitored data integrity by performing quality assurance checks on client and partner data.
• Participated in the early stages of large project development to ensure consistent technical design approaches across projects and programs; worked closely with business partners to assemble and convert business requirements into complex T-SQL scripts and stored procedures.
• Identified and analyzed expensive T-SQL stored procedures using SQL Profiler, Query Analyzer, and the Index Tuning Wizard; ran custom SQL scripts and created the necessary indexes and query hints to improve T-SQL query performance.
• Configured SQL Server maintenance plans to automatically back up SQL Server databases, optimize indexes, and update statistics as needed.
• Performed data mapping of user interface data requirements to enterprise services.
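A minimal sketch of running T-SQL quality checks from Python via pyodbc, in the spirit of the validation work above; the tables, columns, and connection string are hypothetical placeholders, not the actual client schema.

import pyodbc  # assumes the "ODBC Driver 17 for SQL Server" is installed

# Hypothetical T-SQL quality checks; table and column names are placeholders.
CHECKS = {
    "null_client_ids": "SELECT COUNT(*) FROM dbo.Clients WHERE ClientID IS NULL;",
    "duplicate_emails": (
        "SELECT COUNT(*) FROM ("
        "  SELECT Email FROM dbo.Clients GROUP BY Email HAVING COUNT(*) > 1"
        ") AS dupes;"
    ),
    "orphaned_orders": (
        "SELECT COUNT(*) FROM dbo.Orders o "
        "LEFT JOIN dbo.Clients c ON o.ClientID = c.ClientID "
        "WHERE c.ClientID IS NULL;"
    ),
}

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=example-server;DATABASE=ClientData;Trusted_Connection=yes;"
)
cursor = conn.cursor()
for name, sql in CHECKS.items():
    count = cursor.execute(sql).fetchone()[0]
    # A non-zero count means the check failed and the rows need review.
    print(f"{name}: {count} offending rows")
conn.close()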
Education:
University of Michigan
Master of Science in Computer & Information Science, Dec 2017
Tools and Technologies:
Analysis & Querying Languages:
Oracle SQL, PL/SQL, Python (pandas, NumPy, and SciPy)
Data Visualization Tools:
Tableau 10.3, Data Studio, Power BI
Business Intelligence Platforms:
IBM InfoSphere DataStage 7.5, 8.1 (DataStage, QualityStage), Server, Parallel (EE)
Databases/Data Warehouses:
Snowflake, BigQuery, Redshift, MS SQL, MongoDB