Manmayee Thibirisetti • Data Analyst
Phone: +1-508-***-**** • Boston, MA • Email: ************@*****.***
https://www.linkedin.com/in/mayee-thib/
PROFESSIONAL SUMMARY
•6+ years of Data Analyst experience in data collection, cleaning, transformation, visualization, modeling, and interactive report generation across OLTP/OLAP environments.
•Proficient in SQL, PL/SQL, and Teradata SQL, with hands-on experience writing complex queries, stored procedures, triggers, functions, and packages for data analysis and reporting.
•Experienced with SSIS, Informatica, DataStage, and Azure Data Factory for building ETL pipelines to extract and load data from MySQL, Oracle, SQL Server, DB2, flat files, and cloud sources.
•Skilled in data modeling (conceptual, logical, physical), including dimensional modeling (star/snowflake schemas), and tools like ERwin, ER Studio, and Model Mart.
•Built interactive, dynamic dashboards using Tableau and Power BI, incorporating advanced features like DAX, LOD expressions, slicers, drill-downs, and cross-filtering.
•Created marketing, financial, and supply chain dashboards, improving campaign performance tracking, KPI visibility, and operational decision-making.
•Applied Python and R for data wrangling, predictive modeling (ARIMA, LSTM, Ridge Regression, Random Forest), and ETL automation, integrating with Apache Airflow and Jupyter Notebooks.
•Integrated structured/unstructured data using AWS S3, Azure Blob Storage, Redshift, Snowflake, and BigQuery, ensuring scalable analytics infrastructure.
•Hands-on with data profiling, data reconciliation, cleansing, and validation techniques to ensure high-quality reporting and analytics.
•Designed and optimized data pipelines, schema transformations, and performance tuning in Teradata, Snowflake, and SQL Server.
•Performed GAP analysis, AS-IS/TO-BE documentation, and transition strategy formulation for data migration and modernization projects.
•Adept at business process modeling and UML, using tools like Rational Rose to document use cases, user stories, workflows, and diagrams.
•Delivered projects using Agile/Scrum methodologies, tracking development through Jira, version control via Git/GitLab, and collaborating with cross-functional teams.
•Strong knowledge in data governance, metadata management, and MDM, with focus on compliance, security (row-level security in Power BI), and auditing.
•Worked with the MS Office Suite, Excel PivotTables, SAS Enterprise Guide, Flask, Kafka, Docker/Kubernetes, and other BI and analytics tools.
Certifications:
•Power BI Data Analytics for All Levels 2.0
•SQL - Advanced for Data Professionals
•Advanced SQL: MySQL Data Analysis & Business Intelligence
•Data Analysis: Beginner MySQL Business Intelligence
•Data Manipulation with Pandas
•Joining Data with Pandas and Python
EDUCATIONAL QUALIFICATIONS:
UNIVERSITY OF MASSACHUSETTS, BOSTON, MA / Master of Science in Business Analytics and Data Science (GPA: 3.69/4)
GVPD COLLEGE OF ENGINEERING, INDIA / Bachelor of Science in Fintech (GPA: 3.4/4)
Relevant Courses: Applied Statistics, Machine Learning for Finance, Python for Financial Applications, Financial Econometrics (Time Series), Computational Finance, Stochastic Calculus, Database Design, R for Finance, Risk Management
TECHNICAL SKILLS:
Primary Tools: Informatica Power Center 9.0.1/8.6/8.1, Ab Initio (Co>Op 3.0.3.9/2.15/2.14, GDE 3.0.4/1.15/1.14), Teradata SQL, Teradata Tools and Utilities, Oracle 10g/9i, MS SQL Server 6.5/7.0/2000
Languages: Teradata SQL, SQL, PL/SQL, C/C++, Python, R, Bash
Databases: Teradata 14/13/12/V2R6.2, Oracle 10g/9i, DB2/UDB, MS SQL Server, MS Access 2010/2007/2003, MySQL, PostgreSQL, MongoDB, Hive, Presto, AWS RDS, AWS Redshift, AWS Redis, BigQuery
Operating Systems: Windows 95/98/NT/2000/XP, UNIX, Linux, NCR MP-RAS UNIX
Data Modeling: ERwin, ER Studio
Tools: Tableau, Hadoop, Hive, Apache Airflow, Apache Spark, Flask, Apache Kafka, Jupyter Notebook, Excel, Jira, Git, Docker, Kubernetes
Data Warehousing: Informatica 9.1/8.6/7.1.2 (Repository Manager, Designer, Workflow Manager, and Workflow Monitor), SSIS, DataStage 8.x
Scheduling Tools: Control-M, Autosys
Tableau: Tableau Desktop 6/7/8/9, Tableau Server 5/6/7/8/9, Tableau Online, Tableau Public, Tableau Reader.
Servers: Windows Server 2003/2008/2012 R2, Microsoft SQL Server 2012/2008 R2/2008/2005.
PROFESSIONAL EXPERIENCE
Client: M&T Bank – Buffalo, NY (Aug 2024 – Present)
Role: Data Analyst
Projects:
1. Enterprise Financial KPI Automation & Insights
2. Credit & Marketing Intelligence Dashboard
3. Predictive Forecasting Model for Business Planning
4. Customer Behavior and Revenue Optimization Platform
Responsibilities:
•Collaborated with cross-functional Agile teams, including delivery managers, business analysts, Scrum Masters, and others, to gather user stories and deliver reports and dashboards aligned with business requirements.
•Analyzed large datasets using SQL, Python, and SAS to support data-driven decision-making, KPIs, and financial/business reporting.
•Designed and developed interactive dashboards in Power BI and Tableau using DAX measures, LOD expressions, advanced calculations, and visuals for marketing, sales, and strategy teams.
•Integrated and transformed data from multiple sources (Oracle, SQL Server, Excel, flat files, ERP) using SSIS, AWS Glue, Azure Data Factory, and Apache Airflow.
•Built and optimized ETL pipelines, Star/Snowflake schemas, and data models using ERwin, Snowflake, AWS S3, and SQL DDL scripting.
•Led efforts in data profiling, cleansing, validation, reconciliation, and migration from staging to integration environments across OLTP and ODS systems.
•Coordinated with DBAs, architects, and stakeholders to resolve middleware issues, improve system performance, and ensure data accuracy.
•Partnered with PMs, BAs, and Scrum Masters to write Jira user stories and document use cases and process flows.
•Developed and deployed time-series forecasting models (ARIMA, LSTM) in Python to predict financial KPIs, enabling more accurate budgeting and strategic planning.
•Automated anomaly detection for financial data streams using AI models in Python, reducing manual reconciliation efforts by 35%.
•Maintained version control for Power BI and Python ETL scripts using GitLab, while also documenting Airflow DAGs and architecture artifacts.
•Automated reporting workflows and ETL processes using SQL, Python, and Airflow, enabling timely insights and reduced manual intervention.
•Created and published dashboards with scheduled refreshes and security roles in Power BI Service, enabling near real-time data access for end-users.
•Modeled and analyzed data with tools like SAS Enterprise Guide, SAS/BI, Tableau Server, and Excel PivotTables for ad-hoc and operational reporting.
•Conducted functional and technical analysis to modernize legacy systems, enhance reporting architecture, and align with evolving business strategies.
•Developed and executed test cases in Agile environments for ETL validation, data accuracy, and performance across multiple source systems.
•Implemented ML workflows using Jupyter Notebooks and Apache Airflow, ensuring scheduled training, model versioning, and performance monitoring.
•Collaborated with data science teams to build NLP-based classification models, improving document tagging and reducing processing time in operational workflows.
•Supported and mentored junior analysts, enforced data governance practices, and ensured CMS compliance.
•Delivered source-to-target mappings, financial report automation, and predictive analytics using tools like Power BI, Python (ARIMA), SQL, and Snowflake.
Client: Pegasystems / Cox Automotive (Jan 2024 – July 2024)
Role: Data Analyst
Projects:
1. Plan-Procure-Develop Analytics and Insights
2. Supply Chain Risk Analysis Dashboard
3. Supply Chain Fulfillment Dashboard
4. Optimized Warehouse Estimator Dashboard
Responsibilities:
•Delivered business strategy recommendations and insights by automating reports and building interactive dashboards with Power BI, Tableau, and Excel, boosting operational efficiency by 30%.
•Extracted and transformed data from Oracle, flat files, and mainframes using PL/SQL, SQL, and ETL scripts; developed complex stored procedures and optimized queries for performance.
•Built and published Power BI dashboards with calculated columns, DAX measures, Power Query, and role-based security; automated refresh and scheduling via Power BI Service.
•Created ARIMA-based Python models for warehouse capacity forecasting and integrated them into Power BI for predictive analytics.
•Automated ETL pipelines using Python and Airflow, reducing report generation time by 40% and improving data availability for stakeholders.
•Developed and maintained technical design documents, wireframes, and POCs based on business and functional requirements gathered from TPMs and stakeholders.
•Designed data models using ERwin, including reverse engineering from databases and ODS systems; published models in Model Mart for reuse and collaboration.
•Migrated dashboards and applications to Azure, and managed data validation, UAT testing, deployment, and production support processes.
•Conducted data profiling, cleansing, and reconciliation across systems to ensure report accuracy; led data correction using utilities and validated data conversions.
•Developed KPIs, created TDE extracts, and generated ad-hoc visualizations in Tableau for legal, billing, and strategic planning use cases.
•Integrated structured and unstructured data sources into AWS S3, enabling centralized data storage and scalable analytics across supply chain dashboards.
•Developed custom Python scripts to transform and cleanse data from Oracle and flat files, improving data quality and reliability for dashboard reporting.
•Built and scheduled ETL pipelines using AWS Glue to transform raw data from Oracle and flat files into analytics-ready formats for Power BI and Tableau.
•Leveraged Amazon Redshift for high-volume data warehousing, enabling efficient query execution for supply chain risk analytics.
•Created and maintained source-to-target mappings, mapping documents, metadata models, and naming conventions for data warehouses and staging areas.
•Implemented Python-based anomaly detection routines to flag inventory discrepancies across multiple data sources, reducing manual checks by 30%.
•Integrated structured and unstructured data into the Snowflake database and conducted quality checks using SQL and validation scripts.
•Translated SAS scripts into Snowflake SQL, migrated data from Teradata, and executed complex queries involving joins and subqueries against remote databases.
•Designed and implemented Power BI reports with dynamic filters, subscriptions, and mobile app publishing; managed stakeholder access and collaboration via Power BI workspace apps.
•Investigated root causes of data discrepancies across systems, conducted downstream analysis, and provided resolution strategies based on business rules and models.
•Coordinated with DevOps teams to containerize ETL workflows using Docker and deployed them on AWS ECS, enhancing pipeline reliability and versioning.
•Created an SAP end-user training plan covering SAP basic navigation through in-depth analysis of Asset Accounting/Financial Accounting, Consolidations, GL financials, PS, and logistics processes; assessed the training community, locations, and preliminary risks/issues.
•Built and optimized ETL pipelines to extract structured data from S/4HANA and transform it into analytics-ready datasets, and ensured data quality and reconciliation for critical S/4HANA modules across OLAP and OLTP systems.
Client: FactSet – India (Aug 2019 – July 2022)
Role: Data Analyst
Projects:
1. Consumer Investment Trend Dashboard
2. Portfolio Health Monitoring and KPI Reporting
3. Fund Performance Analytics Suite
4. Real-Time Financial Market Insights Platform
Responsibilities:
•Analyzed data and developed a set of Power BI dashboards for the Consumer Marketing team, which executives used to make valuable business decisions.
•Developed and maintained Power BI dashboards and scorecards for KPIs, MoM, YoY, and QoQ trends, and financial insights using DAX, R Script, and SQL.
•Utilized Python (pandas, numpy, matplotlib) for exploratory data analysis (EDA) and preprocessing of financial data prior to visualization in Power BI.
•Integrated R and Python scripts within Power BI dashboards, providing enhanced analytics such as YoY and MoM trend detection.
•Developed and tested Python modules to calculate financial ratios, moving averages, and volatility metrics for equity and bond portfolios.
•Collaborated with data engineering teams to implement Python-based streaming data ingestion workflows using Azure Stream Analytics for real-time reporting.
•Designed complex SQL queries, stored procedures, CTEs, views, and PL/SQL triggers to support advanced data modeling and reporting needs.
•Gathered and analyzed business requirements from stakeholders, SMEs, and BAs to design robust BI solutions aligned with SDLC stages.
•Integrated and transformed data from OLTP systems using SSIS, Azure Data Factory, and Azure Stream Analytics for real-time and batch analytics.
•Published and maintained Power BI reports on the server, implemented row-level security, automated dataset refresh, and managed role-based access.
•Created Power BI data models, established source connections (including Azure Blob Storage), and ensured data consistency during migrations.
•Authored technical design documents, unit test cases, and user guidance documentation; participated in Agile stand-ups for ongoing sprint enhancements.
•Conducted ETL optimization, real-time data streaming, and performance monitoring, ensuring accurate and timely reporting.
•Resolved data duplication and integrity issues through data profiling, cleansing, reconciliation reports, and inter-department coordination.
•Implemented Tableau dashboards, managed users and permissions on Tableau Server, and integrated dashboards for centralized portal access.
•Used ERwin for forward/reverse engineering, conformed dimension modeling, and legacy-to-modern system transitions via structured data architecture.
•Delivered dynamic, interactive visualizations with Power BI features like calculated tables, calculated columns, and Power Query pivot/unpivot.
•Created and validated data migration strategies, reconciliation reports, and data dictionary tracking for transformation projects.
•Collaborated with project managers using Work Breakdown Structures, planned activities, and executed UAT testing across data migration cycles.
•Mentored peers in PL/SQL, conducted KT sessions on finance/banking data domains, and ensured team alignment with documentation and QA standards.
Client: Lextron IT – India (June 2018 – July 2019)
Role: Jr. Data Analyst
Projects:
1. Retail Client Insights & Mutual Fund Analytics
2. Sales and Service Data Integration Framework
3. Credit Risk Dashboard for Loan Performance
4. Customer Relationship & Financial Health Tracker
Responsibilities:
•Managed client data using Salesforce, created personalized financial plans, and conducted mutual fund performance tracking and financial research using Excel and Bloomberg tools.
•Defined business transformation rules, data mappings, and source-to-target interfaces for sales and service data across ODS, OLTP, and OLAP systems.
•Developed and maintained Power BI dashboards using normalized/de-normalized data, DAX, calculated columns, and configured row-level security to protect sensitive information.
•Gathered business requirements and authored BRDs, leading to a 40% faster Power BI development cycle; performed gap analysis and supported legacy system migrations.
•Created and optimized SQL and PL/SQL scripts for data extraction, transformation, and database indexing; conducted ETL using Base SAS and custom scripts.
•Performed data profiling, cleansing, and validation using Talend, Informatica, and SQL, ensuring data accuracy, quality, and integrity.
•Participated in metadata management, data governance programs, and data quality initiatives including compliance with internal controls and audit requirements.
•Built source-to-target mapping documents, transformation rules, and data dictionaries; used ERwin for logical/physical modeling and reverse engineering.
•Migrated data from Oracle to Teradata, designed reconciliation reports, reviewed migration scripts, and supported batch job scheduling.
•Delivered training, documentation, and compliance reviews; supported test case execution, QA validation, and smooth transition to production environments.