SATWINDER BHASIN
AGILE BUSINESS ANALYST
RELEVANT SKILLS
Senior Agile Business Analyst with 11+ years of experience in business, data, and systems analysis, including functional/system testing and UAT testing
Excellent knowledge of Business Intelligence principles and reporting, data mapping, data lineage, data integration, data migration & data quality.
Excellent experience in working with Data Warehouses, data governance, data models, MDM, and data quality
Experience with SDLC processes, System Design, and documentation, writing Test Plans, Test Cases, and documenting test results
Familiar with Fannie Mae’s Multifamily Selling and Servicing guide
Experience with HP ALM for Requirements, Test Plan creation, Test Execution and Defect management
Excellent experience in SQL, MS Excel, and Cognos 8.3 ReportNet Authoring tool
Excellent knowledge of Primary and Secondary Mortgage business. Familiar with Multifamily and Single-Family loan level data
Working experience with multi-dimensional data and supporting the reporting requirements (using the data marts) for different departments
Familiar with the bond trade order, clearinghouse, and settlement processes.
Familiar with Power BI, Tableau, and the DataStage and Informatica ETL tools
Supported the Marketing Research team with data mining, data requirements, and marketing reports
Experience using Microsoft Copilot for code completion, debugging pre-written SQL code, and finding solutions for business process improvements based on internal business processes. Also well versed in prompt engineering (OpenAI ChatGPT, Claude, Gemini, Perplexity) to get more effective solutions from AI.
Experience with SAFe – ART (Agile Release Train) methodology
Experience with Agile and Waterfall SDLC methodologies
Experience in Quality Assurance and SOX auditing
Supported the Compliance team’s reporting efforts for FINRA regulations
Familiar with general and FASB accounting principles
RECENT TOOLS
Tools/Technologies: Power BI Reports 2.114, Dataiku DSS (AI) tool v9.0.4, Azure Cloud, ADLS Gen2 Storage, AWS RDS, AWS S3, Salesforce (CRM), SAP R/3 4.6c, RHEL 7.9, UNIX (HP-UX 10x, Solaris 2.x), SAP FICO, Windows 2000, 2003 & 2007 Enterprise Edition, DB2, Oracle 12c, Sybase Adaptive Server Enterprise 12.5.1, Rapid SQL v7.4, SAS 9.0, Java 1.5, VBA, ClearQuest, ClearCase, DOORS, Jira, Jenkins, JMS messaging (Tibco GEMS 5.1), Pivotal Tracker, Remedy, Snag-It, SharePoint, MS Visio, MS Access, MS Office Suite, SQL (Toad 9.5), Minitab, XML, BOXI, Cognos 8.3 ReportNet, ALM Defect Management, Technical Writing, Creating User Manuals, How-to Guides, Training Users on New Systems.
RECENT EXPERIENCE
CLIENT: FANNIE MAE, VA (REMOTE) NOV 2023 – APR 2024
Project: MF Mortgage Data and Reporting – Business / Data Analyst
Participated in the sprint planning meetings to understand the scope of the user stories to be developed in the sprint.
Supported the reporting team for their FHFA reporting requirements.
Participated in stakeholder meetings to understand the user story implementations.
Set up Scrum board in Jira and Confluence for the project.
Created Epics, Features, User Stories, and tasks in Jira.
Created use case scenarios for the client’s demo.
Created SQL queries to perform data analysis.
Conducted Business Requirements elicitation sessions.
Used MS Copilot for code debugging.
Used MS Copilot for business process improvement based on the proprietary AI tool.
Participated in daily Scrum meetings.
Created acceptance criteria for the tester as required by the user story.
Documented the user stories in Jira.
Helped in creating tailored workflows to suit specific project needs, ensuring flexibility in task management.
Used Confluence to store all project-related documents in one place for easy access, collaboration, and version control.
Assigned tasks in Jira with due dates and tags, keeping teams aligned on deliverables.
Produced flowcharts and process diagrams using Visio.
Used AWS RDS and AWS S3 for the data analysis.
Used the AWS Glue Data Catalog to manage metadata for data discovery and preparation, supporting data processing and querying.
Created Requirement Traceability Matrix, Design, Architecture, Requirements and Testing phase documents for the project.
Manually tested different application functionalities: Deal Registration, Commitments, Collateral, Participants, Hedges, Loans, Bonds, Financing Options, and Change Requests.
Created STTM documents for the redesigned database tables.
Performed data analysis before and after the implementation to find out any anomalies.
PCI (REMOTE- VA) JUN 2022 – MAR 2023
Project: NFP Fidelity/Client Risk Profile/ MIECHV Reports (Power BI – Dashboard)
Product Owner / Business Analyst
As a Product Owner/Business Analyst and member of the Agile development team, was responsible for ensuring that the product being developed met user needs and provided value to the business.
Created and communicated a clear product vision to align the development team and stakeholders with business goals.
Defined and maintained the product backlog, ensuring it is prioritized according to business value and customer needs.
Developed user stories and acceptance criteria to guide the development team on what to build.
Prioritized the backlog items, balancing scope, budget, and time, and made trade-offs to optimize product value.
Helped the team with the Product roadmap, Product Backlog, Requirements Gathering, Prioritization, Sprint Planning, Acceptance Testing, Stakeholder Communication and Continuous Improvement.
Provided technical solutions for improving the Power BI reports.
Worked with the stakeholders to elicit and analyze requirements and convert them into user stories.
Built the Fidelity Reporting to help HR measure the effectiveness of the program and optimize the performance of the nurses participating in it.
-The performance metric was based on multiple factors such as experience, education, skill level, training and development, engagement score, meeting attendance, and other demographic variables. Nurses from all the agencies were compared at the agency, state, and national levels.
-Various beneficiary survey data points were taken into consideration when generating these reports.
-Performed data analysis by creating SQL queries and ad hoc reports.
-The report provided the fidelity of the program along with understaffing/overstaffing and the skill strengths and weaknesses of the resources deployed by HR to meet the specified goals.
-Client Risk Profile: this report helped HR meet staffing needs by profiling beneficiaries' risk factors based on specific metrics.
-MIECHV: a report based on a longitudinal study in which the effectiveness of the resources (nurses) was analyzed over a long period for specific beneficiaries.
Explained the user stories to the developers and QA resources.
Managed Azure DevOps for Epics, Features, User Stories, the product backlog, boards, and sprints.
Ran the daily standups, sprint planning, and sprint retrospectives.
Used HP-ALM for the defects management.
Ran the QA status meetings
Managed the product development team (Devs, testers and Cloud team (outsourced)).
Analyzed the data for generating the Power BI dashboard/Reports.
Helped the team resolve data issues and clarify user stories using business process flows and similar aids.
ADS, (REMOTE - VA) JUN 2021 – APR 2022
Project: MAT (Modern Analytics Tool – Dataiku DSS, Enterprise AI)
Agile Business Analyst
As part of the Operations Research and Analytics team, completed Dataiku Discovery, Designer, and Deployer training.
Deployed projects from Dataiku DSS node to the Automation node in Azure cloud environment.
Worked with Data Architecture team to define the data ingestion by the Dataiku platform for the Machine learning.
Helped the end users with the data access and the data clean up using the Dataiku built-in tools.
Helped the end users with resolving any data anomalies or missing data scenarios.
Helped the UI/UX team with wireframing inputs.
Performed data analysis and data clean up by using the Dataiku platform.
Used Python plugins for creating new projects.
Created test cases for UAT testing of Dataiku DSS, and Automation nodes.
Designed different recipes to meet the business requirements.
On-boarded 220 new users to the Dataiku DSS tool successfully and provided the Prod support.
Participated and conducted daily Scrum meetings and conducted demos as part of sprint review.
Also used Databricks and Yellowbrick as data sources.
Mapped ADS data mart elements to the FISERV data mart.
Performed FISERV-to-Yellowbrick data mapping.
Conducted sessions with FISERV to understand and accurately complete the data mapping to the base tables.
Used Azure Blob Storage (ADLS Gen2), Network File System (NFS), Hadoop Distributed File System (HDFS), FTP/SFTP, and SSH.
CGI FEDERAL INC, GINNIE MAE, VA / DC AUG 2020 – FEB 2021
Project: DAMS (Digital Asset Management Solution)
Agile Business Analyst
Set up Kanban/Scrum boards in Jira and Confluence for the project.
Created Epics, Features, User Stories, and tasks in Jira.
Created use case scenarios for the client’s demo.
Conducted Business Requirements elicitation sessions.
Created User stories based on the business requirements.
Performed data analysis of the existing scanned documents data by using SQL.
Created indexes for faster search by adding the metadata related to the documents.
Provided information to the data architecture team about data ingestion and end users' data access.
Participated in meetings with client and created meeting minutes.
Participated in product backlog prioritization meetings with product owner.
Helped in Content Modeling for Alfresco as Digital Asset Management solution.
Worked in DevOps and AWS (RDS, S3) cloud environments.
Acquired knowledge about client’s mortgage-backed securities (MBS).
EXELON (AS A CGI CONTRACTOR) APR 2017 – JUN 2020
Project: PCAD Application- System Analysis and Testing
Lead Quality Analyst (DE/ VA)
Led the PRAGMACAD application testing team at Exelon.
Defined the Test plan along with the Test Cases for Release 1A.
Supported the defect management process, wherein defects from system and integration testing were reviewed and resolved.
Conducted meetings with developers/testers and client management teams responsible for managing the defects and reporting to the upper management.
Participated in the sprint planning meetings to understand the scope of the user stories to be tested in the sprint.
Mapped and tested Config data as required by the application.
Created SOP document for the operations team.
Created test data for testing.
Used SQL for data analysis.
Produced flowcharts and process diagrams using Visio.
Created Requirement Traceability Matrix, Design, Architecture, Requirements and Testing phase documents for the project.
Participated in DAT meetings to understand the user stories implementation.
Participated in daily Scrum meetings.
Created test plan and test data as required by the user story.
Manually tested different application functionalities
Created Test sets in HP ALM.
Tested XML payloads to be consumed by the users of the application
CLIENT: FANNIE MAE, HERNDON, VA (AS A CGI CONTRACTOR)
Project: CnD Application – Business process analysis, System Analysis, Data Analysis and Testing
Participated in the sprint planning meetings to understand the scope of the user stories to be tested in the sprint
Participated in DAT meetings to understand the user stories implementation.
Participated in daily Scrum meetings.
Created test data as required by the user story
Performed data analysis by using SQL queries.
Manually tested different application functionalities: Mortgage Deal Registration, Commitments, Collateral, Participants, Hedges, Loans, Bonds, Financing Options, and Change Requests
Created Test Cases and captured the test results evidence
Tested XML payloads to be consumed by the users of the application
Tested Green Financing / ACheck micro services
Performed Regression Testing for the Security Matrix
Performed data analysis before and after the implementation to find out any anomalies.
Project: WebLogic to JBoss Conversion (eSSO to Ping & eServicing Apps)
Set up Jira and Confluence for the project and the project documents.
Created high-level tasks in Jira to accomplish the conversion: SVN branch and Icart setup, and local/development/test/acceptance environment setup.
Created Change Impact Analysis (CIA) document
Created project scope documents
Updated vision / System Context documents
Coordinated with the Developers and testers for the daily meetings
Used the Kanban methodology
Project: CLASS Program
Analyzed unstructured data in the loan documents to create the AI solution for the Selling & Servicing Guide.
Found all the metadata in the raw text, values, and images of differently formatted mortgage documents (TXT, Word, PDF, Excel).
Provided this metadata to the OCR software; the generated metadata values fed machine learning models, to be used eventually as an AI tool for reading raw data from different sources and combining it for comprehensive analysis, saving both time and resources.
Analyzed the Multifamily Mortgage Selling and Servicing Guide.
Analyzed DSCR (Debt Service Coverage Ratio) and NCF (Net Cash Flow) and compared them to the MF standard.
Researched and analyzed different properties and the NCF applicable to them.
Project: Bond Admin - MFCS SLA – Lead Business Analyst
Wrote User Stories for the SLA between Mortgage Bond Admin and BPO
Created Acknowledgement contract between FNM and BPO
Developed rules for SLA implementation
Documented Bond Admin workflows.
Supported the Business and Development team to understand the overall system capabilities.
Participated in Solution Design meetings and assisted the Solution Architect in developing solution artifacts.
Helped the team create the data model.
Analyzed the inbound and outbound data payloads.
Performed data gap analysis using SQL queries.
Created data transmission contracts with the BPO.
Participated in Agile Scrum meetings and created user stories for data requirements.
Performed data analysis and created data mapping and a data dictionary.
Created dashboard wireframes in collaboration with the client.
ACUMEN SOLUTIONS, MCLEAN, VA OCT 2015 – MAR 2016
Client: Capital One
Sr. Business Analyst
Small Business Banking CRM: replaced Siebel CRM with Salesforce CRM, working for the client on migrating the legacy CRM to the Salesforce Cloud.
Wrote User Stories for the Sales, Services and Marketing departments.
Performed customer marketing survey data analysis to map the customer needs to the products being offered by Capital One Bank.
Analyzed existing customer profile data to find opportunities for marketing new products to those customers.
Participated in Solution Design meetings and assisted Solution Architect to develop solution artifacts by providing data characteristics, workflows, process approvals, specifications, and context diagrams.
Created data mapping, data lineage, data integration and data quality analysis documents.
Analyzed data integration, data migration, and data governance strategies.
Analyzed data model to be used by multiple business lines.
Performed gap analysis, removed redundant business processes, and improved existing business processes.
Based on the data requirements, created future-state data flow diagrams that helped the client understand and compare the existing data flow to the future state; this also helped in performing the gap analysis.
Used HP ALM to manage the test plan, test creation and defects.
Created the future business processes diagram and compared with the existing business processes.
Conducted requirements gathering workshops with the stakeholders.
Configured Salesforce fields for Lead, Sales and Support management.
Followed an Agile methodology/framework.
MUNICIPAL SECURITIES RULEMAKING BOARD (MSRB), ALEXANDRIA, VA APR 2015 – AUG 2015
Business Analyst
Ensured the EMMA (Electronic Municipal Market Access) data was consistent and available across all the databases located in different areas. EMMA is the official repository for information on virtually all municipal bonds.
Identified data anomalies by running the canned reports, fixing the incorrect data after analysis, and reviewing the data in the source files.
Documented & updated Data Dictionary to remove any data integrity or data referential issues.
Used SQL queries to perform data analysis.
Analyzed /corrected errors in the daily reconciliation reports.
Created check scripts to find the data discrepancies between different databases.
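A minimal sketch of the kind of cross-database check script described above, using in-memory SQLite stand-ins for the two databases; the table, column, and function names are illustrative, not the actual MSRB schema.

```python
import sqlite3

def row_fingerprints(conn, table, key):
    # Build {key: full_row} for every row, so two databases can be compared cheaply.
    cur = conn.execute(f"SELECT * FROM {table} ORDER BY {key}")
    cols = [d[0] for d in cur.description]
    key_idx = cols.index(key)
    return {row[key_idx]: row for row in cur.fetchall()}

def compare_tables(conn_a, conn_b, table, key):
    # Report keys missing on either side and rows whose values differ.
    a = row_fingerprints(conn_a, table, key)
    b = row_fingerprints(conn_b, table, key)
    missing_in_b = sorted(set(a) - set(b))
    missing_in_a = sorted(set(b) - set(a))
    mismatched = sorted(k for k in set(a) & set(b) if a[k] != b[k])
    return missing_in_b, missing_in_a, mismatched

if __name__ == "__main__":
    # Two stand-in databases with a deliberate discrepancy on id 2 and a row missing from db2.
    db1, db2 = sqlite3.connect(":memory:"), sqlite3.connect(":memory:")
    for db in (db1, db2):
        db.execute("CREATE TABLE trades (id INTEGER PRIMARY KEY, amount REAL)")
    db1.executemany("INSERT INTO trades VALUES (?, ?)", [(1, 10.0), (2, 20.0), (3, 30.0)])
    db2.executemany("INSERT INTO trades VALUES (?, ?)", [(1, 10.0), (2, 25.0)])
    print(compare_tables(db1, db2, "trades", "id"))  # ([3], [], [2])
```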
Helped the compliance team with their regulatory reporting efforts.
Created weekly/monthly reports for the stakeholders.
Followed an Agile methodology/framework.
COLLEGE BOARD, RESTON, VA JUN 2014 – OCT 2014
Information Analyst
SAT Exam Registration and Enrollment. Captured School/Student level data for SAT exam registration/enrollment.
Created Functional Requirements Document for the Data collection for the AP, SAT, PSAT, and ReadyStep Programs.
Created new data mapping, data lineage, & data integration analysis, for the specific schema.
Created ad hoc reports in Excel for the users.
Wrote SQL queries for data analysis.
Performed UAT testing, created Test Plan, Test Cases and Test Results.
Used HP-ALM for managing the test suite and defects.
Provided Data Mapping to the ETL team to transfer the data from staging to enterprise databases.
Collaborated with Business Users, Data Architects, Data Modelers, MDM project Team, Solution Architects, Java Developers and QA teams to finalize the requirements.
FANNIE MAE, RESTON, VA JUN 2013 – JAN 2014
Business Analyst
HAMP (Treasury Program) accounting reports validation: ensured the benefits provided to homeowners, investors, and lenders under the HAMP program were processed and recorded accurately.
Created Business Requirements Document (Reverse Engineered) by converting the SQL, PL/SQL codes into the requirements document for the HAMP Accounting Validation reports.
Created new requirements document for the ad hoc reports.
Wrote SQL queries for the ad hoc reports.
Used DOORS to manage the new requirements and to review the existing reports specification.
Used UNIX commands to interact with the database and process the files to tables.
Produced the data analysis output in Excel and used macros to format the reports.
Tested the SQL codes for the updated ad hoc reports.
Performed data analysis to support the accounting team with its validation process.
E*TRADE, ARLINGTON, VA OCT 2011 – APR 2013
Business/ Data Quality Analyst
Business Intelligence systems, enterprise data warehouse, data warehousing, data quality, data analysis, and data requirements gathering and support. As a Data Warehouse/Business Intelligence Business Analyst, worked with the Data Warehouse team and other departments to design quality, customized Business Intelligence solutions utilizing established methodologies and enterprise BI tools.
Wrote Business/Data/Reporting requirements for Sales, Compliance (FINRA regulations) and HR teams.
Created SQL queries to analyze the data and to fix data quality issues.
Reviewed the Brokerage business models.
Performed analysis of Customer profile data to find the opportunities to advise them about their finances.
Performed customer survey data analysis to find demand for new services to be added to the portfolio.
Worked with multi-dimensional data models (Star and Snowflake schemas).
Used Linux commands to interact with the DB2 database and process the files to tables.
Worked with DBA to maintain the DB2 Catalog using Catalog Manager for DB2.
Connected DB2 database using Linux CLP.
Performed data mapping, data integration, data lineage analysis and data quality checks.
Loaded data in temp tables and created indexes, added users, and granted permissions at the table level.
Coordinated with offshore development team.
Created the Software Function Requirements Document.
Responsible for UAT testing and used HP ALM for managing the testing and defects.
FANNIE MAE, WASHINGTON, DC MAR 2010 – AUG 2011
Business Analyst /Technical Writer
System Testing/ Project Quality. Business Analysis and Decision Group.
Assisted Data Modelers for the provisioning of data to run their simulations and analysis. Worked on the Pricing Engine/CreditWorks/Credit Portfolio Analysis System Applications and gathered Business Requirements and performed UAT. Worked on the Home Pricing Forecast Application and provided System Testing. Worked in Multifamily and assisted PMO for SDLC documentation.
Worked with the Multifamily development team to implement the business requirements for application upgrades.
Tracked and analyzed enhancements and defects using ClearQuest.
Implemented enhancements and defect fixes in the Pricing Engine, CreditWorks and Credit Portfolio Analysis system.
Worked with the financial modelers to capture their data requirements for running the data models.
Worked with the development team to create, interpret, and implement business requirements into technical specifications.
Coordinated with the data teams to get the daily econometrics for the financial modelers.
Created SQL queries using Toad for data analysis and ad-hoc reporting.
Interacted with the business users, system analysts and development team to track and eliminate bottlenecks.
Effectively communicated and accurately managed technical expectations.
Streamlined development and implementation life cycles.
Performed data mapping, data integration, data lineage analysis and data quality checks.
Assisted with the business process, data analysis and UAT for business models.
Analyzed defects and created meaningful reports for the development team.
Responsible for technical/business writing and project coordination.
Produced flowcharts and process diagrams using Visio.
Performed qualitative/quantitative and statistical analysis.
Created Requirement Traceability Matrix, Design, Architecture, Requirements and Testing phase documents for the Multifamily development team.
Performed System Testing for HPFA Automation application.
Used UNIX commands to interact with the HPFA application to complete the system testing.
Completed in house Agile Methodology software development training.
DELOITTE CONSULTING, WASHINGTON, DC MAY 2008 – FEB 2010
Senior Data/Reporting Analyst
Technical Requirements Review (IRS) and initial data analysis. Facilitated the review meetings among different groups.
Maintained and tracked all the Action Items in database.
Evaluated and analyzed the Technical Requirements to eliminate duplicate, overlapping, and out-of-scope items.
Coordinated with client to get feedback on the open Action Items.
Created different documents related to the project and generated reports in MS Excel.
Performed business analysis on the Web Data Collection System (MSDE), gathered requirements and held JAD sessions with stakeholders.
Created Use Cases.
Performed requirements gathering for Statistical Process Control (SPC) and Analysis.
Documented Business Requirements using Word/HP Mercury QC.
Gathered requirements for project deliverables including technical documents, Functional Requirements Document, Test Analysis Report & User Guides/Manuals.
Developed Use Cases using MS Visio and Word.
Used VBA to format the data in MS Excel.
Responsible for coordination between stakeholders and the development team.
Assessed Data Quality and worked as a part of team to analyze the data dictionary.
Converted business requirements to data requirements.
Used TOAD tool to create SQL queries for data analysis and data manipulation.
Connected Oracle database using ODBC connections.
Generated and manually edited tnsnames.ora files using Net Configuration Assistant.
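For reference, a tnsnames.ora entry of the kind generated and hand-edited above typically looks like the following; the alias, host, port, and service name here are placeholders, not actual client values.

```
MYDB =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = db.example.com)(PORT = 1521))
    (CONNECT_DATA =
      (SERVER = DEDICATED)
      (SERVICE_NAME = mydb.example.com)
    )
  )
```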
Worked with different data sources like Access, Excel, and data in text.
Documented Process and Product Quality Assurance (PPQA) in ClearQuest.
Developed data mapping and analyzed the data for data integration.
Created Cognos cubes using Transformer to support reporting requirements.
Developed necessary reports using Cognos 8 BI Reports.
Performed report testing using Cognos ReportNet.
Helped the QA team with functional/application testing.
Held periodic meetings with the stakeholders to update on the project status.
Responsible for process quality assurance and defect tracking in HP Mercury QC.
Responsible for project management documentation and generating the invoices upon the deliverables.
EDUCATION & CERTIFICATIONS
BS, Physics and Math, Delhi University, Delhi, India
LL.B., Delhi University, Delhi, India