VARALAKSHMI ADIPUDI
Mobile : 703-***-****; Email : *********@*****.***
PROFILE SUMMARY:
Over 15 years of IT experience specializing in Business, Systems, and Data Analysis.
Comprehensive knowledge of Software Development Life Cycle, Waterfall and Agile Scrum methodologies, and COTS implementation.
Extensive expertise in gathering business and functional user requirements, creating use cases, designing diagrams (Activity, Class, Sequence), and developing Business Requirements Documents (BRD) and Data Definition Requirements (DDR).
Proficient in data management, data collection, and identifying functional data requirements with in-depth understanding of data models.
Skilled in RDBMS, including SQL scripting for data analysis and generating reports.
Familiar with Data Visualization tools like Microsoft Power BI and Tableau.
Strong process analysis experience, including conducting Gap Analysis, documenting AS-IS/TO-BE processes, and creating process flows and models using Visio.
Experienced in facilitating Joint Application Development (JAD) sessions with end users, expert teams, QA and UAT teams, and stakeholders for project-related discussions.
Developed and executed test scenarios and use cases to support development teams and business groups.
Reviewed test scenarios and cases for UAT and regression testing, working alongside QA and business groups to ensure pre- and post-implementation application quality and to resolve production issues.
Proficient in analyzing test results, presenting application quality levels, and offering suggestions to improve performance and eliminate bottlenecks.
Highly organized, goal-oriented, and self-driven professional with the ability to quickly master new technologies and manage multiple tasks from initiation to completion with minimal supervision.
Demonstrated expertise in data classification, company data retention policies, and access control frameworks.
A motivated and resourceful team player with a positive approach to problem-solving.
Excellent communication skills, ensuring effective facilitation of requirements and collaboration between clients and project teams.
Proven ability to collaborate effectively within cross-functional teams across organizations.
Mentored business analysts, helping them onboard and excel at project requirements.
TECHNICAL SKILLS
Databases: Oracle 9i/10g/11g/16/19.22, SQL Server, MS Access, Snowflake
Version Control: Rational Clear Case, CVS
Defect Tracking Tools: PVCS Tracker, Rational Clear Quest, JIRA, Quality Center 10, Rally
Others: TOAD, Microsoft 365, Visio, DOORS, SharePoint, Confluence, Collibra, Salesforce, Docway, AWS Cloud, Python
Methodologies: SDLC, Waterfall, AGILE SCRUM
Data Visualization Tools: Microsoft Power BI, Tableau
EDUCATION:
Master's in Economics
Certifications:
Databricks Generative AI Foundational Certified
Oracle Cloud Infrastructure 2025 Certified AI Foundations Associate
AI Security & Governance Certified
PROFESSIONAL EXPERIENCE
V3 Tech Solutions Inc., July 2025 to Present
Client: Santander Bank
Working as a Senior Data Architect III supporting Santander Bank’s Chief of Staff Team for the AML OneFCC Customer Batch Screening initiative. Responsible for gathering requirements, validating upstream data sources, and designing the data architecture to consolidate customer information from multiple systems into the OneFCC Data Warehouse.
Key Responsibilities
Gathered business requirements for the AML OneFCC Customer Batch Screening project and translated them into technical specifications.
Analyzed customer data from multiple upstream systems to support screening, validation, and consolidation into OneFCC warehouse tables.
Built detailed data lineage documentation by identifying upstream mappings, validating data flows, and ensuring traceability across systems.
Created data scenarios to simulate end-to-end data movement and ensure alignment with AML screening requirements.
Developed Data Definition Requirements (DDR) for multiple data domains and collaborated with modelers to support logical and physical data modeling.
Designed and delivered Source-to-Target Mapping (STM) documents to guide ETL/ELT development.
Supported developers by clarifying data rules, transformation logic, and integration scenarios.
Designed and developed Power BI dashboards to monitor customer batch screening volumes, data quality exceptions, and upstream source discrepancies.
Built semantic data models in Power BI to support AML operational reporting and leadership insights.
Created DAX measures for screening metrics, exception counts, SLA tracking, and data validation KPIs.
Integrated Power BI with Snowflake to enable real-time analytics on customer screening workflows.
Implemented Row-Level Security (RLS) to ensure restricted access to sensitive AML data.
Partnered with AML business users to define reporting requirements and deliver self-service analytics.
Optimized Power BI datasets, queries, and refresh schedules to support high-volume screening data.
Environment: Snowflake, Power BI, SQL
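The screening metrics above (exception counts, missing-status checks) reduce to aggregate SQL over the warehouse tables. The sketch below uses Python's sqlite3 as a lightweight stand-in for Snowflake; all table and column names are hypothetical, chosen only to illustrate the shape of the check.

```python
import sqlite3

# Hypothetical stand-in for the Snowflake screening tables.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE customer_screening"
            " (customer_id TEXT, source_system TEXT, screen_status TEXT)")
cur.executemany(
    "INSERT INTO customer_screening VALUES (?, ?, ?)",
    [("C1", "CORE", "CLEAR"), ("C2", "CORE", "HIT"),
     ("C3", "CARDS", "HIT"), ("C4", "CARDS", None)],
)

# Exception counts per upstream source: screening hits and missing statuses,
# the kind of metric later surfaced as DAX measures on a Power BI dashboard.
cur.execute("""
    SELECT source_system,
           SUM(CASE WHEN screen_status = 'HIT' THEN 1 ELSE 0 END) AS hits,
           SUM(CASE WHEN screen_status IS NULL THEN 1 ELSE 0 END) AS missing
    FROM customer_screening
    GROUP BY source_system
    ORDER BY source_system
""")
print(cur.fetchall())  # [('CARDS', 1, 1), ('CORE', 1, 0)]
```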
Fannie Mae October 2018 to March 2025
Role: Lead Associate
Served as a Lead Business/Data Analyst for Fannie Mae’s Multifamily Data Program Team. Responsibilities included gathering requirements from product stakeholders and downstream consumers, and consolidating and migrating mortgage data from various source systems into the Data Warehouse. This work was conducted as part of the Target State Initiative Project and encompassed Pipeline, Acquisition, and Servicing data.
Worked as a Lead Analyst for the following projects.
Acquisition Modernization: CF Commitment: As part of the Acquisition Modernization Project Phase 1, Credit Facility Commitment Data was integrated into the Multifamily Target State system from DUSG. DUSG, developed in Salesforce, delivered the data via XML. This enhancement enables Multifamily Target State downstream consumers to access Credit Facility Commitment Data in structured tables, eliminating the reliance on paper-based formats.
Hedge: Integrating Loan Hedge data from a third-party vendor's legacy systems into the Multifamily Target State system through a single consolidated source. This integration encompasses both Initial Hedges and Replacement Hedges.
LIHTC: LIHTC (Low-Income Housing Tax Credit) Equity data is sourced from a third-party vendor and integrated into the Multifamily Target State system.
Key Responsibilities & Achievements:
Collaborated with product stakeholders to define the project objectives.
Gathered business requirements from product stakeholders.
Consolidated and documented attributes from upstream systems into the target state.
Collaborated with the Architecture team to design the data flow.
Ensured mandatory enterprise documentation for cloud-based Data Warehouse tables, including DSET submissions at all stages from source files to target tables.
Updated the Stage model with new stage tables to capture source data.
Worked with modelers to design and implement new tables and attributes, enabling seamless integration into the target state.
Conducted data gap analysis to integrate new source data with existing tables and other sources.
Prepared comprehensive data requirement and specification documents, including table load rules, to meet enterprise standards.
Created Source-to-Target Mapping documents aligned with enterprise data requirements and objectives.
Authored Business Requirement Documents (BRD) detailing source and target systems, filter conditions, and Data Quality (DQ) rules for compliance.
Reviewed data mapping documents and table specifications with the development team.
Participated in quarterly Program Increment Planning (PIP) sessions as part of Agile ceremonies to discuss and prioritize the backlog into sprint stories.
Engaged in backlog grooming to create sprint user stories for developers and testers, ensuring they included detailed descriptions and acceptance criteria.
Contributed to daily stand-ups to address project hurdles and took part in monthly retrospectives to reflect on and improve processes.
Built detailed data lineage, identifying upstream data mappings and validating end-to-end data flows.
Collaborated with the upstream system team to create test data scenarios, generating mock data to validate data flow against business requirements.
Validated XML against XSD to ensure schema versions and data flow alignment.
Walked through business scenarios with product owners using sample data to secure final approvals.
Participated in User Acceptance Testing (UAT) with business users to ensure the data completeness and accuracy.
Assisted business users in creating test cases for Business Shakeout in lower environments.
Supported Business Shakeout Testing in production to validate system integration and data accuracy.
Environment: Oracle 19.22, TOAD, MMR, ERStudio, Jira, Confluence, SharePoint, Excel, Word, PowerPoint, Visio, AWS Cloud Database.
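The gap analysis and Source-to-Target Mapping work described in this role comes down to set comparisons between upstream attributes and target-table columns. A minimal Python sketch, with hypothetical attribute names; a real STM document covers hundreds of fields with transformation rules, filter conditions, and DQ rules per mapping:

```python
# Hypothetical source attributes and target columns for one interface.
source_attributes = {"loan_id", "commitment_amt", "hedge_type", "close_date"}
target_columns = {"loan_id", "commitment_amt", "close_date", "servicer_id"}

# Source attributes with no target home: candidates for new stage/target columns.
unmapped_source = sorted(source_attributes - target_columns)
# Target columns with no upstream source: need a default value or another feed.
unpopulated_target = sorted(target_columns - source_attributes)

print(unmapped_source)     # ['hedge_type']
print(unpopulated_target)  # ['servicer_id']
```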
V3 Tech Solutions Inc., June 2017 to October 2018
Client: Fannie Mae
Worked as a Senior Business/Data Analyst for Fannie Mae’s Multifamily Data Management Team. Responsibilities included validating and drafting requirements for consolidating mortgage data from various source systems into the Data Warehouse as part of the Target State Initiative Project.
Responsibilities:
Building the Business Glossary while working as a domain steward.
Building the data lineage by identifying the upstream data mapping and validating the data flows.
Creating data scenarios to validate that data flows through each scenario and meets the requirements.
Involved in defining the source to target data mappings, business rules and data definitions.
Environment: Oracle 11i R2, TOAD, MMR
Aikya Inc December 2016 to June 2017
Client: Freddie Mac
Worked as a Sr. Metadata Analyst for Freddie Mac's Single Family Data Governance Team, consolidating the required metadata from different applications into Informatica Analyst and the Informatica Metadata Manager template.
Responsibilities:
Consolidating the metadata from different applications (from BRSs) and fixing the definitions and Business term names to upload into Informatica Analyst.
Consolidating the metadata manager template for different applications to upload into Informatica metadata manager.
Working with the business team to get approval for the consolidated business terms with definitions.
Uploading the approved business terms and Metadata Manager information into Informatica and linking them to physical tables and columns.
V3 Tech Solutions Inc., March 2014 to December 2016
Client: Fannie Mae
Worked as a Sr. Business/Data Analyst for Fannie Mae’s Multifamily Data Management Team. Projects included: DUS Disclosure - consolidating the required data from different source systems per CREFC standards to publish reports; Loan Accounting Initiative (LAI) - consolidating loan accounting data from different source systems; Target State Initiative (Master Servicing) Project - consolidating Master Servicing legacy system data into the Data Warehouse.
Responsibilities:
Participated in the JAD sessions and project status meetings with the stakeholders.
Led the collection, analysis, documentation, and coordination of stakeholders' business requirements.
Created the Business Glossary while working as a domain steward for the Loan Subject Area.
Built Data Lineage by identifying the upstream mappings and validating the data flows.
Participated in the review of Business Process Flow diagram meetings to revise the process flows.
Created data scenarios to validate that data flows through each scenario and meets the requirements.
Worked on the Gap Analysis of the DUS Disclosure Project.
Analyzed various upstream and downstream systems that provide the data to the existing Disclosure application and delivered impact analysis for the same.
Worked with internal architects, assisting in the development of current and target state enterprise data architectures for Metadata and Master Data.
Involved in loading Metadata into the Informatica MMR tool while enforcing the standards and guidelines.
Created FRDs for conversion purposes while ensuring the legacy data aligned with the Target State.
Reviewed the data mappings and critical fields with the DUS Disclosure Business Team and obtained approval of the mappings and data.
Participated in Functional Requirements Document (FRD) review sessions with the third-party vendor as well as the internal Business Team.
Documented the end user report templates and built data lineage from source system to CDS to downstream reports.
Worked with project team representatives to ensure that logical and physical data models were developed in line with corporate standards and guidelines.
Involved in defining the source to target data mappings, business rules and data definitions.
Responsible for defining the key identifiers for each mapping/interface.
Responsible for defining the functional requirement documents for each source-to-target interface.
Environment: Oracle 11i R2, TOAD, DOORS, MS Visio, MS-Office, MS-Project, Salesforce
Client: Freddie Mac June 2013 to February 2014
Sr. Data/Reporting Analyst
Project: MHAC - Information Wall Reporting
Worked on the quarterly Information Wall Reports on system security access.
Analyzed the data from various upstream and downstream security systems for Information wall reporting.
Prepared the SSIS package to load the data from different sources into SQL Server database using Microsoft Visual studio.
Executed the SSIS package to load the data into SQL server every quarter.
Performed data validation on the stage and permanent tables' data in the SQL Server database.
Validated critical security-access data for Information Wall reporting by writing SQL queries.
Prepared the Information Wall detailed and summary reports for access control.
Uploaded the Information Wall Reporting information in Clear Case.
Worked monthly on analysis and validation of various other reports.
Environment: SQL Server 2008, Clear Case, MS Visual Studio, MS-Office
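The stage-to-permanent validation described above amounts to reconciling loaded rows against staged rows after each quarterly SSIS run. A minimal sketch using Python's sqlite3 in place of SQL Server; table and column names are hypothetical:

```python
import sqlite3

# Hypothetical stage and permanent access-control tables.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE stage_access (user_id TEXT, system TEXT)")
cur.execute("CREATE TABLE perm_access (user_id TEXT, system TEXT)")
cur.executemany("INSERT INTO stage_access VALUES (?, ?)",
                [("u1", "trading"), ("u2", "research"), ("u3", "trading")])
cur.executemany("INSERT INTO perm_access VALUES (?, ?)",
                [("u1", "trading"), ("u2", "research")])

# Rows present in stage but missing from the permanent table after the load.
cur.execute("""
    SELECT user_id, system FROM stage_access
    EXCEPT
    SELECT user_id, system FROM perm_access
""")
print(cur.fetchall())  # [('u3', 'trading')]
```

An empty result here means the load carried every staged row into the permanent table.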
Client: Fannie Mae June 2011 to June 2013
Sr. Business/Data Analyst - III
Project: ODS Data Warehouse Project
Worked on the Gap Analysis between two large data warehouses, the ODS and RDW databases.
Performed data validation of the critical existing ODS fields using SQL queries and MS Access.
Analyzed various upstream and downstream systems that provide data to the application and delivered impact analysis for the same.
Worked on the Data mappings for missing fields from source system to ODS to meet the business requirement.
Reviewed the data mappings and critical fields with the business user team and got the approval.
Created PRs using Clear Quest as part of the requirements.
Created the Stakeholder requirement for the missing fields and reviewed the data mappings and stakeholder requirements with Technology team.
Worked closely with developers in translating the business requirements to a high-level design document.
Worked with the Quality Assurance team in reviewing test plans and provided feedback.
Wrote the test cases in Quality Center for the missing critical fields.
Participated in UAT Testing along with business users to ensure completeness and correctness.
Assisted business users in creating business test cases for UAT in Quality Center.
Participated in Shakeout Testing in production as final validation.
Facilitated weekly events with business and technical teams, including project progress and requirement analysis meetings.
Delivered and managed the requirements in RequisitePro.
Analyzed, coordinated, and supported Data Corrections work in various systems as part of production support.
Performed Data Quality checks, such as completeness and correctness of attributes.
Environment: Oracle 11i R2, TOAD, Quality Center, Clear Quest, RequisitePro, MS Visio, MS-Office
Advanced Software Systems Inc., August 2009 to March 2011
Client: Department of Technology Services, Montgomery County, MD
Sr. Business Analyst
Project: Case Management System (CMS) for States Attorney’s Office & CRIMS Project
Analyzed business and system requirements and developed test strategy for testing the COTS application.
Coordinated Joint Application Development (JAD) sessions with users to analyze project application lifecycle; gathered and defined the key performance indicators; documented and presented the requirements specifications for business development.
Worked on Data Mapping document for the system
Worked on data analysis and the interface between the old and new systems.
Executed UNIX scripts to retrieve data files from the Mainframe system and load the data into the database.
Performed high-level testing of the interface data.
Implemented Software Development Life Cycle (SDLC) to define the project life cycle of PDI project.
Identified constraints in the existing process and interacted with user community to develop and enhance the solutions to overcome constraints.
Defined PDI (Pre-Delivery Inspection) project milestones, schedules, and monitored progress using Microsoft Project and updated plans as required.
Used Microsoft Visio for flow-charting, process modeling, and architectural design of the application.
Documented User Requirements using Microsoft Word and Quality Center.
Developed detailed scenarios and Sequence Diagrams describing how actions are carried out.
Facilitated weekly events with business and technical teams, including project progress, requirement gathering, and requirement analysis meetings.
Worked closely with developers in translating the business requirements to a high-level design document.
Worked with the Quality Assurance team in reviewing test plans and provided feedback and performed profitability analysis.
Created and managed departmental surveys, which helped in the overall development of the department.
Assisted in developing a Microsoft Access database for Test Center technicians to record day-to-day activity, helping engineers analyze the number of failed parts, technicians' repair times, and each technician's efficiency.
Prepared business use cases according to the business processes for UAT.
Worked on the Requirements Traceability Matrix to identify gaps between the requirements and system functionality.
Captured critical/core business transactions to verify and validate key components of applications in the automation tool for reuse.
Created and executed SQL queries to perform data integrity and backend testing, retrieving and manipulating data.
Executed the UNIX Scripts to initiate the Backend Batch Jobs and verify their status.
Second project, CRIMS, using Oracle Enterprise Version: performed business analysis and gathered system requirements for this COTS product, developed in Java and Oracle Forms.
Environment: COTS, .Net, XML, SQL Server 2005, Oracle 10g, Track, QC, Unix, MS Visio, MS-Office, MS-Project, Oracle Forms, Java
Medassurant Inc. May 2009 – August 2009
Sr. Business/QA Analyst II
Project: CCS Advantage
Analyzed business and system requirements and documented test requirements using Quality Center and DOORS.
Interacted with users to understand their business views while gathering report requirements, and provided several report mock-ups to finalize the requirements.
Developed the Universe based on the requirements.
Created Web Intelligence Ad-hoc and canned reports.
Created complex reports using multiple data providers and created parameterized reports.
Wrote SQL queries used in the Universe, creating them as derived tables.
Wrote Complex SQL Queries for Data validation purpose.
Tuned and enhanced Universes with SQL queries to improve report performance.
Worked on data analysis of client data loaded into our system.
Captured critical/core business transactions to verify and validate key components of applications in the automation tool for reuse.
Environment: TOAD, Oracle, MS-Visio, Quality Center, RequisitePro