
Systems Analyst Business

Location:
Texas City, TX
Salary:
$60 to $65/hr on C2C
Posted:
May 06, 2025


Resume:

Swathi A

Sr. Business Systems Analyst

Email: ************@*****.***

Phone: 201-***-****

PROFESSIONAL SKILLS

●9+ years of experience in the Finance and Investment Banking domains, including Portfolio Management, Risk Assessment using AML and KYC compliance, PCI-DSS payment standards, and other services such as metadata management of human services for different state agencies. Worked in both traditional and Agile software development life cycle (SDLC) environments, including Agile-Scrum, Scrum-Kanban, Waterfall, and Waterfall-Scrum, collaborating with self-organizing, cross-functional teams.

●Developed a strong understanding of the financial services industry with a focus on regulatory reporting and compliance frameworks such as Know Your Customer (KYC), Anti-Money Laundering (AML), the General Data Protection Regulation (GDPR), and the Payment Card Industry Data Security Standard (PCI-DSS), as well as financial crime prevention technologies such as Machine Learning (ML) to identify suspicious patterns and behavior.

●Understanding of financial instruments including equities, derivatives, bonds, securities, and exchange-traded funds, as well as trading life cycles, risk scores, suspicious activity monitoring, fraud detection algorithms, and anomaly detection on risk scores.

●Strong analytical, problem-solving, and strategic thinking skills, along with active listening, stakeholder management, leadership, technical writing, requirement elicitation and analysis, and outstanding presentation skills.

●Proven ability to contribute to project deliverables such as the Project Plan, Project Charter, Project Schedule, Risk Register, High-Level Project Scope, Work Breakdown Structure (WBS), Statement of Work (SOW), Proof of Concept, Stakeholder Register, Lessons Learned document, and Closure Document.

●Skilled with elicitation procedures like Prototyping, Questionnaires, Surveys, Focus Groups, Brainstorming, Requirement re-engineering, Reverse engineering, facilitating Workshops and JAD sessions with stakeholders.

●Familiar with performing As-Is, To-Be, GAP Analysis, as well as Process Flows, Root Cause Analysis, Market Research, SWOT Analysis, PESTLE Analysis, Risk Analysis, MOST analysis, Cost-Benefit Analysis, interface analysis and Feasibility Reports by identifying existing technologies and documenting the enhancements.

●Responsible for handling Defects and Change requests, implementing changes effectively, creating clear project schedules, performing requirement analysis, risk analysis, scope analysis, impact analysis, KPI and Performance tracking and being able to manage triple constraints namely Scope, Time, and Budget.

●Proficient in documenting elicited requirements within the framework of the Business Requirement Document (BRD), Functional Specification Document (FSD), Project Planning Document, Requirement Traceability Matrix (RTM), Use Cases, User Acceptance Criteria, System Requirement Specifications (SRS), Detailed Design Document (DDD), and User Interface (UI) specifications.

●Well acquainted with workflows and Unified Modeling Language (UML) diagrams including Structural Diagrams, Object Diagram, Component Diagram, Composite Structure Diagram, Timing Diagram, Use Cases, Activity Charts, Class Diagram, Sequence Diagrams, Data Flow Diagrams and ER Diagrams using MS Visio.

●Detailed working knowledge of the Agile Scrum framework and its ceremonies, such as Backlog Grooming, Release Planning, Sprint Planning, Daily Stand-ups, Sprint Review, and Sprint Retrospective meetings. Participated in Agile Release Train (ART) events including PI Planning, Scrum of Scrums, System Demo, and Inspect & Adapt meetings.

●Robust experience in estimation techniques such as Planning Poker, t-shirt sizing, relative mass valuation, the bucket system, dot voting, story mapping, risk-based estimation, Use Case Points (UCP), and affinity mapping.

●Experience in vertical slicing, breaking epics down into user stories, and creating Scrum artifacts such as the Product Backlog, Sprint Backlog, and Potentially Shippable Product Increment (PSPI).

●Expertise in writing User Stories and their Acceptance and Evaluation Criteria. Assisted Product Owner in Backlog Management, Prioritization, Workshops, Definition of Ready (DOR) and Definition of Done (DOD), INVEST (Independent, Negotiable, Valuable, Estimable, Small, Testable) criteria for user stories.

●Accomplished in writing SQL of different types, including Data Definition Language (DDL), Data Manipulation Language (DML), Data Control Language (DCL), and Transaction Control Language (TCL) statements, as well as functions, stored procedures, indexes, and views, to perform data analysis and a wide range of data transformations.
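
As an illustration of these statement types, here is a minimal, self-contained sketch using Python's built-in sqlite3 module; the table, index, and view names are invented for the example and are not from any system described in this resume.

```python
import sqlite3

# DDL, DML, TCL, an index, and a view in one small sqlite sketch.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# DDL: define the schema and an index
cur.execute("CREATE TABLE trades (id INTEGER PRIMARY KEY, symbol TEXT, qty INTEGER, price REAL)")
cur.execute("CREATE INDEX idx_trades_symbol ON trades(symbol)")

# DML: insert rows
cur.executemany("INSERT INTO trades (symbol, qty, price) VALUES (?, ?, ?)",
                [("AAPL", 100, 189.5), ("MSFT", 50, 410.2), ("AAPL", 25, 190.1)])

# TCL: commit the transaction
conn.commit()

# A view that aggregates positions for analysis
cur.execute("CREATE VIEW position_value AS "
            "SELECT symbol, SUM(qty * price) AS notional FROM trades GROUP BY symbol")

for symbol, notional in cur.execute("SELECT symbol, notional FROM position_value ORDER BY symbol"):
    print(symbol, round(notional, 2))
```

A production warehouse would add DCL (GRANT/REVOKE) and stored procedures, which sqlite does not support; the pattern is the same in SQL Server or Redshift.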

●Knowledge of Relational Database Management Systems (RDBMS), Data Modeling, SQL, Normalization (1NF, 2NF, 3NF, Boyce-Codd), Database and Metadata Management, Data Integration, Data Warehouses, Data Marts, Data Lakes, Operational Data Stores (ODS), Indexing, and Query Optimization.

●Experience in defining Conceptual and Logical Data Models and creating Entity-Relationship Diagrams (ER diagrams). Also, implemented Dimensional Data Models by creating Schema models, such as Star Schema.

●Proficient in creating and documenting Data Profiling and Cleansing outputs, Data Dictionaries, Data Mapping Specifications, and Source-to-Target Mappings (STM) to set up the ETL (Extract, Transform, Load) pipeline.
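
To show what applying an STM looks like in the transform step, here is a small hedged sketch; the field names, mapping rules, and sample record are hypothetical, not taken from any project above.

```python
# Apply a source-to-target mapping (STM) to one record during ETL transform.
# Each target field maps to a (source field, transform function) pair.
def apply_stm(source_row, mapping):
    """Build a target row from a source row using the STM specification."""
    target = {}
    for tgt_field, (src_field, transform) in mapping.items():
        target[tgt_field] = transform(source_row[src_field])
    return target

# Hypothetical STM: trim IDs, title-case names, parse balances to numbers
stm = {
    "customer_id": ("CUST_NO", str.strip),
    "full_name":   ("NAME",    str.title),
    "balance_usd": ("BAL",     lambda v: round(float(v), 2)),
}

raw = {"CUST_NO": " C1001 ", "NAME": "jANE doe", "BAL": "1234.5"}
print(apply_stm(raw, stm))
```

In a real pipeline the mapping document would also capture data types, nullability, and default rules, and the transforms would run in the ETL tool rather than inline Python.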

●Collaborated in utilizing Amazon Web Services (AWS) and Google Cloud Platform (GCP) for ETL (Extract, Transform, Load) operations loading historical data into the data warehouse. Ensured meticulous validation of data accuracy and consistency throughout the ETL process while contributing to seamless data integration to maintain high performance, scalability, and reliability across data pipelines and support analytics.

●Well versed in Online Transaction Processing (OLTP) and Online Analytical Processing (OLAP), including ROLAP, MOLAP, and HOLAP. Performed OLAP operations like roll-up, drill-down, slicing, and dicing to extract meaningful insights and create pivot tables. Designed database schemas such as the Star Schema and Snowflake Schema using fact and dimension tables to support complex analytical queries and enhance data retrieval efficiency.
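
A roll-up on a star schema is just aggregation up a dimension hierarchy. The sketch below builds a tiny fact/dimension pair in sqlite and shows region-level detail versus a country-level roll-up; table names and figures are invented for illustration.

```python
import sqlite3

# A minimal star schema: one fact table joined to one dimension table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_region (region_id INTEGER PRIMARY KEY, region TEXT, country TEXT);
CREATE TABLE fact_sales (sale_id INTEGER PRIMARY KEY, region_id INTEGER, amount REAL);
INSERT INTO dim_region VALUES (1,'East','US'),(2,'West','US'),(3,'London','UK');
INSERT INTO fact_sales VALUES (1,1,100),(2,2,150),(3,3,200),(4,1,50);
""")

# Drill-down level: totals per region within each country
per_region = conn.execute("""
    SELECT d.country, d.region, SUM(f.amount)
    FROM fact_sales f JOIN dim_region d USING (region_id)
    GROUP BY d.country, d.region ORDER BY d.country, d.region
""").fetchall()

# Roll-up: collapse the region level, keeping only country totals
per_country = conn.execute("""
    SELECT d.country, SUM(f.amount)
    FROM fact_sales f JOIN dim_region d USING (region_id)
    GROUP BY d.country ORDER BY d.country
""").fetchall()

print(per_region)   # region-level detail
print(per_country)  # rolled-up view
```

Slicing and dicing are the same query pattern with WHERE filters on dimension attributes instead of coarser GROUP BY levels.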

●Crafted detailed API tests using POSTMAN to check API interactions via CRUD operations, outlining precise endpoints with data in JSON and XML formats following REST standards, and documented them using the SWAGGER tool.
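
The kind of checks a Postman "Tests" tab performs (status code, required JSON fields) can be sketched in Python. The endpoint, fields, and payload below are hypothetical, and a stubbed response stands in for a live HTTP call.

```python
import json

# Stubbed response standing in for the result of a POST (Create) request;
# a live test would obtain this from an HTTP client instead.
stub_response = {
    "status_code": 201,
    "body": json.dumps({"id": 42, "symbol": "AAPL", "qty": 100}),
}

def check_create_response(resp):
    """CRUD 'Create' checks: 201 status and required fields in the JSON body."""
    assert resp["status_code"] == 201, "expected 201 Created"
    payload = json.loads(resp["body"])
    for field in ("id", "symbol", "qty"):
        assert field in payload, f"missing field: {field}"
    return payload

payload = check_create_response(stub_response)
print(payload["id"])
```

Read, Update, and Delete checks follow the same shape with 200/204 status codes and their own field expectations.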

●In the Software Testing Life Cycle, assisted the Testers (QA) with User Acceptance Testing (UAT), Regression, Black-box, Performance, load, smoke and sanity testing and creation and maintenance of artifacts such as Test Plans, Test Scenarios, Test Cases, and Defect Logs with the help of Requirements Traceability Matrix (RTM).

●Expertise in Business Intelligence and in creating Static Reports, Ad-hoc and Interactive Reports, Dashboards using visualization tools such as Tableau, Power BI and Google Analytics to achieve organizational goals.

●Utilized JIRA for the maintenance and tracking of issues, bugs, user stories, workflow management, and Work In Progress (WIP) limits on tasks, and tracked the team's progress using Product Burndown and Release Burnup charts.

Certifications:

Scrum Master Accredited Certification from Scrum Institute.org

SKILLS AND TOOLS

SDLC Methodologies: Waterfall, Agile (Scrum), Scrum-Waterfall(Hybrid), Agile(SAFe), Scrumban, Kanban

Business Modeling/Project Tools: MS Visio, MS Office 365 Suite (Word, PowerPoint, Excel; Certified Expert), SharePoint, Confluence, JIRA, Selenium, Tableau, Balsamiq, Lucidchart.

Database: RDBMS Oracle, MS Access, MySQL, SQL SERVER, NoSQL

Programming Languages: SQL, UML, XML, HTML, JSON, Python

Defect Tracking Tools: HP Quality Center, JIRA, HP ALM, Confluence, Rally Board, Service Now

API Tools: POSTMAN, SWAGGER, SOAP, RESTful

Data Analysis/Defect Management Tools: Tableau, Power BI, MS Excel, Google Analytics

Cloud Services: AWS (Amazon S3, Amazon Redshift, AWS Lambda, AWS API Gateway), Google Cloud Platform (GCP); Big Data: Apache Kafka, Apache Spark, Apache Hadoop

Operating Systems: Windows XP, Vista, 7, 8, 10, Mac OS X

PROFESSIONAL EXPERIENCE

Capital One, Plano, TX Oct 2023 - Present

Sr. Business Systems Analyst

Designed and delivered an automated system that monitors checking account balances for customers in the USA and UK to generate tailored investment recommendations when thresholds are met. The system facilitates customer interactions by handling consent and providing reminders for declined recommendations. Once a recommendation is accepted, the system seamlessly redirects customers to the trading platform for investment management and portfolio tracking, ensuring compliance with financial regulations in both regions.

Roles & Responsibilities:

●Gathered requirements for Portfolio Management using surveys/questionnaires, interview sessions, and focus groups, and analyzed their feasibility. Worked with engineers, programmers, and end users to validate the conditional logic for sending recommendations when a threshold is met.

●Performed a Porter's Five Forces evaluation, cost analysis, impact analysis, and feasibility assessment. Interacted one-on-one with Subject Matter Experts (SMEs), asking thorough questions and meticulously documenting the requirements in a form that could be understood and approved by both technical and business professionals.

●Worked closely with the Business stakeholders and PO for work prioritization, defining the short term and long-term roadmap. Identified the MVP/MRF features from market research and stakeholder interviews.

●Conducted an assessment of the AS-IS (current) and TO-BE (future) business processes to surface key findings through GAP analysis, areas of improvement, and opportunities for change with short-term and long-term considerations, and created documents such as the BRD, FRD, SRS, and Project Charter.

●Maintained and kept track of Requirement Traceability Matrix (RTM) to ensure all the Functional requirements are addressed in the use case and created Business Process Model and Notation (BPMN) Diagram.

●Participated in relative value analysis and assessed stakeholder goals to develop the feature release roadmap timelines, milestones, vision, risks, Critical Path Analysis (CPA), Work Breakdown Structure (WBS), Statement of work (SOW), Financial Analysis Models (NPV) and Return on Investment (ROI).

●Facilitated and time-boxed the Agile SCRUM ceremonies Sprint Planning Meeting, Daily Stand-ups, Sprint Review Meeting, Sprint Retrospective Meeting and Grooming Sessions (high-level to low-level requirements).

●Responsible for ensuring user stories meet the INVEST criteria and tasks meet the SMART (Specific, Measurable, Achievable, Relevant, Time-bound) criteria by working closely with the Product Owner (PO).

●Assisted the PO in prioritizing Product Backlog Items (PBIs) using the MoSCoW approach, defined user acceptance criteria, and helped the Scrum team estimate story points through Planning Poker to prepare PBIs for sprint tasks, taking suggestions from cross-functional development teams.

●Created wireframes, Graphical user interface (GUI), Mock-ups using Lucid Charts and created prototypes using Balsamiq to make the team understand the details of the project and its end-to-end system.

●Generated ad hoc SQL queries in AWS Redshift using joins, subqueries, database connections, and transformation rules to fetch data from SQL Server database systems, optimizing query performance, scalability, and system efficiency and enabling seamless data extraction, transformation, and analysis for downstream applications and reporting.

●Developed complex SQL queries to extract and analyze portfolio data from multiple sources, providing insights into portfolio performance and risk exposure. Implemented SQL triggers to automatically update portfolio risk metrics and notify stakeholders of significant changes, ensuring real-time monitoring and responsiveness.
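
A trigger that recomputes a risk metric on every new position can be sketched as follows; the schema and portfolio names are illustrative, and sqlite stands in for the production database.

```python
import sqlite3

# Trigger sketch: every INSERT into positions refreshes that portfolio's
# total exposure in risk_metrics (delete-then-insert keeps it portable).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE positions (id INTEGER PRIMARY KEY, portfolio TEXT, exposure REAL);
CREATE TABLE risk_metrics (portfolio TEXT PRIMARY KEY, total_exposure REAL);

CREATE TRIGGER trg_update_risk AFTER INSERT ON positions
BEGIN
    DELETE FROM risk_metrics WHERE portfolio = NEW.portfolio;
    INSERT INTO risk_metrics (portfolio, total_exposure)
    VALUES (NEW.portfolio,
            (SELECT SUM(exposure) FROM positions WHERE portfolio = NEW.portfolio));
END;
""")

conn.execute("INSERT INTO positions (portfolio, exposure) VALUES ('growth', 1000.0)")
conn.execute("INSERT INTO positions (portfolio, exposure) VALUES ('growth', 500.0)")

print(conn.execute(
    "SELECT total_exposure FROM risk_metrics WHERE portfolio='growth'").fetchone())
```

Stakeholder notification on significant changes would sit outside the trigger, e.g. a job polling risk_metrics against thresholds.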

●Extensively used Amazon Web Services components such as Amazon Simple Storage Service (S3) and Amazon Redshift to store customer data, and AWS Lambda functions for serverless computing and integration on each addition of new data to S3, with minimal infrastructure overhead.
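
The S3-to-Lambda pattern looks like the handler below. The event shape follows AWS's documented S3 notification structure, but the bucket and key are sample values, and the actual load into the warehouse (e.g. via boto3) is deliberately omitted.

```python
import json
import urllib.parse

def lambda_handler(event, context):
    """Sketch of an S3-triggered Lambda: pull bucket/key from the event record.
    A real handler would then read the object and load it downstream."""
    record = event["Records"][0]
    bucket = record["s3"]["bucket"]["name"]
    # S3 URL-encodes keys in notifications, so decode before use
    key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
    return {"statusCode": 200, "body": json.dumps({"bucket": bucket, "key": key})}

# Sample S3 put-notification event, trimmed to the fields used above
sample_event = {"Records": [{"s3": {"bucket": {"name": "customer-data"},
                                    "object": {"key": "2025/05/txns.csv"}}}]}
print(lambda_handler(sample_event, None))
```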

●Coordinated with the development team to maintain Know Your Customer (KYC) compliance by enhancing user authentication by OAuth 2.0 protocols and risk management within regulatory standards.

●Used AWS API Gateway to create secure, scalable APIs that facilitate smooth data exchange between our cloud-based solutions. Configured API endpoints for smooth, reliable interactions between modern applications and legacy infrastructure, ensuring secure data transfer and enhancing overall system scalability and reliability.

●Assisted development team in API testing using POSTMAN to check API interactions using CRUD Operations with data in JSON, XML format using REST standards and documented using SWAGGER tool.

●Assisted in creating sprint reports (Burndown/Burnup Charts, information radiators) using JIRA. Experience in creating dashboards on Power BI and tableau for decision-making and strategic analysis by the businesses.

Environment: Agile Scrum, JIRA v8.13, Confluence, MS Project, MS Excel, Lucid charts, Balsamiq, AWS (Amazon S3, Amazon Redshift, AWS Lambda, AWS API Gateway), SQL Server, RESTful APIs, SWAGGER, POSTMAN.

Department of Human Services, MDThink, Linthicum, MD Jan 2023 - Sept 2023

Business Analyst

The scope leverages AWS cloud infrastructure to gather, validate, and manage demographic data from state agencies, enhancing the efficiency of Maryland’s human services. It centralizes user applications, eligibility determination, case management, and benefit enrollment. Users can access and manage their program information through a single interface, improving service coordination and efficiency. The platform also supports data sharing among state agencies, facilitating better case tracking and timely support for residents.

Roles & Responsibilities:

●Conducted multiple closed questionnaires with field experts and associated stakeholders to resolve open questions, held structured and unstructured interviews when required, and conducted regular JAD sessions.

●Analyzed prior documents like Business requirement document (BRD), Functional requirement document (FRD), System requirement specification (SRS) and their organizational structure to deeply understand the project.

●Deep understanding of the requirement gathering process and documentation maintenance. Communicated project status, milestones, and deliverables clearly and consistently to the internal IT team and all stakeholders to keep everyone informed of the project's progress.

●Collaborated with the internal senior management (Technical SMEs, Architect, Business SMEs and Business heads) to clarify and draft important business artifacts like Business Requirement Document, Proof Of Concept.

●Liaised and communicated with various project stakeholders, including project managers, architects, developers, and quality analysts to define the future roadmap for the release of product features.

●Collaborated with the Scrum team in designing the prototype and sent it to regional stakeholders for approval. Created low-fidelity and high-fidelity prototypes such as mockups and wireframes to analyze the user interface.

●Used MS Visio for Process modeling, Process mapping and Business Process flow diagrams for visual representation for data management and further transparency and efficiency across state agency systems.

●Helped the PO in prioritizing the Product Backlog, grooming User Stories in planning activities and maintained Scrum Boards to track execution status of different tasks, defects, Backlog items committed to the sprint.

●Participated in various scrum ceremonies including Sprint Planning Meeting, Daily Stand-ups, Sprint Review Meetings, Sprint Retrospective Meeting, and Backlog Grooming to ultimately achieve our release goal.

●Involved in prioritizing and estimating Product Backlog Items with Product Owner using KANO & MoSCoW technique and assisted Scrum team to pull User Stories in Sprint Backlog using Planning Poker.

●Led data integration using AWS Glue for ETL processing; the larger datasets were stored in Amazon S3 and queried with AWS Athena, while Informatica was used for effective data profiling, cleansing, and deduplication to support human services.

●Assisted Database Engineers in increasing Data Integrity by performing Data Normalisation, Data Verification and Validity to maintain the accuracy, consistency, and effectiveness of the data for business decision making.

●Constructed SQL queries and conducted data mining to find correlations and identify patterns in existing data. Assisted the back-end team in creating an Application Programming Interface (API) to fetch data from the Redshift Enterprise Data Warehouse, which supported SQL queries for real-time data monitoring.
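
Finding correlations in queried data comes down to a calculation like Pearson's r. The sketch below implements it with the standard library over two invented columns; in practice the values would come from the warehouse query described above.

```python
from math import sqrt

# Pearson correlation between two hypothetical columns from a warehouse query.
def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

txn_amount = [100, 200, 300, 400, 500]
risk_score = [10, 20, 30, 40, 50]   # perfectly linear with amount
print(round(pearson(txn_amount, risk_score), 3))
```

A perfectly linear relationship yields r = 1.0; real transaction data would land somewhere in between and guide which patterns merit deeper mining.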

●Utilized POSTMAN to perform API testing to validate the request and response for JSON based RESTful methods like GET/POST to capture the response error codes and data received in the body.

●Extensively worked in the Software Test Life Cycle phases from requirement analysis to test cycle closure ensuring that products are built with quality meeting user requirements. Wrote test cases, test scripts and test scenarios in a manner meeting user requirements for both the manual and automated testing and rigorous black-box testing.

●Designed and facilitated test scenarios, test cases and test scripts for functional and non-functional testing and logged the defects obtained using the JIRA tool. Tracked the defect lifecycle to minimize average bug age.

Environment: Agile- Scrum, JIRA v8.13, MS visio, SQL, POSTMAN, SWAGGER, AMAZON WEB SERVICES - AWS Glue, AWS Athena, AWS Redshift.

Webster Bank, Chennai, India June 2019- July 2022

Business Analyst

The PCI DSS-compliant in-house payment system is integrated with KYC, AML, and GDPR compliance. The process includes cleansing and validating customer data, tokenizing it, and securely sending it to third-party vendors through API endpoints. The system produces periodic reports that drive financial decisions such as investment recommendations, theft detection through fraud activity monitoring, and risk assessments to support the Bank's strategic and operational objectives.

Roles & Responsibilities:

●Gained understanding of compliance frameworks including the Payment Card Industry Data Security Standard, Know Your Customer, Anti-Money Laundering, and the General Data Protection Regulation. Gained deeper knowledge and experience in the financial services regulatory landscape, including strict data protection, anti-fraud measures, and customer identification with due diligence toward end-users and regulatory authorities.

●Performed consecutive one-on-one interviews and JAD sessions to identify Data Marts. Interacted closely with cross-functional teams, including business stakeholders and data architects, to capture requirements for defining specific Data Marts for various business units.

●Participated in conducting periodic risk-based AML reviews of individual and business accounts. Executed risk ratings on high-risk jurisdictions, business sectors, and Politically Exposed Persons (PEP) accounts. These reviews supported compliance and risk management objectives, enhancing operational security.

●Attended and led meetings, technical discussions, reviews, and release planning. Communicated actively with the team and stakeholders regarding project status, technical roadblocks, and informed decision-making. Supported collaborative discussions that promoted alignment between development priorities and business objectives.

●Assisted PO in writing user stories, defining use cases, and maintaining the product backlog. Ensured that clear acceptance criteria were added, approval workflows were managed, and story progression was tracked within JIRA. Working on backlog prioritization, defect tracking, impediment resolution, and monitoring burndown charts to make sure that every sprint cycle had timely and quality deliverables.

●Facilitated Agile Sprint Retrospective techniques such as Silent Writing, Mad/Sad/Glad, and the Happiness Histogram to gather feedback from team members and identify areas of improvement. This fostered a culture of continuous improvement and boosted team morale, since concerns were heard and successes celebrated.

●Wrote complex SQL queries using AWS tools and executed them for various types of analyses. Conducted data analysis, validation, and manipulation using advanced SQL operations such as join, union, group by, aggregate functions. Prepared reports with actionable insights that would portray data-driven decision-making processes.

●Prepared data mapping documentation identifying source data, target data, and transformation logic in collaboration with architects. Coordinated with development teams, business users, SMEs, and project managers; the documentation enabled data integration, consistency, and alignment with project requirements.

●Assisted in the development of an OLAP cube supporting data integrity, normalization, and quality validation. Enabled advanced OLAP operations such as drill-downs, roll-ups, slicing, dicing, and pivoting for detailed, efficient analysis that transforms data into meaningful information for business analytics.

●Performed API testing using POSTMAN by testing the interactions an API would have via CRUD requests. Ran tests for data in JSON and XML formats while ensuring that REST standards were met. API behaviors and their interactions were documented using Swagger to achieve high reliability in the performance of APIs consistently.

●Utilized JIRA extensively in the creation and management of sprint and product backlogs, tracking issues, tracing requirements, and team collaboration. Effectively used JIRA capability in managing project workflow and always had visibility into sprint progress to track issues and their resolutions in a timely manner.

●Proficient in using JIRA, which ensured transparency and accountability at every stage of the project, and able to track burn-up/burndown charts and the sprint's WIP limit.

Environment: Scrum-Kanban, JIRA v8.13, Confluence, Lucidchart, MS Access, Python (Pandas and NumPy), SWAGGER, POSTMAN, REST standards, Amazon Web Services (S3 and Redshift), SQL, Tableau, Google Analytics.

Razorpay, Bangalore, India Sept 2017 - May 2019

Business Systems Analyst

The functional scope encompasses a solution that shifted from a rules-based Anti Money Laundering (AML) to a behavioral analytics framework using Machine Learning (ML) for improved risk assessment. The system analyzed historical transactional data such as geography, transaction type and spending patterns of the customer to generate risk scores and flag suspicious activities. Derived classifications such as False positives were fed back into the system for continuous learning, while true positives triggered high-priority alerts to meet FinCEN compliance standards.

Roles & Responsibilities:

●Optimized risk management by utilizing ML algorithms, reducing manual processing time by 30% and increasing high-risk alert identification by 15%. The approach further improved the system's efficiency in flagging probably fraudulent transactions and reduced manual workloads by automating alert prioritization.
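
As a toy stand-in for the behavioral risk scoring described above, a z-score check flags transactions far from a customer's usual spending; the amounts are invented and the production models used many more features than amount alone.

```python
from statistics import mean, stdev

# Flag amounts whose z-score exceeds a threshold (illustrative only).
def flag_suspicious(amounts, threshold=2.0):
    mu, sigma = mean(amounts), stdev(amounts)
    return [abs(a - mu) / sigma > threshold for a in amounts]

history = [120, 95, 130, 110, 105, 5000]   # one outlier spend
print(flag_suspicious(history))
```

In the real system such flags would feed risk scores and alert queues, with confirmed false positives cycled back into model training.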

●Acted as the bridge between the Product Manager and Scrum delivery team for alignment on project goals and management of stakeholder expectations. I ensured translation and interpretation of the needs of the stakeholders to ensure delivery by the team to meet the business objectives.

●Performed AS-IS and TO-BE analysis to carve out areas of improvement in the current system, then performed GAP analysis to call out redundant fields, additional target fields, and transformation logic.

●Key activities included creating and maintaining business process documentation, including the Business Requirements Document (BRD), Functional Requirement Document (FRD), SRS, Project Charter, Product Vision Document, and process maps, to better understand the projects and align with business objectives.

●Conducted requirements clarification meetings with Subject Matter Experts (SME’s) by running multiple sessions to clarify business questions and ambiguities of the development team.

●Wrote user stories with detailed acceptance criteria for each, ensuring that a story meets specific business requirements and functional goals. Proactively managed dependencies with the back-end team.

●Led backlog refinement meetings to review upcoming sprint work, ensure the team was well prepared for forthcoming tasks, and optimize sprint efficiency.

●Participated in Agile Scrum Sprint Review/Demo meetings held at the end of each sprint with stakeholders to show completed features and gather feedback for continuous improvement, ensuring transparency and allowing necessary adjustments based on stakeholder input.

●Held regular sync-up meetings with the PO, three times per week, to capture any requirement updates or clarifications on the project at hand. This regular communication kept the team aligned with evolving project objectives while issues were resolved with urgency.

●Leveraged Apache Kafka for large-scale data ingestion, using Apache Spark and Hadoop to process historical transaction data in a distributed, horizontally scalable environment; the integration supported the efficient data flow and transformation crucial for handling the complex datasets used in risk scoring.

●Cleaned and transformed data into analysis-ready, quality datasets using Pandas and PySpark for Machine Learning risk-scoring models, applying feature engineering to surface suspicious activities.
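
The cleaning and feature-engineering steps can be sketched in plain Python (the production pipeline used Pandas/PySpark); the records, field names, and the 5000 threshold are invented for illustration.

```python
# Stdlib sketch of cleaning: deduplicate, impute missing values,
# and add one engineered feature for the risk model.
raw = [
    {"txn_id": "T1", "amount": 250.0,  "country": "IN"},
    {"txn_id": "T1", "amount": 250.0,  "country": "IN"},   # duplicate
    {"txn_id": "T2", "amount": None,   "country": "IN"},   # missing amount
    {"txn_id": "T3", "amount": 9800.0, "country": "RU"},
]

def clean(rows, default_amount=0.0):
    seen, out = set(), []
    for r in rows:
        if r["txn_id"] in seen:          # drop duplicate transactions
            continue
        seen.add(r["txn_id"])
        r = dict(r)
        if r["amount"] is None:          # impute missing amounts
            r["amount"] = default_amount
        # engineered feature: flag unusually large amounts
        r["high_value"] = r["amount"] > 5000
        out.append(r)
    return out

for row in clean(raw):
    print(row)
```

In Pandas the same steps are drop_duplicates, fillna, and a vectorized comparison column.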

●Assisted the technical team in fine-tuning the risk score models using Scikit-Learn and TensorFlow, enhancing the models' accuracy in identifying and classifying transaction risks. These models were instrumental in identifying fraud patterns with high precision and reliability.

●Performed extensive API testing with POSTMAN, including testing of GET, POST, PUT, and PATCH methods at endpoints, and documented the results in SharePoint. This testing ensured APIs were secure, reliable, and functioning properly to maintain data consistency on the platform.

●Coordinated with compliance teams via JIRA and ServiceNow to manage true-positive alerts and maintain compliance with regulatory standards. The setup allowed for easy communication and quick resolution of flagged issues.

●Created System Process Maps, documented both Functional and Non-Functional Requirements, and translated them into user stories in JIRA. SharePoint was used as a central repository to ensure information was easily available for reference by all team members and stakeholders.

Environment: Agile Scrum, Rally Board, JIRA, SharePoint, ServiceNow, Apache Kafka, Apache Spark/ Hadoop, Python (Pandas, PySpark, Scikit-Learn, Tensorflow), POSTMAN.

Dcube Data Services, Hyderabad, India Jan 2015 - July 2017

Data Analyst

The scope covered cleansing, validation, and staging of SAP data, with further enrichment of the data to reveal insights. These insights were visualized in dashboards for better identification and monitoring of suspicious activities.

Roles & Responsibilities:

●Responsible for gathering data requirements from Business Analysts, performing exploratory data analysis, and identifying the data sources/patterns required for the requests.

●Collaborated with cross-functional Agile teams to understand project requirements & deliver data-driven solutions.

●Spearheaded fraud detection initiatives using advanced ML models to identify suspicious transactions, resulting in a 25% reduction in fraudulent transactions and fewer human errors in prediction.

●Extracted, Analyzed and Presented Data in a Graphical Format using PowerBI as a part of Data Visualization.

●Worked with complex datasets and integrated SAP ERP data into an analytical platform. Experienced in using statistical methods such as regression models, random forests, and decision trees for prediction.

●Developed various SQL and Python scripts as part of the automation and reduced manual effort by 70%.

●Proficient in importing/exporting large amounts of data from Excel files to SQL server and vice versa.

●Developed predictive models/algorithms using Azure and statistical analysis to meet customer demand.

●Worked on monitoring and data governance to evaluate data quality and ensure secure, reliable data.

Environment: Agile Scrum, JIRA, MS Office, SQL Server, Power BI, Python (Pandas, Scikit-learn, TensorFlow), Google Analytics.


