Professional Summary
●* years of experience as a Business Systems Analyst with in-depth knowledge of documenting business process flow diagrams, requirements traceability management (Waterfall and Agile), requirements documentation, back-end and front-end systems design, and project management.
●Over 5 years of experience as a Senior QA Software Tester with a detailed understanding of business requirements, deployments, and the Software Development Life Cycle (SDLC).
●Expertise in Agile and Waterfall processes with good knowledge of testing methodologies for both front-end and back-end testing.
●Ability to interact with business analysts to understand processes, gap analysis, and end-to-end (E2E) application flow.
●Good experience in Oracle, SAS, SQL, PL/SQL, logical and physical database design, and backup and recovery procedures.
●Knowledge of testing strategy covering test data setup, test environment/access, level of effort (LOE) estimation, and project resource allocation.
●Strong analytical skills and experience with implementation and administration of Software Quality Assurance metrics and measurements against entry/exit criteria.
●Good knowledge of test plan development, business requirement gap analysis, and test plan review and impact analysis for each release.
●Possess well-rounded testing skills with a firm understanding of functional end-to-end testing, system testing, batch job processing, User Acceptance Testing (UAT), and regression testing.
●Experience working with current web-based technologies; solid Oracle/SQL skills to validate data and generate test conditions in test cases.
●Ability to provide effective communication regarding issues, objectives, initiatives and performance measurements against set plan.
●Good experience in managing and working with distributed (onshore/offshore) team members.
●Worked in production to support the release-night team with testing, production data management, test execution, and critical issue management.
●Industry experience in the analysis, design, development, systems integration, testing, reporting and implementation of application software.
●Excellent written and verbal communication skills including experience in proposals and presentations.
●Strong communicator and creative problem-solver, skilled in analyzing business requirements and transforming them into technical designs, applying technical skills and business knowledge to achieve sound results.
●Well-organized, quick learner, self-motivated, team player and achiever.
Technical Skills
RDBMS: Oracle 8.x, 9.x, 10.x, 11.x, 12.x, SQL Server, MS Access
Utilities: SQL*Loader, Export/Import, Data Pump, RAC, RMAN
Version Control System: Rational ClearCase, TeamTrack, SharePoint
Bug Tracking: ClearQuest, TeamTrack, JIRA, Quality Center
Tools: OEM, TOAD, Erwin, ETL, MS Visio 2003, Wiki, Confluence, Clarity, Microsoft Project, SQL Developer, SoapUI, Cognos Developer
Programming Languages: SAS, SQL, PL/SQL, C, C++, Excel VBA, Shell Scripting
Operating Systems: UNIX (Linux, Sun Solaris), Windows 98/2000/NT/XP/7
Education
●BS Engineering, Embry-Riddle University, Daytona Beach, FL
Professional Experience
Cenlar FSB, NJ June ’17 - Current
Technical Analyst
The Technical Analyst role at Cenlar spans all aspects of technical analysis: working in the weeds at a granular level, digging into data and process analysis, writing requirements, owning process flows and technical flows, asking the detailed questions, and documenting the current-state process while articulating future-state flows; the role demands strong attention to detail.
Responsibilities:
• Support projects involving Cenlar’s core servicing business areas, translating business requirements and functional specifications to technical specifications and requirements.
• Assist the Business in refining their requirements by asking questions, engaging in conversations, documenting workflows, and using other communication means to ensure alignment and understanding of business objectives to the technical needs and solutions.
• Work with Vendors to implement third-party software.
• Develop various options to solve business problems supported by factual information for each option.
• Mitigate risk and exposure of solutions by ensuring proper controls are well integrated into the project/solution design.
• Prepare highly detailed workflows, configuration drawings, system reference charts, and documentation to aid in the understanding of current and future state processes and systems.
• Perform data mapping and data element translations between source and target systems (an illustrative sketch follows this list).
• Prepare technical requirements for all required reporting.
• Define operational elements of projects and solutions, such as file runtime and frequency, file delivery schedules, holiday scheduling, data profiling parameters, and other checks and balances.
• Prepare support documentation for use post-deployment of the project/solution.
• Work to ensure technical areas have complete requirements in order to be effective and efficient.
• Ensure project delivery plans are complete and represent the technical deliverables properly, ensuring Technology’s ability to achieve its overall milestones and meet deadlines.
• Ensure operational considerations are in place for proper handling of data files and report data.
• Implement projects conforming to strategic tools, leveraging solutions already implemented.
• Highlight additional areas of improvement while executing project work, and make recommendations for solving those deficiencies.
• Follow PMO and Technology project delivery standards and methodology.
• Prepare and submit status reports at predefined frequency. Act as central source for communications from Technology area; ensure status, progress, issues and resolutions are properly communicated upward and outward.
• Prepare test cases and scenarios using all existing documents from the PMO and Business, while also adding the technical components required for testing in which the Business would not need to be involved.
• Socialize future state configurations for acceptance by the Business and Technology Managers.
• Work within policy and procedures to ensure effective development and implementation of project related components.
• Consult with Technology Managers on any unknowns and validate assumptions; immediately communicate any gaps in requirements, potential risks and issues and work appropriately toward closure.
• Act in an advisory and intermediary capacity during the development phase to broker all issues and inconsistencies a Developer may uncover; work with Business Operations and Technology teams to properly resolve and/or address the issue in a mutually agreeable manner.
• Participate in design sessions, highlighting areas requiring greater focus, adding Business Operations understanding to the technical teams, and for overall understanding of the solution.
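Illustrative sketch only of the kind of source-to-target mapping check described above; the table names (src_loan_feed, tgt_servicing_loan) and status codes are hypothetical placeholders, not actual Cenlar objects.

-- Flag records where the target value does not match the value expected
-- from the documented source-to-target mapping (all names hypothetical).
SELECT s.loan_id,
       s.status_code      AS source_status,
       t.servicing_status AS target_status
FROM   src_loan_feed s
JOIN   tgt_servicing_loan t
       ON t.loan_id = s.loan_id
WHERE  t.servicing_status <>
       CASE s.status_code
            WHEN 'A' THEN 'ACTIVE'
            WHEN 'P' THEN 'PAID_OFF'
            WHEN 'F' THEN 'FORECLOSURE'
            ELSE 'UNMAPPED'
       END;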
J.P. Morgan Chase, DE May ‘14 – December ‘16
Business Systems Analyst
Customer Exposure Management System Project – Business Systems Analyst/Data Lead
Responsibilities:
●Create/document business requirements and software requirement specification documents and perform gap analysis to identify variances between customer operations and business requirement logic.
●Document high-level and low-level business use cases to provide IT counterparts an understanding of current business processes for developing and implementing the proper application refinements to meet business needs.
●Create/document Business Requirements Documents and Impact Matrices for the business QA team to provide the best possible LOE based on requirements.
●Create and establish standards, guidelines and processes that improve system and reporting efficiencies.
●Create a data inventory management system with the ability to track data lineage and usability (a minimal sketch follows).
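A minimal sketch of the data inventory structure referenced above, assuming Oracle syntax; all table and column names are hypothetical.

-- Hypothetical structure for tracking data elements, their lineage and usage.
CREATE TABLE data_inventory (
    element_id     NUMBER        PRIMARY KEY,
    element_name   VARCHAR2(100) NOT NULL,   -- business name of the data element
    src_system     VARCHAR2(50),             -- system of record
    src_table_col  VARCHAR2(200),            -- physical source location
    target_report  VARCHAR2(100),            -- downstream report that consumes it
    transformation VARCHAR2(400),            -- mapping/derivation rule, if any
    last_validated DATE                      -- supports usability tracking
);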
Basel Committee on Banking Supervision Project – Business Systems Analyst
In January 2013, the Basel Committee on Banking Supervision (BCBS) issued the Principles for effective risk data aggregation and risk reporting to strengthen banks' risk data aggregation capabilities and internal risk reporting practices. The Principles cover overarching governance and infrastructure, risk data aggregation capabilities, risk reporting practices, and supervisory review, tools and cooperation.
A Data Aggregation/Integration workstream project was initiated to comply with the regulatory and Basel requirements. The project documents and assesses the current state of data aggregation and defines technology requirements to ensure all required elements used in critical reporting are migrated to the ICDW environment.
Responsibilities:
●Establish tracking, monitoring and reporting mechanisms to outline progress check points.
●Analyze the current reporting inventory and reverse engineer existing reports
●User Acceptance Testing on multiple Delivery Platforms
●Prepare Test Plan and Test Strategy
●Defect Analysis and Report Generation
●System and Functional Testing
●Test Case Creation
●Involved in different major and minor release testing by updating test plan with testing methodology, reviewing test results, analyzing defects and resolving them with the team
●Create in-depth data gap analysis and data mapping between legacy report UPDs and current BCDs
●Identify potential risks and pitfalls with current processes in place and manage deliverables and expectations without compromising timelines
●Created traceability matrix between requirements and test cases using Quality Center. Provide impact test analysis on application-function changes due to revisions in requirements and/or user requests.
●Assisted in creating, streamlining and implementing firm wide adoption process.
●Interacted with System Analyst to analyze business requirements, gap analysis, E2E flow of application and involved in impact analysis for changes in business rules.
●Developed complex SQL scripts in Teradata to validate data and report metrics (an illustrative sketch follows this list).
●Trained new hires to get up to speed on assigned projects by sharing project-level information.
●Provide subject matter expertise on the use of source systems data as well as educate team on data, metadata and standards
●Accurately document system changes and follow procedures to ensure system integrity
●Recommend and establish standards, guidelines and processes that improve system and reporting efficiencies
●Performed multiple team roles (team member, team co-leader) in the development and implementation of key initiatives
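Illustrative sketch only of the kind of Teradata validation/gap-analysis query described above; legacy_upd_report and icdw_bcd_report are hypothetical placeholders for a legacy report extract and its ICDW counterpart.

-- Compare aggregated exposure by business date to surface reporting gaps
-- between the legacy and current sources (all names hypothetical).
SELECT COALESCE(l.business_date, c.business_date) AS business_date,
       l.total_exposure AS legacy_total,
       c.total_exposure AS current_total
FROM   (SELECT business_date, SUM(exposure_amt) AS total_exposure
        FROM   legacy_upd_report GROUP BY business_date) l
FULL OUTER JOIN
       (SELECT business_date, SUM(exposure_amt) AS total_exposure
        FROM   icdw_bcd_report GROUP BY business_date) c
       ON c.business_date = l.business_date
WHERE  l.total_exposure <> c.total_exposure
   OR  l.total_exposure IS NULL
   OR  c.total_exposure IS NULL;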
Verizon Wireless, NJ May ’11 – April ’14
Involved in system and UAT testing of a store locator management application for managing store hours, special hours, administrative information, and other functionality. Front-end testing covered the Java application, and back-end testing was performed in UNIX against an Oracle database. The application can generate reports, and report data is validated with SQL scripts (a minimal sketch follows).
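A minimal sketch of the kind of back-end SQL validation used here; store_special_hours and store_hours_report are hypothetical stand-ins for the Oracle tables and the report extract behind the store locator.

-- Find stores whose special hours were not reflected in the report extract
-- consumed by the front end (all names hypothetical).
SELECT sh.store_id, sh.special_date, sh.open_time, sh.close_time
FROM   store_special_hours sh
LEFT JOIN store_hours_report r
       ON r.store_id    = sh.store_id
      AND r.report_date = sh.special_date
WHERE  r.store_id IS NULL
   OR  r.open_time  <> sh.open_time
   OR  r.close_time <> sh.close_time;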
Responsibilities:
●Analyzed business requirements, performed gap analysis with a focus on customer experience, and designed and developed QA test plans/test cases using use cases and test scenarios covering the E2E application flow.
●Applied various testing methodologies in test cases to cover all business scenarios for quality coverage.
●Baselined QA test plans, communicated with the SDLC team, developed the testing strategy, and established the project LOE.
●Interacted with the development team to understand design flow, participate in code reviews, and discuss unit test plans.
●Created traceability matrix between requirements and test cases using Quality Center. Provide impact test analysis on application-function changes due to revisions in requirements and/or user requests.
●Set up the testing environment, maintained release notes, set up test data, and planned acceptance of code deliveries through sanity testing.
●Worked on automation for back-end processes; created scripts in Java/Shell to generate billing records and XML requests instead of typing the records/requests manually.
●Automated key regression-testing processes with UNIX shell scripts to minimize test execution time.
●Performed defect root cause analysis, tracked defects in Quality Center, managed defects by following up on open items, and retested defects during regression testing.
●Performed integration testing with external teams and UAT with the business team to plan and execute tests, identify and manage defects, report results, and implement business processes.
●Proactively managed and troubleshot problems, often making independent decisions that impacted the overall system and projects.
●Understood the settlement and business processes and provided technology solutions that benefit the product and vendor implementations.
●Set up the training environment and in-house Ambler Bank user data for each release's training based on the user guide. Participated in release demos with training users and collected technical and business feedback with the team.
●Interacted with clients to provide testing for integrated products and handled ad hoc and day-to-day business requests.
●Standardized the QA process by establishing standard QA documents and procedures to enhance daily operations.
●Contributed to QA process improvement by providing technical feedback in key-learning sessions after each major release.
●Supported QA upper management by maintaining test execution and defect reports and communicating critical blocking issues.
●Trained new hires to get up to speed on assigned projects by sharing project-level information.
Bank of America, NJ Dec ’09 – Apr ’11
Senior QA Software Tester
Involved in testing of a Consumer and Business Billing / Customer Payment automated billing information system via GUI, with real-time transactions made through API calls using XML request/response and a back end on an Oracle database running on HP UNIX. Involved in E2E application testing across system testing, QA test plan/test case documentation, defect tracking in Quality Center, UAT, and release support.
Responsibilities:
●Interacted with System Analysts to analyze business requirements, provide gap analysis, end to end flow of application and involved in impact analysis for changes in business rules.
●Designed and developed QA test plans/test cases using testing methodologies with all possible use cases to cover all testable functions in positive and negative testing.
●Provided test metrics on the results of system testing activities; coordinated and collaborated with others in analyzing collected requirements to ensure system development and implementation plans meet customer needs and expectations.
●Confirmed and prioritized project plans and deliverables per customer requirements and deadlines; and stayed current on emerging tools, techniques and technologies related to software testing.
●Helped drive automation innovation (participated in testing tool analysis and/or reviews, assisted and/or trained other manual testers in automation, implemented recommendation to improve test automation).
●Developed complex SQL scripts in Toad and Teradata to validate data, automated UNIX-based processes using UNIX utilities and shell scripting, and executed all test scenarios in minimal time (an illustrative sketch follows this list).
●Set up Test Environment including refresh database from production, maintained test environment with upcoming release.
●Involved in the key-learning process after each release to improve future processes as part of QA process management, including kick-off meetings, release management, key-learning sessions, etc.
●Interacted with team to resolve daily issues in installation, code delivery, documentation, and defects.
●Involved in testing of multiple applications with QA, SIT and UAT testing by documenting, reviewing, resolving defects and following up with interface sign-off.
●Supported Change Control by reviewing the impact on all modules, planning all release schedules, preparing back-out plans, etc.
●Supported the production support team on production trouble tickets by analyzing the root cause of problems and resolving them with the team.
●Involved in different major and minor release testing by updating test plan with testing methodology, reviewing test results, analyzing defects and resolving them with the team.
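Illustrative sketch only of the kind of data validation SQL described above; api_payment_request and billing_payment are hypothetical placeholders for the XML request log and the posted billing records.

-- Verify that every accepted API payment was posted with a matching amount
-- and an expected status (all names hypothetical).
SELECT a.request_id, a.account_no, a.payment_amt, b.posted_amt, b.status
FROM   api_payment_request a
LEFT JOIN billing_payment b
       ON b.request_id = a.request_id
WHERE  a.response_code = 'ACCEPTED'
  AND (b.request_id IS NULL
       OR b.posted_amt <> a.payment_amt
       OR b.status NOT IN ('POSTED', 'SETTLED'));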