JEYASEELAN JAYAPAL
E-mail: **********.*******@*****.*** Mobile: +1-201-***-**** LinkedIn: Profile
Professional Summary:
Solutions-oriented professional with progressive experience in Software Quality Assurance, Test Management, Test Automation, API, ETL, Performance, and Web UI testing within large-scale Digital, Cloud, and Data transformation programs delivered in a fast-paced Agile/Scrum model.
Gained extensive domain knowledge in Banking and Insurance: CRMS tools, Sales, Deals, AML-KYC, Salesforce, MS Dynamics, Global Payments, Wire, Fedwire, ACH, SWIFT, FIX Protocol, Client Reference Data, Client Onboarding, General Insurance, P&C Insurance, Guidewire Policy Center, Rating, and Pega Claims.
Key Skills, testing tools & techniques
Test Automation - Java, Selenium, Cucumber BDD, JUnit, TestNG, UFT, TOSCA
API - Java, MQ, Kafka, Microservice contract testing, TIBCO, Postman, Parasoft SOAtest
Performance Testing - JMeter, LoadRunner, RPT, SiteScope, Dynatrace, Splunk
ETL/ELT - DB2, Oracle, SQL Server, PostgreSQL, Teradata, SQL, HQL, Hive, Azure SQL DB, Snowflake DB, AWS Redshift, GCP BigQuery, DWH and Data Lake
BI - MicroStrategy, Cognos, Power BI, and Tableau reports
Other tools - ALM, JIRA, ServiceNow ITSM, IRM, qTest, Rally, Bitbucket, Jenkins, Ansible
Experience Summary:
QA / Test Management Experience:
10+ years of experience as a QA Lead in end-to-end testing, test process, test automation, performance, ETL, and API testing for enterprise-level strategic programs.
Provided consulting services: leading workshops, assessing testing maturity, and identifying opportunities to build strategies for implementing full-scale testing programs.
Experience sizing testing efforts and managing projects to a budget and timeline.
Hands-on experience as a Test Lead working with developers, product specialists, business analysts, and project managers in Agile and Waterfall methodologies.
Worked with Scrum teams to resolve blocker issues.
Track and maintain JIRA Epic/Release/Story/Task/Sub-Task tickets.
Managed business partners and business users in UAT testing and production cutover testing.
Define and implement Test Strategy, Test Plan, and automation frameworks to ensure a high standard of testing and quality is in place.
Execute and evaluate software quality assurance across the SDLC by conducting System, Regression, Integration, and End-to-End Testing. Isolate, replicate, and report quality assurance defects and verify defect fixes. Conduct defect reviews and publish status reports.
Automation & Functional Testing Experience:
7+ years of experience in automation using Java, Selenium, and the Cucumber BDD framework.
Implemented a Java-based in-house hybrid RAFT automation framework for Web UI, mainframe applications, API, and ETL testing, utilizing BDD, Data-Driven Testing (DDT), Page Object Model, and Page Factory design patterns (a sketch of the pattern follows this list).
Developed automation test scripts using Java, Selenium WebDriver, Cucumber Gherkin, TestNG, JUnit, and SQL.
Executed automated test scripts and logged bugs in JIRA for resolution.
Built customized HTML reports for step-level results and test summaries.
Experienced in CI/CD integration with GitHub/Bitbucket and Jenkins.
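A minimal sketch of the Page Object Model pattern referenced above, assuming standard Selenium WebDriver and TestNG dependencies; the login page, locators, and URL are illustrative and not taken from the RAFT framework.

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.testng.Assert;
import org.testng.annotations.AfterClass;
import org.testng.annotations.BeforeClass;
import org.testng.annotations.Test;

// Page object: encapsulates locators and actions for a hypothetical login page.
class LoginPage {
    private final WebDriver driver;
    private final By userField = By.id("username");
    private final By passField = By.id("password");
    private final By loginBtn  = By.id("login");

    LoginPage(WebDriver driver) { this.driver = driver; }

    void login(String user, String pass) {
        driver.findElement(userField).sendKeys(user);
        driver.findElement(passField).sendKeys(pass);
        driver.findElement(loginBtn).click();
    }
}

// TestNG test that drives the page object instead of raw locators.
public class LoginTest {
    private WebDriver driver;

    @BeforeClass
    public void setUp() { driver = new ChromeDriver(); }

    @Test
    public void validLoginShowsDashboard() {
        driver.get("https://example.test/login");   // illustrative URL
        new LoginPage(driver).login("qa_user", "secret");
        Assert.assertTrue(driver.getTitle().contains("Dashboard"));
    }

    @AfterClass
    public void tearDown() { driver.quit(); }
}

Keeping locators inside the page object means UI changes are absorbed in one class rather than in every test.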
API / Middleware MQ Testing Experience:
7+ years of experience in middleware integration testing using tools such as Postman, Parasoft SOAtest, Kafka, IBM MQ, Mule Anypoint, AppWatch, and a Java-based automation framework.
Extensive experience with SWIFT messages MT102, MT103, EXEC983, FLM, XML, Pain.001, and Pacs.008 file formats, FTP, Wire, ACH payments, Fedwire, FIX Protocol, FIXML, TRIG, RDF, web services, RESTful services, microservice contract testing, JSON, XML, and other formats.
Prepared test cases, test messages, and files based on feature requirement specifications.
Implemented automation framework development in Java with Cucumber BDD and Data-Driven Testing (DDT) methodologies for API, MQ, and RESTful services testing (a basic REST check is sketched below).
Executed API test scripts and logged bugs in JIRA for resolution.
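A minimal sketch of the kind of Java-based REST validation described above, using only the JDK's built-in HttpClient; the endpoint and expected payload field are hypothetical.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Calls a hypothetical party-reference REST endpoint and performs basic
// status-code and payload checks, mirroring an API smoke test.
public class PartyApiSmokeTest {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://api.example.test/party/12345")) // illustrative URL
                .header("Accept", "application/json")
                .GET()
                .build();

        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());

        // Assert on HTTP status and a field expected in the JSON payload.
        if (response.statusCode() != 200) {
            throw new AssertionError("Expected 200 but got " + response.statusCode());
        }
        if (!response.body().contains("\"partyId\"")) {
            throw new AssertionError("Response missing expected partyId field");
        }
        System.out.println("Party API smoke test passed");
    }
}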
ETL-BIG Data Testing Experience:
Around 7 years of experience in ETL and ELT processes, big data Hadoop testing (HDFS, Hive, Sqoop, HBase), RDBMSs (Oracle 11g, Netezza, Teradata, DB2, SQL Server), and BI reporting (MicroStrategy, Cognos, Power BI, Tableau).
Perform ETL and ELT data validation testing on data lake and data warehouse applications, on-premises and in the cloud.
Data migration integrity validation between legacy databases, cloud platforms (AWS, Microsoft Azure SQL Database), and Hadoop big data.
Exposure to cloud data platforms GCP BigQuery and Azure Synapse.
Utilized the RAFT automation framework built on Java and Cucumber TDD/BDD techniques for ETL and data validation.
Proficient in writing complex SQL queries to validate ETL/ELT test scenarios (a simple source-to-target reconciliation is sketched after this list).
Validation of data integrity, quality, uniqueness, structure, and field mapping.
Validated aggregation, FACT, and DIM table business logic within DWH tables.
Executed ETL test cases and logged defects for data discrepancies.
Executed Informatica and AutoSys jobs for data refreshes in LLEs.
Validated MSTR, Cognos, and Tableau report data against source databases.
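A minimal sketch of the source-to-target reconciliation referenced above, assuming JDBC connectivity to both databases; connection URLs, credentials, and table names are placeholders.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Compares row counts between a source staging table and its target DWH table,
// a basic ETL completeness check; URLs, credentials, and tables are placeholders.
public class EtlRowCountCheck {
    private static long count(String url, String user, String pwd, String table) throws Exception {
        try (Connection con = DriverManager.getConnection(url, user, pwd);
             Statement st = con.createStatement();
             ResultSet rs = st.executeQuery("SELECT COUNT(*) FROM " + table)) {
            rs.next();
            return rs.getLong(1);
        }
    }

    public static void main(String[] args) throws Exception {
        long source = count("jdbc:oracle:thin:@//src-host:1521/SRC", "qa", "***", "STG.CUSTOMER");
        long target = count("jdbc:postgresql://tgt-host:5432/dwh", "qa", "***", "DWH.DIM_CUSTOMER");

        if (source != target) {
            throw new AssertionError("Row count mismatch: source=" + source + " target=" + target);
        }
        System.out.println("Row counts match: " + source);
    }
}

The same structure extends to checksum or column-level comparisons by swapping the query inside count().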
Performance Testing Experience:
5+ years of experience in performance testing, utilizing JMeter, LoadRunner, and RPT.
Design, develop, and execute system performance tests: load, stress, scalability, spike, volume, and endurance/soak testing.
Create performance test scripts in JMeter for Web UI, API, and DB calls (the underlying concurrent-load idea is sketched below).
Experience identifying memory leaks, connection issues, and throughput bottlenecks in web applications, infrastructure, and the cloud.
Monitoring using APM tools and publishing performance metrics such as resource data, CPU, network and database utilization, and garbage collection.
Strong knowledge of monitoring tools: CA Wily Introscope, SiteScope, Dynatrace, Splunk.
Certified professional in HP LoadRunner and IBM Rational Performance Tester.
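An illustrative Java sketch of the concurrent-load idea that JMeter thread groups express declaratively; it is not a JMeter script, and the user count, iteration count, and URL are hypothetical.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Fires a fixed number of concurrent requests and reports simple latency
// percentiles - the same idea a JMeter thread group automates.
public class SimpleLoadProbe {
    public static void main(String[] args) throws Exception {
        int users = 20, iterations = 10;                 // hypothetical workload
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest req = HttpRequest.newBuilder()
                .uri(URI.create("https://app.example.test/health"))  // illustrative URL
                .GET().build();

        ExecutorService pool = Executors.newFixedThreadPool(users);
        List<Future<Long>> futures = new ArrayList<>();
        for (int i = 0; i < users * iterations; i++) {
            futures.add(pool.submit(() -> {
                long start = System.nanoTime();
                client.send(req, HttpResponse.BodyHandlers.discarding());
                return (System.nanoTime() - start) / 1_000_000;   // millis
            }));
        }

        List<Long> latencies = new ArrayList<>();
        for (Future<Long> f : futures) latencies.add(f.get());
        pool.shutdown();

        Collections.sort(latencies);
        System.out.println("p50=" + latencies.get(latencies.size() / 2) + "ms, "
                + "p95=" + latencies.get((int) (latencies.size() * 0.95)) + "ms");
    }
}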
Career Profile:
Role | Organization | Location | Date
QA Automation Lead (APIs, Web UI and Database) | Bank of America (Client) / Mitchell Martin Inc. | New York City, USA | Oct 2021 – Till Date
QA Lead / E2E Test Lead, Automation/Performance/ETL (APIs, ETL and Web UI) | Bank of America (Client) / AIG (Client), Tata Consultancy Services | New York City, USA / Chennai, India | Dec 2012 – Oct 2021
Performance Test Consultant (Web UI, APIs, and DB) | Bank of America (Client), Tata Consultancy Services | Chennai, India | Dec 2011 – Nov 2012
Test Automation Engineer (Functional Web UI & APIs) | Bank of America (Client), Tata Consultancy Services | Chennai, India | Feb 2011 – Dec 2011
Test Lead & UAT Test Coordinator (AML-KYC) | Bank of America (Client), Tata Consultancy Services | Chicago, USA | Apr 2009 – Feb 2011
Test Engineer (ETL & BI) | Bank of America (Client), Tata Consultancy Services | Chennai, India | Oct 2006 – Mar 2009
Educational Qualifications
Degree and Date | Institute | Major and Specialization
Master of Philosophy in Computer Science - 2006 | Bharathiyar University, Coimbatore, TN, India | Computer Science
Master of Science in Computer Science - 2004 | Vysya College, Periyar University, Salem, TN, India | Computer Science
Details of Assignments
Project Title # 1: GMAR - Client Reference Data, New York City
Client: Bank of America, Mitchell Martin Inc.
Role: QA Automation Lead
Duration: Oct 2021 – Till Date
Description
Client Reference Data:
The Client Reference Data platform consists of a suite of applications, such as Client REST APIs, Refservice APIs, Content Management APIs, SPARQL, the Cesium Web UI, Client Onboarding workflows, AML-KYC profile, AML-KYC questionnaire, AML-KYC restrictions and closures, and the GMAR database applications LATR and Account AML reports.
1. APIs - The Client REST API and Refservice APIs are built on RDF/TRIG formats to publish and validate party data into Cesium and to allow upstream and downstream applications to access and publish party, AML-KYC, and accounting-related data within the bank.
2. Web UI - The Cesium UI is used for publishing and tracking newly onboarded clients, onboarding workflows, and their AML-KYC reference data.
3. Database - The GMAR DB application consists of AML regulatory reports, LATR, Account AML, and reconciliation reports. LATR feeds data daily from GM data and validates AML regulatory compliance, and Account AML feeds data from Cesium and validates party AML compliance.
The automation scope is to adopt and customize the in-house Java-based automation framework RAFT and the Litmus Test execution tool, develop automation scripts for API, Web UI workflow, and database validation functionalities, and perform functional and regression testing.
Responsibilities
•Understand and implement the bank's in-house Java-based automation platform RAFT and the Litmus Test execution tool.
•Collaborate with the RAFT support team and adhere to the changes introduced.
•Participated in daily scrum standup calls, sprint planning, and retrospectives.
•Worked with the scrum team to resolve blocker issues.
•Track and maintain JIRA Epic/Release/Story/Task/Sub-Task tickets.
•Define and implement Test Strategy and planning, and enhance the framework to ensure a high standard of testing and quality is in place for the in-scope applications.
•Prepare test cases and RDF/TRIG test data files based on the feature requirement specification for the Client REST API and Refservice applications.
•Prepare stubbed test data for microservice contract testing and execute it in Pact.
•Create Cucumber feature files using BDD Gherkin.
•Develop test scripts in Java/Selenium WebDriver for client onboarding UI workflows (a step-definition sketch follows this list).
•Prepare performance test scripts in JMeter for API and DB calls.
•Schedule and execute load, stress, and endurance tests and share the performance metrics with the development and project teams.
•Prepare complex SQL for ETL data validation and BI Tableau reports.
•Create ETL test automation scripts in Java using the Cucumber BDD test methodology.
•Execute and review scripts to ensure bug-free script deliverables.
•Check in and check out automation code in Bitbucket.
•Execute automation tests and report defects in JIRA.
•Participate in daily Agile scrum meetings and update JIRA tickets.
•Schedule Litmus tests for daily health check, sanity, and regression testing.
•Perform release coordination in ROTA by developing Ansible playbook scripts.
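A minimal sketch of a Cucumber BDD step definition of the kind described above; the scenario, step text, and in-memory stand-in for the page object are illustrative, not the bank's RAFT code.

import io.cucumber.java.en.Given;
import io.cucumber.java.en.Then;
import io.cucumber.java.en.When;

import java.util.ArrayList;
import java.util.List;

// Backing Gherkin scenario (illustrative):
//   Scenario: Onboard a new client party
//     Given the onboarding UI is open
//     When I submit a new client named "Acme Corp"
//     Then the client appears in the search results
public class ClientOnboardingSteps {

    // Stand-in for the real page object; a Selenium page object would drive
    // the Cesium UI here instead of an in-memory list.
    private final List<String> onboardedClients = new ArrayList<>();
    private boolean uiOpen;

    @Given("the onboarding UI is open")
    public void theOnboardingUiIsOpen() {
        uiOpen = true;
    }

    @When("I submit a new client named {string}")
    public void iSubmitANewClientNamed(String name) {
        if (!uiOpen) throw new IllegalStateException("UI not open");
        onboardedClients.add(name);
    }

    @Then("the client appears in the search results")
    public void theClientAppearsInTheSearchResults() {
        if (!onboardedClients.contains("Acme Corp")) {
            throw new AssertionError("Client not found in search results");
        }
    }
}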
Tools
Java, Selenium WebDriver, Cucumber BDD, Postman, JIRA, RDF/TRIG, JMeter, Bitbucket, Ansible, DB2, SPARQL, Tableau
Project Title # 2: Portugal SAF-T Invoice Regulatory and Stamp Duty Monthly Reporting
Client: American International Group (AIG)
Role: QA Automation Lead / End-to-End Test Lead
Duration: Sep 2019 – Sep 2021
Description
Portugal SAF-T Invoice Regulatory:
Portugal was the first country to follow the OECD recommendation, adopting the Standard Audit File for Tax (SAF-T) in 2019. The Portugal SAF-T file has followed a phased development process with increasing requirements for companies and Guidewire Policy Center implementations.
The Portuguese Tax Authority (PTA) has stipulated that a certified invoicing system/software be used to produce invoices and relevant tax documents and that it have the capability to extract a SAF-T invoicing file reported to the PTA on a monthly basis. Such system certification is performed by the PTA in a controlled process.
Portugal SDMR:
This project requires a new process flow for generating the Stamp Duty monthly report in Portugal. Until now, SD amounts collected have been reported to the PTA (Portugal Tax Authority) in total; from Jan 2021 the stamp duty is to be reported per Tax ID, per SD code / type of transaction, per region. The following Iberia systems are in scope for this project:
1) Portugal GOALD premium data extract.
2) Portugal DM Web premium data extract.
3) New ETL job(s) for extraction, data loading, consolidation, and XML generation.
Responsibilities
•Test process governance for all QA programs to adhere to Testing Factory standards in the Agile methodology.
•Facilitated daily scrum standup calls, sprint planning, and retrospectives.
•Experience supporting Guidewire Policy Center, including the latest GWPC version for US P&C Insurance, and Pega Claims Center.
•Worked with the scrum team to resolve blocker issues.
•Track and maintain JIRA Epic/Release/Story/Task/Sub-Task tickets.
•Report scrum burndown charts, productivity, and backlogs.
•Define and implement Test Strategy, planning, and framework for testing.
•Analyze feature requirement specifications.
•Develop Cucumber BDD feature files and create Java step definitions.
•ETL test automation using Java and the Cucumber BDD test methodology.
•Prepare complex SQL for ETL data validation and BI Tableau reports.
•Prepare performance test scripts in JMeter for API and DB calls.
•Schedule and execute load, stress, and endurance tests and share the performance metrics with the development and project teams.
•Monitor the tests using the APM tool Dynatrace.
•Worked with ServiceNow ITSM and resolved ITSM tickets.
•Maintained Integrated Risk Management compliance with ServiceNow IRM.
•Prepare and publish performance test reports and key metrics.
•Maintained a metrics dashboard at the end of each phase and at the completion of the project.
Tools
Java, Selenium WebDriver, Cucumber BDD, JIRA, HP ALM/QC, Sybase 15/16, ETL DataStage, GitHub, MPP, VB.net, Power BI, Tableau, AWS, Azure SQL Database
Project Title # 3: Treasury Authorized Data Provisioning (TRADs), Hadoop Migration
Client: Bank of America
Role: QA Automation Lead / End-to-End Test Lead
Duration: Sep 2018 – Aug 2019
Description
Treasury Authorized Data Provisioning (TRADs) Phase 1:
Data migration from all the SORs of TRADs provisioning to staging while maintaining data quality, which is required for the reverse data lineage of the RDA assessment.
Phase 2:
Migration of data provisioning to big data using Hadoop, Hive, Sqoop, and Spark by creating a Unified Credit model (data model) to provide structured data to downstream consumers.
Responsibilities
Test process governance for the QA programs to adhere to Testing Factory standards in the Agile methodology.
Ensure that changes to the templates are published to the entire team.
Participate in Test Plan/Test Specification/Test Script review meetings.
Develop Selenium/Java scripts using the RAFT framework.
Develop automation test scripts in Java and Cucumber for data validation.
Prepare complex SQL and HQL for data validation.
Data integrity, data quality, data uniqueness, structure, and field validation.
Data cleanliness, scrubbing, and masking checks.
Data validation between HDFS and RDBMS by running Sqoop jobs.
SQL aggregation, FACT, and DIM table validation in the DWH.
Represent QA in daily scrum calls with the Scrum Master, Product, and Dev/QA teams to discuss issues and action plans.
Prepare/update the Test Completion Report at the end of the test phase.
Tools
Java, Selenium WebDriver, Cucumber BDD, JIRA, HP ALM/QC, Big Data, Hadoop, Spark SQL, Kafka, HDFS, Bitbucket, Azure SQL Database, Azure Data Studio, GCP, Snowflake DB
Project Title # 4: Middleware Technologies (SWIFT Mandates, ACH Rhino, Dynamic Currency Conversion, OFAC Scanning, GFD, GTMS & Other Apps, Payments Platforms)
Client: GBAM - Bank of America, USA
Role: End-to-End Test Lead / QA Lead, Middleware Data Services
Duration: Nov 2012 – Aug 2019
Location: New York City, NY, USA
Description
The SWIFT network updates and/or implements new message types in response to regulatory and AML requirements in every year's November release. SWIFT is the Society for Worldwide Interbank Financial Telecommunication; more than 10,000 banking organizations, securities institutions, and corporate customers in 212 countries use SWIFT formats daily to exchange millions of standardized financial messages.
To support these changes, Bank of America must make changes globally every November on the wire payments systems to accommodate any new messages or updates to existing messages.
Responsibilities
•Define and implement Test Strategy, planning, and framework to ensure a high standard of testing and quality is in place.
•Analyze and review the Request Processor mapping documents, Initiative Integrated Matrix (IIM), XSD, and copybook.
•Extensive experience in handling and preparing test payment file formats: SWIFT MT102, MT103, EXEC983, FLM, ISO Pain.001, Pacs.008, XML, XSD, FTP, Wire, ACH payments, Fedwire, FTR Informatica Data eXchange, RDF, and TRIG files.
•Validate and track the services using the bank's internal RPST/RPIST tools.
•Validation and verification of XML tags and message payload fields.
•Prepare stubbed test data for microservice contract testing and execute it in Pact.
•Created simulators for microservices for contract version and data validation.
•Prepare Kafka messages and validate them for producer and consumer pub/sub (a round-trip check is sketched after this list).
•Develop automation test scripts in the Cucumber BDD framework for service validation.
•Create Cucumber feature files, data port, and YAML file setup.
•Validate results in the downstream mainframe applications.
•Checkpoint review of SOA test scripts in Parasoft SOAtest for MQ and HTTPS services for regression testing.
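A minimal sketch of the Kafka producer/consumer round-trip validation mentioned above, using the standard Apache Kafka client API; the broker address, topic, and payload are placeholders.

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

// Publishes a test payment message and verifies a consumer can read it back,
// a basic pub/sub round-trip check; broker, topic, and payload are placeholders.
public class KafkaRoundTripCheck {
    public static void main(String[] args) {
        String topic = "payments.test";                       // hypothetical topic

        Properties prodProps = new Properties();
        prodProps.put("bootstrap.servers", "localhost:9092"); // placeholder broker
        prodProps.put("key.serializer", StringSerializer.class.getName());
        prodProps.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(prodProps)) {
            producer.send(new ProducerRecord<>(topic, "txn-001", "{\"amount\":100.00}"));
            producer.flush();
        }

        Properties consProps = new Properties();
        consProps.put("bootstrap.servers", "localhost:9092"); // placeholder broker
        consProps.put("group.id", "qa-roundtrip-check");
        consProps.put("auto.offset.reset", "earliest");
        consProps.put("key.deserializer", StringDeserializer.class.getName());
        consProps.put("value.deserializer", StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consProps)) {
            consumer.subscribe(Collections.singletonList(topic));
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
            boolean found = false;
            for (ConsumerRecord<String, String> r : records) {
                if ("txn-001".equals(r.key()) && r.value().contains("amount")) found = true;
            }
            if (!found) throw new AssertionError("Published message not consumed");
            System.out.println("Kafka pub/sub round-trip verified");
        }
    }
}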
Tools
IBM DataPower, IIB, Mule ESB, DB2, Mainframe, Oracle, Java, Cucumber, Parasoft SOATest, Quality Center, XML Spy, SSH Tectia, Notepad++, AppWatch, PathWAI, Azure SQL Database, Azure Data Studio.
Project Title # 5: CED Infrastructure & RAC Implementation
Client: GWBT – Bank of America, Chicago, USA
Role: Performance Test Lead
Duration: Apr 2009 – Nov 2012
Description
The RAC project addresses the current gap in the Bank's Client Data Management (CED) environment of not supporting a 24x7 global operation. CED is the System of Record for GCIB legal entity information supporting client regulatory and reporting requirements for BAC. With this RAC implementation, CED will have a stated SLA of 24x7 target uptime, and the necessary supporting processes will satisfy a 24x7 operational model, or at minimum a 24x6 operational model for helpdesk functions, to support the global user base introduced by the Merrill Lynch portfolio. The scope of this initiative is to convert the current database instances from single instances to Real Application Cluster (RAC) database instances.
Responsibilities
Define Performance Test Strategy, planning, and scheduling of the required level of load testing.
Project estimation and fund allocation for the team members.
Managed team schedules, milestones, and deliverables using internal tools for all technical issues related to scripting and configuration.
Designed and developed LoadRunner scripts based on the most performance-centric workflows.
Tracked daily progress, reported issues, and proposed resolutions.
Created executive summary reports for project stakeholders.
Participated in performance tuning analysis with the Engineering team.
Performance monitoring using CA Wily Introscope, Dynatrace, HP SiteScope, and Splunk.
Reviewed Oracle AWR reports and discussed performance tuning with DBAs.
Tools
LoadRunner, Rational Performance Tester, HP Quality Center, Oracle 10g, SSH Tectia Client
Project Title # 6: Rational Rollout – Bank of America, USA
Role: Performance Engineer
Duration: Jan 2008 – Apr 2009
Description
The Rational rollout effort consisted of conducting research and a proof of concept on Rational 2007 tools such as RFT, RPT, RST, and Rational ClearQuest. IBM signed off on the proof of concept. All automation, performance, and SOA assets were converted from the HP Mercury tool set to the Rational tool set.
Responsibilities
Building out the Rational test environment.
Proof of concept and sign-off on the Rational tool set.
Design and analyze automation and performance use cases.
Review the automation and performance use cases with the project stakeholders.
Develop automation and performance scripts using the Rational tool set.
Review the automation and performance scripts with the project stakeholders.
Execute the automation and performance scripts and share the test results with project stakeholders.
Tools
HP LoadRunner, Rational Performance Tester, Rational Services Tester, Rational ClearQuest, HP Quality Center
Project Title # 7: 1. UB Profitability Consolidation, 2. LaSalle Bank Conversion – Bank of America, USA
Role: ETL Test Engineer
Duration: Oct 2006 – Jan 2008
Description
With BOA's acquisition of LaSalle Bank, there was a need to migrate relevant LaSalle clients and prospects to the existing CED database. Conversion of all required data elements provides the foundation for moving from the LaSalle tool suite to the CRMS CIBR tool suite and supports other CRMS efforts and applications. LaSalle client data is extracted from an MS Access database as a tilde-delimited text file and loaded into a staging table. Only success and exception records are moved to CED, and all result actions must have an entry in the result table. We migrated the Client MI data, Market Manager data, Sales Logic data, Guarantor data, and Eagle Leasing data during this project.
UB Profitability Consolidation's primary purpose is to provide a single source of profitability data to Commercial and GCIB profitability tools in Universal Bank. To implement these changes, the work effort is broken up into various work packets. Each work packet is split based on the financial data and account details; PDB and PRF tables are implemented to consolidate the profitability data, which is then moved into ACCUMULATOR tables.
Responsibilities
Analysis and understanding of the design documents: HLD, LLD, and mapping documents.
Preparation of test cases using Oracle SQL queries.
Executing the SQL test cases to validate data integrity and data consistency.
Executing the SQL test cases to validate the transformation logic.
Executing the SQL test cases to verify that records migrated accurately.
Logging defects in Quality Center and publishing status reports.
Tools
Oracle 10g, Teradata, PL SQL Developer, HP Quality Center
PERSONAL DETAILS
Sex: Male
Nationality: Indian
Marital Status: Married
Current Location: New Jersey, USA
Contact Number:
Visa: H1B, I-140 Approved – Priority Date Apr-2018