Chandra Kakaraparthi
Sr QA Lead Automation & Manual Testing Expert: API, UI, and Database Testing
Email: *************@*****.*** Phone: 208-***-****
LinkedIn: linkedin.com/in/chandra-sekhar-kakaraparthi-37b66220
Work Authorization: U.S. Permanent Resident
Professional Summary
Experienced IT professional with 19 years of progressive expertise in Quality Assurance, spanning the investment, private, and retail banking, fintech, and insurance sectors.
Senior QA Lead with a strong background in software testing, automation, and quality assurance methodologies.
Proven ability to design and implement robust testing strategies that enhance software reliability and ensure compliance with industry standards.
Adept at designing and executing comprehensive test strategies to ensure high-quality software delivery while leading QA teams, mentoring engineers, and fostering collaboration across cross-functional groups to optimize testing efficiency.
Expert in database and ETL testing, proficient in developing complex SQL queries utilizing advanced joins, temporary tables, and query optimization for validating newly built tables and views.
Skilled in data integrity validation, ETL pipeline verification, and backend testing to ensure seamless data transformation and processing.
Designed and implemented robust automation test scripts using Java programming language with Selenium WebDriver, leveraging frameworks such as Page Object Model (POM), Page Factory, and TestNG.
Expert in automation testing of RESTful APIs using Postman for request validation, scripting, and test collections, and Newman for command-line execution and integration into CI/CD pipelines.
Proficient in Rest Assured with Java for robust API testing within automation frameworks.
Proficient in leveraging GitHub Copilot to accelerate automation script development, improve code quality, and enhance productivity in test automation frameworks.
Proficient in building and managing CI/CD pipelines using Jenkins and TeamCity, utilizing Git commands for version control, leveraging Maven and Gradle for automated builds, and integrating repository servers like GitHub for streamlined development workflows.
Deep business knowledge of financial instruments, including Equities, FX, Futures, Options on Futures, and other derivatives, with an understanding of front, middle, and back-office operations.
Hands-on experience in trading systems, covering Equities, Futures, Options, and Fixed Income across Trade Order/RFQ workflows, exchange connectivity, Order Management Systems (OMS), and post-trade processing.
Advanced knowledge of software development life cycle (SDLC) models including Agile Scrum, Kanban, and V-models. Well-versed in CMMI processes with deep expertise in tools like JIRA, Quality Center (ALM), and Version One for test and defect management.
Education
Master of Science in Electrical Engineering
University of Idaho, Idaho, USA (2002 – 2004)
Bachelor of Technology in Electronics and Communication Engineering
Nagarjuna University, India (1997 – 2001)
Technical Skills
Category
Skills
Automation Tools
Selenium, Tosca, QuickTest Professional (QTP), WinRunner, Postman, SoapUI
Test Management Tools
Jira, qTest, Quality Center, ALM
Programming Languages
Java, C, C++, VBA, Python
Scripting Languages
Perl, Shell Scripting, JavaScript, HTML, VBScript, TSL
Operating Systems
Linux, UNIX (HP and Red Hat), Windows 2000/XP, Windows 7
Databases
Oracle, SQL Server, SQL Server Management Studio
Database Tools
Rapid SQL, TOAD, DB Artisan, Pentaho
Integrated Development Environments (IDE)
Eclipse, Visual Studio Code, IntelliJ, NetBeans
Source Code Editors
Visual Studio Code, Notepad++
Other Tools
PuTTY, Visual Studio, FileZilla, WinSCP, GitHub Copilot
Professional Experience
Fiserv – Berkeley Heights, NJ
Sr. QA Lead Aug 2022 – Present
Project: Card Console
Card Console is an integrated service platform enabling customer service representatives of financial institutions to manage debit and credit card services for their customers. The application interfaces with mainframe systems – Optis and Tandem – through a domain service layer via RESTful APIs for data processing and retrieval.
Roles & Responsibilities
Actively participated in sprint planning & refinement sessions with business analysts and development teams, ensuring efficient story breakdown and prioritization.
Designed and executed comprehensive test scripts aligning with acceptance criteria.
Provided daily testing updates in stand-up meetings to ensure visibility and progress tracking.
Conducted comprehensive UI testing of the React JS-based application and validated APIs using Postman.
Collaborated with domain service developers & QA teams to troubleshoot issues efficiently.
Automated web-based UI application using Cucumber BDD framework with Selenium WebDriver in Java for functional and regression testing.
Integrated automated tests into the development process using CI/CD pipelines (Azure DevOps).
Conducted root cause analysis to determine whether defects originated from domain services or Optis APIs.
Developed an extensive RESTful API collection to validate integration of the Card Console web application with domain service APIs, ensuring seamless communication between components.
Developed an automated API testing suite for RESTful API validation in Postman, ensuring rapid defect detection and stability across services.
Developed an automated API smoke testing suite using Python in Visual Studio Code for RESTful API validation, utilizing requests and pytest frameworks.
Integrated with CI/CD pipelines (Jenkins, Azure DevOps) for continuous execution and early defect detection.
Executed Postman collections through Newman for automated API testing in a CI/CD pipeline, enabling efficient validation of API responses.
Prioritized and maintained regression test suites, collaborating with business analysts for continuous testing.
Leveraged GitHub Copilot in Visual Studio Code to efficiently develop and maintain automation scripts for UI and API testing, significantly reducing manual coding effort.
Identified test data by exploring Optis and Tandem systems pre-sprint to ensure thorough test coverage.
Delivered Sprint demos, presenting validation results to product owners and stakeholders for acceptance.
Generated detailed test reports to support UAT and production deployments.
Partnered with business analysts and developers to integrate product owner feedback.
Engaged in Sprint retrospectives, contributing insights for process improvement.
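The Python API smoke checks described above can be sketched roughly as follows. This is an illustrative example only, not the actual Card Console suite: the base URL, the /cards/{id} endpoint, and the response fields are hypothetical placeholders.

```python
# Illustrative sketch of a requests/pytest API smoke check of the kind
# described above. BASE_URL, the /cards/{id} endpoint, and the response
# fields are hypothetical, not the real Card Console services.
BASE_URL = "https://api.example.com"  # placeholder environment URL

def valid_card_response(payload: dict) -> bool:
    """Check that a card-status body has the fields a smoke test asserts on."""
    required = {"cardId", "status", "lastUpdated"}
    return required.issubset(payload) and payload["status"] in {"ACTIVE", "BLOCKED", "CLOSED"}

def test_card_status_smoke():
    import requests  # third-party HTTP client named in the bullets above
    resp = requests.get(f"{BASE_URL}/cards/12345", timeout=5)
    assert resp.status_code == 200
    assert valid_card_response(resp.json())
```

Keeping the response-shape check in a plain function lets the same validation run both in pytest smoke tests and in Postman/Newman-driven pipelines.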
Technical Environment
ReactJS, Java Spring Boot, RESTful APIs, Postman, Newman, Azure DevOps, Mainframe, Selenium, Cucumber BDD, qTest, TestNG, GitHub, GitHub Copilot, VS Code, Python, pytest
Verisk Analytics - Jersey City, NJ
Sr. QA Lead Sep 2020 – Aug 2022
Project: Loss Cost Extraction
The goal of this project was to modernize and simplify the existing Loss Cost Extraction process by decommissioning the legacy mainframe-based system and implementing a robust ETL-based solution. This new approach introduced enhanced governance, auditability, and transparency in the loss cost data flow—from initial generation to final distribution. The downstream systems consume the output as IntegRater and Generate Pages files. This solution covered multiple Lines of Business (LOBs) including Commercial Auto, FARM, General Ledger, COMFAL, and Business Owner and Property (BOP).
Roles and Responsibilities
Reviewed legacy mainframe processes to understand the input/output structures for IntegRater and Generate Pages workflows.
Created comprehensive mapping documents by analyzing current state processes and mainframe specifications.
Defined and validated data filtering logic per LOB to ensure accurate API-based data extraction from Snowflake.
Performed data mapping validation by comparing outputs from IntegRater and GenPages with source JSON files.
Developed ETL jobs using Pentaho to extract data from Snowflake and transform it into CSV formats for downstream processing.
Validated the RESTful API endpoints responsible for structured data retrieval from Snowflake, ensuring correct data filtering, transformation logic, and response formats.
Ensured all output files adhered to business rules and formatting guidelines for Liability, Physical Damage, Auto, and Garage coverage types.
Developed and executed custom Python scripts to validate data accuracy, integrity, and structure of final output files, ensuring complete alignment with business specifications and downstream requirements.
Automated regression test cases using Python scripts, significantly enhancing test coverage, execution speed, and reliability across sprints.
Actively participated in daily stand-ups and walkthroughs, providing testing updates and resolving blockers with stakeholders.
Managed and prioritized test requirements in JIRA, coordinating closely with development teams for requirement clarifications.
Prepared and shared daily status reports to maintain alignment across QA, development, and business teams.
Collaborated with offshore QA teams for requirement reviews, test execution, and test data preparation.
Coordinated with DevOps and business teams to ensure accurate test data setup and environment readiness.
Tracked all test requirements, execution progress, and defects using JIRA, ensuring timely sprint delivery and compliance with Agile processes.
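An output-file validation script of the kind described above might look roughly like this. The field names (class_code, loss_cost) are illustrative placeholders, not the actual IntegRater/GenPages layouts.

```python
# Rough sketch of output-file validation in the spirit of the Python
# scripts described above; "class_code" and "loss_cost" are illustrative
# field names, not the real IntegRater/GenPages layouts.
import csv
import io

def find_mismatches(csv_text: str, source_records: list) -> list:
    """Compare an extracted CSV against source records; return mismatch notes."""
    errors = []
    by_key = {rec["class_code"]: rec for rec in source_records}
    for row in csv.DictReader(io.StringIO(csv_text)):
        src = by_key.get(row["class_code"])
        if src is None:
            errors.append(f"unexpected class_code {row['class_code']}")
        elif float(row["loss_cost"]) != float(src["loss_cost"]):
            errors.append(f"loss_cost mismatch for class_code {row['class_code']}")
    return errors
```

Returning a list of mismatch descriptions (rather than failing on the first one) makes it easy to report every break in a single regression run.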
Technical Environment
RESTful API, Snowflake, Pentaho ETL, AWS S3, Jira, Python, PyCharm, Linux
IDB Bank - New York, NY
Sr. QA Lead Jan 2019 – Sep 2020
Project #1: Current Expected Credit Loss (CECL)
IDB Bank implemented FASB ASU No. 2016-13, the credit impairment standard, as required for compliance. Under the standard, IDB must report the expected credit losses on the loans it has issued to its clients. IDB uses EVOLV, an external modeling tool from the vendor Primatics, to calculate CECL losses.
Roles and Responsibilities
Reviewed the CECL requirements in Jira and categorized them into multiple phases for testing
Developed a detailed test plan incorporating the test objective and approach, Jiras for each phase, test deliverables, defect management, roles and responsibilities, risks, issues and dependencies
Developed detailed test cases covering the different aspects of CECL testing: data loading into the EDW (Enterprise Data Warehouse), data processing and lookup logic within the data warehouse, and export of the data feeds to EVOLV
Developed complex SQL queries incorporating subqueries, built-in functions, and temporary tables to evaluate the input data and compare it with the output
Led the offshore QA team in the execution of the test cases through multiple phases of testing
Prepared detailed test execution and defect reports and explained testing progress to business stakeholders in daily testing calls
Extensively evaluated the output files sent by EVOLV by reconciling the data against the Enterprise Data Warehouse
Updated testing progress and issues encountered in Confluence daily and notified the project team
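The SQL-based reconciliation described above can be sketched with an in-memory database. The table and column names below are illustrative only, not the actual EDW or EVOLV schemas.

```python
# Hedged sketch of reconciling an EVOLV output feed against the EDW via
# SQL; tables and columns are illustrative, not the real schemas.
import sqlite3

def find_loss_breaks(conn, tolerance=0.005):
    """Return loans whose expected loss differs between the EDW and the feed."""
    return conn.execute(
        """
        SELECT e.loan_id, e.expected_loss AS edw_loss, f.expected_loss AS feed_loss
        FROM edw_loans e
        JOIN evolv_feed f ON f.loan_id = e.loan_id
        WHERE ABS(e.expected_loss - f.expected_loss) > ?
        """,
        (tolerance,),
    ).fetchall()
```

A tolerance parameter avoids flagging immaterial rounding differences between the modeling tool and the warehouse.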
Technical Environment
SQL Server 2016, MuleSoft API middleware, Jira, Confluence, Discovery Hub, Primatics EVOLV
Project #2: Informatica sunset
As part of an organizational initiative, Informatica was decommissioned and replaced with the Discovery Hub ETL tool and the MuleSoft middleware tool for data transformation and loading into the downstream systems: Salesforce, Rateshub, and RAPNAP. This involved a set of interfaces migrated to the new ETL tool. The scope of the project was validation of data transformation per the new mapping specifications and loading of the data into the downstream systems.
Roles and Responsibilities
Created an integrated test plan for validation of data maps and testing of downstream interfaces
Prepared detailed test cases to test each interface functionality in depth
Thoroughly validated the target tables & views to ensure that data populated is as per the mapping specification and the lookup operations are done correctly
Tested all the Core banking objects: Customer, Group, Group Party, Relationship Group, Relationship Group Member, Financial Account
Also involved in testing various core banking alerts such as Time Deposit Alerts, Overdrawn and Uncollected Alerts, Account Status, Zero Balance, Kiting Suspects, and Maturing Deposits
Developed complex SQL queries for testing using subqueries, built-in functions, and temporary tables to evaluate the data
Validated the full view and delta view functionality in the data warehouse to ensure correct records are picked up in the Discoveryhub batch job runs
Organized daily defect meetings with the project and business teams and worked with the development team on quick resolution
Prepared daily defect reports from Jira and classified defects by functionality and severity for project and development managers
Updated testing progress and issues encountered on the Confluence page daily and notified the project team
Technical Environment
AS400 Mainframe, SQL Server 2016, MuleSoft middleware, Postman (REST API client), Discovery Hub, Tosca automation tool, Jira, Confluence, Salesforce, Rateshub
Project #3: Consolidated Periodic Statement
The goal of the Consolidated Periodic Statement (CPS) project was to develop a consolidated monthly statement for IDB Bank's customers showing their balances across the bank's products, from checking accounts to investment portfolios from Capital Markets. Data from the multiple source systems is harnessed into the Enterprise Data Warehouse based on star schema and Operational Data Store structures. Statements are generated out of the data warehouse as XML using MuleSoft as the integration platform and sent to an external vendor for printing in PDF format.
Roles and Responsibilities
Discussed the requirements with business stakeholders and prepared a detailed test plan
Extensively tested the new data warehouse processes used for generation of statements
The processes included testing of the ETL logic implemented using the Discovery Hub ETL tool, validation of the generated XML using XMLSpy, and validation of the implemented logic by writing T-SQL queries in SQL Server
Also tested the APIs/microservices built with MuleSoft for generating the various components within the report
Thoroughly tested UI wireframes and Mockups to ensure complete adherence to the standards laid out
Recorded and tracked all the defects across the project using Jira and was single point of contact for testing for the project
Conducted daily meetings with the onshore ETL and MuleSoft development teams to resolve defects and reported status to management
Conducted daily meetings with the offshore team to ensure a smooth handoff of testing tasks
Extensively tested the APIs generated by MuleSoft to ensure that all customer transactions and data were present in the monthly statement
Updated testing progress and issues encountered in Confluence daily and notified the project team
Technical Environment
AS400 Mainframe, SQL Server 2016, MuleSoft API, XMLSpy, Discovery Hub, Tosca automation tool
Wells Fargo Regulatory Reporting - North Brunswick, NJ
Sr. QA Lead, Feb 2018 – Dec 2018
Project: MiFID II Regulatory Reporting
The Markets in Financial Instruments Directive II (MiFID II) is an EU law that regulates investment services across the European Union. Wells Fargo's regulatory reporting platform, 1STR, is responsible for meeting regulatory reporting requirements across multiple jurisdictions, namely MiFID II, MiFIR, ESMA, CAD, and HKMA. In this role, I worked closely with the QA team on the preparation and execution of test cases for change requests in the MiFID II jurisdiction. I also provided Level 3 support for middle- and back-office users, resolving daily issues to meet the regulatory SLAs for MiFID II reporting and supporting their overall reporting activities.
Roles and Responsibilities
Reviewed new requirements and change requests captured in the Jiras to prepare test cases using the BDD methodology
Reviewed the test results after the system and integration test cycles with the QA team and signed off on the SIT testing
Ensured that trades received from the multiple source systems (Calypso, Catalyst, and iReports) met the regulatory reporting SLAs for multiple asset classes: FX, interest rate derivatives, equity cash, and fixed income
Entered various FX trades in Calypso to validate the reporting process flows
Covered all the possible workflows in Calypso to validate the compliance reports
Extensively tested the Cling UI Pro (JavaScript based UI application) dashboard which shows the trades received and reported across the multiple jurisdictions
Assisted the business users with UAT after the SIT test phase and facilitated the UAT signoff
Participated in the Jira intake and prioritization meetings with the business users
Provided Level 3 support for the middle- and back-office business users on a day-to-day basis to ensure SLAs were met for Post-Trade Transparency and Transaction Reporting
Prepared extensive Excel reports showing the trades received and processed from the multiple source systems and their reporting statuses for Post-Trade Transparency and Transaction Reporting on a daily and weekly basis
Provided the trend analysis of the issues encountered to the Dev team by analyzing the large volumes of data
Reconciled the trades submitted to UnaVista and Trade Echo by verifying the responses received from them
Followed up with the regulatory reporting mechanisms, UnaVista and Trade Echo, on issues encountered with trades to ensure they were processed correctly
Technical Environment
MiFID II Regulatory Reporting, Jira, UnaVista, Trade Echo, BDD, Agile Scrum methodology, Git, Maven, Selenium, J2EE, SQL Server, SQuirreL, Cling UI Pro, JavaScript, Calypso, Catalyst, Fists, iReports, UNIX.
BNY Mellon - New York, NY
Sr. QA Lead, Mar 2017 – Jan 2018
Project: Markets in Financial Instruments Directive II (MiFID II)
The Markets in Financial Instruments Directive II (MiFID II) is an EU law regulating investment services across the EU, effective January 3, 2018. The regulation impacts LOBs including Global Markets, Capital Markets, Securities Finance, and Investment Management. To be compliant, the reporting requirements (Transaction Reporting, Post-Trade Transparency, and Best Execution for FX) had to be met before January 3, 2018, and all FX trading systems had to be updated or changed to be made MiFID II compliant. The scope of this project was to test the individual systems and to perform end-to-end and integration testing across all of them. As QA Lead on the project, I was responsible for integration and end-to-end testing of the upstream systems (FXALL, Bloomberg, 360T) and downstream systems (UnaVista and Tradeweb).
Roles and Responsibilities
Reviewed the Business Requirements Document (BRD) and Business Functional Specification (BFS) to understand the scope of the project and to learn the functional behavior of the various systems.
Organized and led the BFS review meetings between the onsite business team and the offshore QA team to discuss the requirements and to get clarification.
Developed a detailed test strategy detailing the test approach, test scenarios for Transaction and Transparency reporting and test cases to cover the entire reporting functional requirements.
Reviewed the user guides for the reporting vendors, UnaVista and Tradeweb, and gave the offshore QA team a walkthrough of the UI.
Set up QA team members with different profiles to log in to UnaVista and Tradeweb and validate the reporting of transactions.
Developed test scenarios covering the various booking areas and portfolios to ensure that jurisdiction-based business scenarios were covered.
Developed detailed test scenarios covering validations of On-Venue and Off-Venue transactions, their reporting logic from the venues to the APA (Tradeweb), and the Systematic Internaliser tiebreaker logic.
Extensively tested transactions originating from upstream systems such as FXALL, Bloomberg, 360T, CAML, Currenex, Logiscope, Citi Velocity, and Wall Street Systems (WSS), and validated the reporting rules in UnaVista and Tradeweb.
Thoroughly validated all reporting rules of the Approved Reporting Mechanisms, UnaVista and Tradeweb, across all transaction types: new, amendment, cancellation, NDF fixing, and option exercise.
Extensively validated the filtering logic of the Markets Hub application, which filters transactions by product type: Spots, Forwards, Swaps, NDFs, and Options.
Developed the mapping between the ARM Reporting fields and the output of the trading system WSS to explain the logic.
Assisted in preparing the macro that validated the output of the ARM against its input.
Prepared a traceability matrix between the test cases and the reporting requirements.
Organized daily test case status review calls with offshore and onsite development, business teams to discuss the defects and progress of the test execution.
Technical Environment
FX, Agile-Scrum process, JIRA, Reporting, UnaVista and Tradeweb, FXALL, Bloomberg, 360T, QTP.
Capital One Capital Markets - New York
Sr. QA Lead/Analyst April 2015 – February 2017
Project: Wall Street FX Solution
The ION Wall Street FX product was implemented as the front-to-back foreign currency (FX) trade processing solution for the Capital One Capital Markets FX team. In the MVP1 phase, the initial configuration and functionality of the application were tested with 150+ currencies. In the MVP2 phase, the system was enhanced to transact FX via the web portal, connect WSS to upstream liquidity providers, and connect to PIE for payment processing. In the MVP3 release, WSS FX was integrated with Bloomberg and FXALL via the ION eFX gateway and with the liquidity providers Deutsche Bank and Goldman Sachs.
Roles and Responsibilities
Discussed business process workflows with the front, middle, and back office and reviewed the business requirement documents.
Developed detailed system, integration, and end-to-end test cases based on the BRD review and discussions with business stakeholders, and reviewed the test cases with the business teams.
Validated the configuration setup of currencies, tenors, product types, nostro accounts, counterparties, static data and fund settlement instructions with the vendors onsite as per the business needs
Developed and executed the integration test scripts to test all the downstream systems that consume the interface files generated from the WSS FX core system
Extensively tested the life cycle events associated with the FX products: Spots, Forwards, Swaps, NDFs and Options
Extensively tested the various reports and interfaces which feed the data to the downstream systems and analyzed the impact of the trades in the accounting
Executed the regression testing suite after each new deployment of the WSS core application and interfaces.
Discussed user requirements with the business team and product owner and developed detailed use case documents for the Swap Dealer Repository and the MVP2 and MVP3 releases.
Coordinated with the ION Wall Street personnel present onsite and in the remote location in configuring the system and in generation of the interfaces.
Developed automation scripts using Selenium WebDriver, Eclipse, and Cucumber to automate regression testing of the web portal.
Enhanced the scripts with synchronization points to keep them in sync with the application and ran the scripts with multiple sets of test data
Prepared the traceability matrix to show the test coverage against the requirements at the end of each sprint and explained the matrix to the product owner during sprint demo
Participated in the daily scrum meetings to provide the project updates and in the Sprint grooming, planning meetings at the start of each sprint
Guided the Front, middle and back-office teams in testing during the UAT test phase and provided the necessary support
Prepared Business use case documents for the enhancements needed to the FX application from the feedback received from the business users and communicated them to the WSS development and project team
Actively participated in Agile events (sprint demos, planning, and retrospectives) to review and explain to the project team the work completed during each sprint.
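The Page Object pattern behind the regression scripts above can be sketched as follows. This is shown in Python for brevity (the original framework was Java-based), and the page name and locators are illustrative placeholders.

```python
# Minimal Page Object Model sketch in the spirit of the Selenium regression
# work described above (Python for brevity; the original framework was
# Java-based). The page name and locators are illustrative.
class TradeBlotterPage:
    SEARCH_BOX = ("id", "trade-search")            # hypothetical locator
    BLOTTER_ROWS = ("css selector", ".blotter-row")  # hypothetical locator

    def __init__(self, driver):
        # driver: any object exposing find_element / find_elements,
        # e.g. a selenium.webdriver instance
        self.driver = driver

    def search(self, trade_id):
        """Type a trade ID into the search box; fluent style aids BDD steps."""
        box = self.driver.find_element(*self.SEARCH_BOX)
        box.clear()
        box.send_keys(trade_id)
        return self

    def visible_rows(self):
        """Count the blotter rows currently rendered."""
        return len(self.driver.find_elements(*self.BLOTTER_ROWS))
```

Keeping locators and interactions inside the page class is what lets Cucumber step definitions stay readable and survive UI changes with edits in one place.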
Technical Environment
FX, Agile-Scrum process, Java, UC4, UNIX, APIs, Oracle 11g, ALM, Selenium, JMeter, Eclipse, Cucumber, Git, Maven, Version One, Knowledge Link, JIRA
CME Group - New York, NY
Sr. QA Lead Aug 2010 – Mar 2015
Project#1: Clearport Clearing
CME ClearPort is a comprehensive set of flexible clearing services for the global OTC market. It provides centralized clearing services and mitigates risk in the energy marketplace. Today, CME ClearPort clears transactions across multiple asset classes around the world. With OTC clearing through CME ClearPort, clients can continue to conduct business off-exchange while gaining the advantages of security, efficiency, and confidence. ClearPort Clearing offers a web-based application and an API for trade entry and management, supporting submission of Futures and Options with different strategy types. As part of the project, I worked with various financial instruments, namely Futures, Options, FX, Swaps, and CDS.
Roles and Responsibilities
Authored detailed Test Plans including test strategy, entry/exit criteria, execution timelines, and communication plans.
Led an offshore QA team, overseeing test case/script development and execution, and facilitated requirement walkthroughs and review meetings with onshore and offshore stakeholders.
Extensively tested CME Clearport APIs used by financial institutions to submit trades in Futures, Options, Swaps, FX Forwards, and CDS on Commodities.
Validated the processing of FIXML Trade messages (FIX Protocol standard) transmitted over HTTP and IBM MQ, including client acknowledgments and trade confirmations.
Tested CPC FIXML message flows to downstream systems such as Front-End Clearing, SDR, and ETR, ensuring compliance with Dodd-Frank regulations for SWAP trade processing.
Validated CME Reference Data APIs offering product and entity reference data to clients.
Tested multiple trade types including EFP, EFR/EOO, OPNT, Block, Swap Block, and strategy types like spread, butterfly, synthetic long/short, straddle, long put/call.
Verified enhancements and new UNIX jobs using UC4 Job Scheduler and command-line executions.
Developed and executed complex SQL queries to validate backend data and generated reports across multiple data sources.
Validated trade flow integration between CME Direct, GLOBEX, and Clearport to final processing within clearing systems.
Enhanced and executed automation scripts (GUI and API) to conduct robust regression testing post-functional sign-off for each release.
Developed GUI automation scripts using Selenium WebDriver and maintained them to reflect changes in trade entry and blotter modules.
Conducted thorough testing of Account Manager, a web-based app for account setup, credit limit configurations, and broker/asset manager assignments.
Validated Crystal Reports generated via Crystal Lite, ensuring data accuracy using backend SQL validations.
Employed BDD (Behavior Driven Development) methodology to write reusable, scalable test scripts.
Worked in Agile-Scrum and Kanban frameworks, participating in story point estimation, daily stand-ups, and retrospectives.
Proactively raised and resolved issues during scrum calls, maintaining consistent communication with the project team.
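FIXML message validation of the kind described above can be sketched as a small field-extraction helper. The fragment below is a simplified illustration; it omits namespaces and most of the real FIXML schema used by CME ClearPort.

```python
# Simplified sketch of FIXML trade-message field extraction, of the kind a
# validation script for the flows above would assert on. This fragment is
# illustrative and omits namespaces and most of the real FIXML schema.
import xml.etree.ElementTree as ET

def extract_trade(fixml):
    """Pull the key trade-capture fields from a FIXML string."""
    root = ET.fromstring(fixml)
    rpt = root.find(".//TrdCaptRpt")  # trade capture report element
    return {
        "trade_id": rpt.get("RptID"),
        "qty": float(rpt.get("LastQty")),
        "px": float(rpt.get("LastPx")),
    }
```

Extracting the fields into a plain dict makes it straightforward to compare client submissions against acknowledgments and downstream (FEC/SDR/ETR) copies of the same trade.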
Technical Environment
Commodities trading via API and GUI, Agile-Scrum process, Java, J2EE, JSP, Servlets, UC4, UNIX, PuTTY, Oracle 11g, TOAD, Crystal Reports, UC4 Job Manager, Quality Center 11, QTP, JIRA, SharePoint, ClearQuest, Maven, Git, VBA, Excel
Project#2: Electronic Settlement System
Electronic Settlement System (ESS) is the web-based application used by the New York settlement team to process end-of-day settlements for Futures and Options (Commodities) and send them for clearing. As QA test lead, I performed requirement analysis, test planning, test case preparation, and execution with the teams in New York and Chicago, and coordinated integration testing with the PRS and TIPS teams based in Chicago. The Agile-Scrum methodology was followed for application development, testing, and deployment.
Roles and Responsibilities
Extensively tested the various settlement algorithms and calculators developed for the Futures and Options based on the underlying futures settlement prices, at the money volatility, interest rates, skews and pivots
Developed detailed test plan and test cases for functional, integration testing involving Electronic Settlement System, Clear port clearing, PRS and TIPS applications
Entered various trade strategy types in CPC namely spread, butterfly, pack spread, straddle, Long put & call and validated their settlement process
Extensively tested the settlement processing calculators of various futures namely Calendar Swaps, Outright Swaps, Balmo Swaps, Spreads & Balmo Spreads and Basis Swaps
Actively involved in the daily scrum meeting (as part of