
Test Data Management

Location:
Irving, TX
Posted:
June 28, 2023


Resume:

Alarmel Ishwarya Mariyappan

Irving, TX ***** +1-469-***-**** adxy7j@r.postjobfree.com

Objective

• 10 years of extensive experience in the IT industry, with primary expertise in Mainframe technology.

• Extensive knowledge in requirement gathering, analysis, design, development, implementation, testing, integration, deployment, documentation, and maintenance of IBM Mainframe applications.

• Experience working in the banking and financial, investment, and insurance domains.

• Extensive hands-on experience in IBM Mainframe application programming.

• Experience in legacy modernization from mainframe to distributed systems.

• Extensive knowledge of and experience in JCL, COBOL, DB2, VSAM, TSO/ISPF, CONTROL-M (scheduler), CHANGEMAN, Endevor, SPUFI, IBM utilities, Assembler, and Oracle.

• Strong analytical, problem-solving, multitasking, and strategic planning skills.

• Interested in learning new technologies and undertaking challenging tasks.

• Good knowledge of banking, insurance, and mutual fund terminology and functionality.

• Ability to multitask across different applications.

Technical Skillset

• COBOL

• SQL

• Z/OS

• CONTROL-M

• TSO/ISPF

• XPEDITER

• DB2

• CICS

• IBM Utilities

• VSAM

• Changeman

• SPUFI

• Oracle 20.4

• Endevor

Experience

Nationwide Insurance – TDM (Test Data Management) [Dec '21 to present]

Project background

Test Data Management (TDM) is the administration of the test data repository needed to fulfill the needs of quality assurance testing processes. TDM is not just provisioning test data; it also includes data obfuscation (masking) capabilities and synthetic data generation. TDM runs a test data warehouse with minimal effort from database administrators, maintaining test data suites drawn from production source data and re-provisioning test data on demand without having to go back to production.
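
For illustration only, a minimal DB2 SQL sketch of the kind of masking pass described above; the table and column names (TESTDB.CUST_PROFILE, SSN, EMAIL, CUST_ID) are hypothetical and not taken from the project.

   -- Hypothetical masking pass over a non-production copy of customer data:
   -- replace real SSNs with a deterministic dummy value and rewrite e-mail
   -- addresses so rows stay usable for testing without exposing real PII.
   UPDATE TESTDB.CUST_PROFILE
      SET SSN   = '999' || SUBSTR(DIGITS(CUST_ID), 5, 6),
          EMAIL = 'user' || RTRIM(CHAR(CUST_ID)) || '@example.com';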

Role & Responsibilities

• Analyze requirements and provide test data in an orderly and secure manner in non-production environments.

• Worked on synthetic data generation, where synthetic data is a representation of production data that is not real and is typically keyed in manually, replicated, or generated via tools.

• Worked on the data aging process, which modifies all types of date columns regardless of their initial format and adjusts the resulting dates to suit specific business rules (see the sketch after this list).

• Worked on regression test suites: a test data repository for regression test data, including application dependency data, that can be restored on demand.

• Worked on data refreshes, where data is restored from backup to bring it back to its original state, or refreshed from previous or current production snapshots.

• Worked on scenario-based data provisioning, where test data is provided specific to test case scenarios, including data from application interfaces.

• Coordinated, communicated, and collaborated with various application team members for smooth execution of assignments.
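
A minimal DB2 SQL sketch of the data aging idea referenced above; the table, columns, and one-year offset are hypothetical and stand in for whatever business rules a given application uses.

   -- Hypothetical data aging pass: slide policy dates forward by one year
   -- in a non-production copy so expired test policies become current again.
   UPDATE TESTDB.POLICY_COVERAGE
      SET EFFECTIVE_DT  = EFFECTIVE_DT  + 1 YEAR,
          EXPIRATION_DT = EXPIRATION_DT + 1 YEAR
    WHERE EXPIRATION_DT < CURRENT DATE;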

Previous Experience

Highmark – DB2 Archival: [May '20 to May '21]

Project background

The DB2 Archival project mainly deals with the archival process for DB2 tables, in which older data is moved from the active base tables to newly created archival tables. Archival criteria have to be finalized for each table by analyzing how that table's data is used in the application programs. Archival tables are created to resemble the attributes of the active base tables. Once the archival criteria are decided, the required data is moved from the base table to the archival table using archive utilities. The archival process focuses mainly on increasing the performance of the application jobs.
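
As an illustration of the pattern (the project itself used archive utilities), a minimal DB2 SQL sketch of moving aged rows from a base table to its archival table; the table names and the seven-year criterion are hypothetical.

   -- Hypothetical archival step: copy rows older than the agreed criterion
   -- into the archival table, then prune them from the base table so the
   -- application jobs scan less data.
   INSERT INTO CLAIMDB.CLAIM_HIST_ARCH
      SELECT * FROM CLAIMDB.CLAIM_HIST
       WHERE CLAIM_DT < CURRENT DATE - 7 YEARS;

   DELETE FROM CLAIMDB.CLAIM_HIST
    WHERE CLAIM_DT < CURRENT DATE - 7 YEARS;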

Role & Responsibilities

• Analyzed the ECS application programs that use the relevant DB2 tables to decide the archival criteria.

• Designed and developed the archival logic using DB2 SQL as per the requirements.

• Performed root cause analysis to identify the cause of issues found by the team in the test/production environments during SIT, UAT, and implementation.

• Worked with the DBAs on index creation for the archival tables and related tasks (see the sketch after this list).

• Worked with the scheduling team on production implementation of new and existing archival jobs.

• Documented the results captured over a month and presented the final performance improvement graph to the partners.
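
A minimal sketch of the kind of index definition coordinated with the DBAs, as referenced above; the index, table, and column names are hypothetical.

   -- Hypothetical index on the archival table, mirroring the base table's
   -- main access path so archived rows can still be retrieved efficiently.
   CREATE INDEX CLAIMDB.XCLMARC1
       ON CLAIMDB.CLAIM_HIST_ARCH (CLAIM_ID, CLAIM_DT);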

Highmark – COBOL Compiler Upgrade: [May '20 to May '21]

Project background

The COBOL Compiler Upgrade project deals with upgrading the COBOL compiler from V4.2 to V6.2. Before upgrading the production region, all application programs have to be tested with the upgraded COBOL version. The Highmark application includes many Easytrieve programs. Since COBOL programs are compiled at build time, they do not require this separate testing effort; the Easytrieve programs, however, are tested individually. The Easytrieve programs use IBM's Easytrieve migration utility, which is invoked at run time rather than at build time. The Easytrieve migration utility translates Easytrieve source code into COBOL source code, compiles the translated source into a COBOL executable, and then executes the compiled code to perform the business operation.

Role & Responsibilities

• Created and maintained mainframe execution JCL (jobs) to support production environments.

• Involved in the conversion of programs from Easytrieve to COBOL as and where required.

• Performed full job testing for existing programs and newly converted programs.

• Performed root cause analysis to identify the cause of issues found by the team in the test/production environments during SIT, UAT, and implementation.

• Improved performance by optimizing the runtime of production batch jobs to meet the SLA.

• Coordinated the functional design, detailed design, and software inspection review meetings with tech leads for approvals before the production install.

Fidelity Investments – Investone: [Nov '15 to May '19]

Project background

INVESTONE is the core engine for processing all investment accounting. The INVESTONE engine is provided and maintained by FIS (Fidelity Information Solutions), and new versions of the engine are released periodically by FIS. Investone is a real-time, straight-through-processing investment portfolio accounting system that maintains the official books and records for all the funds. Wrapper code is developed in order to drive the Investone engine; this custom wrapper code is developed and maintained by Fidelity.

Role & Responsibilities

• Worked on a milestone project named R15.3 (Release 15.3 of the Investone engine). R15.3 engine loads were installed on Fidelity mainframes to meet market standards, and the overall wrapper code, inbounds, outbounds, and master files were enhanced for Investone 15.3.

• Improved performance by optimizing the runtime of production batch jobs to meet the SLA.

• Analyzed, designed, and developed business logic as per client requirements using COBOL, JCL, and SORT.

• Created functional designs as per the business requirement documents; functional design flowcharts were created using Visio.

• Coordinated the functional design, detailed design, and software inspection review meetings with tech leads for approvals before the production install.

• Performed root cause analysis to identify the cause of issues found by the team in the test/production environments during SIT, UAT, and implementation.

• Worked with the scheduling team on production implementation of new and existing jobs.

• Production implementation upon successful user acceptance testing.

BB&T – Branch Banking and Trust: [Nov '12 to Nov '15]

Project background

BB&T is a consumer and commercial bank with locations across the U.S. and is one of the largest banks in the U.S. by assets. It handles all types of customer requests, such as commercial loans (CL), online banking, card management, and web reports. The commercial loan application deals with loan customers. The online banking application deals with all online requests from users. Web reporting deals with all the reports that have to be sent to customers via a third party on a monthly or weekly basis. Cards Management handles all types of cards.

Role & Responsibilities

• Gathered and clarified requirements from the clients.

• Designed and developed the business logic using COBOL and DB2 SQL as per the requirements.

• Worked on a milestone project called COLLAPSE, in which duplicate loan account numbers were made unique in production master files (see the sketch after this list).

• Worked with the TIBCO team to parse the XML tags sent from the front end.

• Created and enhanced DB2 modules to fetch data from tables and provide it to end users in the form of web reports.

• Created and maintained mainframe execution JCL (jobs) to support production environments.

• Performed root cause analysis to identify the cause of issues found by the team in the test/production environments during SIT, UAT, and implementation.

• Enhanced and maintained the reporting system using batch and Easytrieve programs.

• Worked on a documentation project that required understanding Assembler programs and documenting their pseudo logic for conversion to COBOL.
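
A minimal DB2 SQL sketch of the duplicate-detection step behind the COLLAPSE work referenced above; the table and column names are hypothetical.

   -- Hypothetical query to surface loan account numbers that appear on more
   -- than one master record, i.e. the candidates that COLLAPSE-style logic
   -- would renumber to make each account unique.
   SELECT LOAN_ACCT_NO, COUNT(*) AS DUP_COUNT
     FROM LOANDB.LOAN_MASTER
    GROUP BY LOAN_ACCT_NO
   HAVING COUNT(*) > 1
    ORDER BY DUP_COUNT DESC;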

Education

Title of the Degree: Bachelor's degree (EEE)
College/University: Government College of Engineering
Year of Passing: 2012

Certifications

AZ-900 Microsoft Azure Fundamentals

Contact

Alarmel Ishwarya Mariyappan +1-469-***-**** adxy7j@r.postjobfree.com


