
Mainframe developer

Atlanta, GA
May 19, 2020



Professional Summary:

Over ** years of experience in analysis, design, development, implementation, administration, and support.

IBM Certified DB2 DBA Professional

Participated in requirements analysis, reviews and working sessions to understand the systems and system design.

Checked database status; monitored daily and monthly batch cycles; resolved job abends, DB2-related issues, and COBOL program code issues in the production environment.

Strong exposure to COBOL, JCL, DB2, CICS, and IMS, and to developing job flows, schedules, and batch processes.

Strong experience working with the full software development life cycle (SDLC) of mainframe-based applications.

Strong hands-on experience with Teradata tools such as BTEQ, FLOAD, MLOAD, TPUMP, FASTEXPORT, and TPT (Teradata Parallel Transporter).

Experience in designing low-level and high-level design documents.

Designed and developed various mainframe programs using COBOL and Easytrieve.

Designed and developed mainframe job flows using JCL.

Extensive experience with sort utilities such as DFSORT, SYNCSORT, and ICETOOL.

Strong Experience with usage of IBM DB2 Utilities for LOAD, UNLOAD, FASTLOAD.

Experience in creating new tools using the REXX language and integrating them with CLISTs so they can be invoked through line commands.

Proficiency in scheduling the batch flows using the CA7 scheduler.

Strong experience in VSAM file processing and Message Queue (MQ) processing

Experience in the version control tools ChangeMan and Endevor.

Strong experience in DB2 tools such as PLATINUM and BMC to view and edit the properties of DB2 components and to evaluate query performance using EXPLAIN.

Expertise in using common mainframe tools such as TSO/ISPF, File-AID, MQ, File-AID for DB2, SPUFI, and SDSF.

Expertise in Backup and restore of the databases.

Expertise in Performance tuning & optimization.

Fixed bind issues occurring during application testing in various phases.

Maintained (create/alter/drop/grant/revoke/rebind/free) DB2 objects using the IBM DB2 Admin tool.

Migration of database objects between databases and DB2 subsystems using IBM DB2 Admin tool.

Created and managed production emergency fix process on the mainframes.

Executing DB2 reports/utilities for performance/database organization & database growth analysis.

SQL review through EXPLAIN.

Bound plans and packages. Created packages, and reviewed and provided DBA approval for production packages using the ChangeMan tool.

Optimize the query access plans, resource and space utilization.

Implementing the changes through RFC (Request for change) process.

Creating the standard implementation plan, testing results documents.

Education & Certification:

Bachelor of Engineering, India

IBM Certified DB2 DBA Professional

Technical Skills:

Operating Systems: Z/OS, Linux, Windows

Databases: Teradata 14/15, DB2 V9/V10/V12, IMS, Oracle, SQL Server

Database Admin Tools: Catalog Manager, BMC Change Manager, DASD Manager, Apptune, IBM Utilities, IBM Data Studio, SPUFI and QMF, Platinum

Job Schedulers: Control-M, OPC, and CA7

Ticketing Tools: ServiceNow, Remedy

File Management Tools: File-AID, File Manager for DB2

Database Skills: Views, Materialized views, BTEQ, MLoad, FLoad, TPump, TPT, Hive Query

Big Data Analytics: Apache Hadoop, Cloudera, Hortonworks

Client: Anthem, Atlanta, GA April 2018 – Present

Role: Mainframe/Teradata Developer


Gathered business requirements by collecting information from the business users. Conducted business impact analysis.

Developed and maintained scripts using Teradata utilities such as MultiLoad (MLoad), FastLoad, BTEQ, TPT, and TPump for new and existing data marts feeding the data warehouse.
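As an illustration, a FastLoad control script for such a load might look like the minimal sketch below; the TDPID, credentials, table, error-table, and file names are all hypothetical (on the mainframe the script is typically supplied through SYSIN in the FastLoad JCL step):

```
SESSIONS 4;
LOGON tdprod/etl_user,password;          /* hypothetical TDPID and credentials */
SET RECORD VARTEXT "|";                  /* pipe-delimited host file */
BEGIN LOADING stg.claims_lz
  ERRORFILES stg.claims_err1, stg.claims_err2;
DEFINE claim_id  (VARCHAR(18)),
       member_id (VARCHAR(18)),
       paid_amt  (VARCHAR(12))
  FILE=claims.dat;
INSERT INTO stg.claims_lz
  VALUES (:claim_id, :member_id, :paid_amt);
END LOADING;
LOGOFF;
```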

Designed and developed data warehouse using Hive external tables, partitioned tables in Hive and created hive queries for analysis.
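A minimal sketch of the kind of Hive DDL and analysis query involved; the database, column, and HDFS path names are illustrative, not taken from the actual warehouse:

```sql
-- External, partitioned table over delimited files in HDFS (illustrative names)
CREATE EXTERNAL TABLE IF NOT EXISTS dw.claims (
  claim_id  STRING,
  member_id STRING,
  paid_amt  DECIMAL(12,2)
)
PARTITIONED BY (load_dt STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
LOCATION '/data/dw/claims';

-- Register a newly landed partition, then query it for analysis
ALTER TABLE dw.claims ADD IF NOT EXISTS PARTITION (load_dt = '2018-04-01');

SELECT load_dt, COUNT(*) AS claim_cnt
FROM dw.claims
GROUP BY load_dt;
```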

Involved in creating end-to-end automation using UNIX shell scripts; this included processing files to and from the FTP server and executing Informatica workflows to process data into the staging area and then into the Teradata warehouse.

Documented the high-level and low-level design for the requirements.

Participate in release planning meetings and provide inputs on demand prioritization

Expertise in transforming the requirements to technical definitions, coding, testing and implementing the same.

Meeting with application developers to code the components

Application maintenance: responsible for tracking batch completions, acting on abends and issues, and resolving problems with proper root-cause fixes to prevent recurrence, ensuring system availability per the SLA.

Performed configuration management and process improvement: identified areas of the project where enhancements would add value to the process and improve performance.

Developed jobs to export and import data between RDBMS and HDFS using Sqoop.
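A typical Sqoop import of a DB2 table into HDFS might be invoked as sketched below; the JDBC URL, credentials, table name, and target directory are placeholders:

```
sqoop import \
  --connect jdbc:db2://dbhost:50000/PRODDB \
  --username etl_user -P \
  --table CLAIMS \
  --target-dir /data/landing/claims \
  --num-mappers 4
```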

Involved in creating end-to-end automation using UNIX shell scripts.

Implementation of Physical modeling on Teradata such as creation of tables, indexes, views, normalization, users, roles and profiles.

Worked on Informatica-Source Analyzer, Warehouse Designer, Mapping Designer & Mapplet, and Transformation Developer.

Developed FastLoad scripts to load data from host files into the landing zone table.

In the initial load process, MLoaded the source file into the Teradata staging table and archived the file into the archive folder after the MLoad completed successfully.

Applied the business transformations using BTEQ scripts.
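As a sketch, a BTEQ transformation step of this kind could look like the following; the table and column names are hypothetical:

```
.LOGON tdprod/etl_user,password;

/* Move cleansed staging rows into the target, applying a simple rule */
INSERT INTO dw.claims_fact (claim_id, member_id, paid_amt)
SELECT claim_id,
       member_id,
       CAST(paid_amt AS DECIMAL(12,2))
FROM   stg.claims_lz
WHERE  paid_amt IS NOT NULL;

.IF ERRORCODE <> 0 THEN .QUIT 8
.LOGOFF;
```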

Created Indexes and written complex SQL statements.

Loaded and unloaded data into and from DB2 tables using File-AID for DB2.

Experience in writing new source code using COBOL/DB2.

Track project plans and progress

Review and approve Test objectives

Review the test results

Perform defect Triaging and Support related to the projects assigned

Managing the Knowledge base activities

Preparations of Pre-delivery & Post-delivery follow up documents

Keeping and maintaining all the documents in the project discovery and SharePoint sites

Created the new database objects like databases, table spaces, tables, index, views, and stored procedures. Altered the attributes of existing database objects.

Application development support on creation of collection id, package, plan creation, bind/rebind, dclgen. Loaded and unloaded the data using IBM load and unload utility.

Worked on writing and modifying native stored procedures as per business requirements.

Performed the unit testing, system testing on stored procedures.

Deployed the database objects from test region to system region/production region.

Experience in writing complex queries using SQL

Providing 24 hour on call support.

Analyzing Client requirements and preparation of low level design document.

Participation in review of functional specification document, design and functional test cases. Provided weekly status report to the client.

Offshore deliveries review to ensure quality and to check compliance with the client coding Standards and guidelines.

Client: Merck, West Point, PA May 2014 - March 2018

Role: Mainframe Developer/DB2 DBA


Responsible for Requirements gathering, Impact analysis, Project estimation, scheduling and tracking the deliverables.

Managed the Change Request procedures for the identified bug fixes and served as the point of contact for all the enhancement installations.

Mentored the offshore team and gave them technical guidance and helped them understand the business.

Responsible for the root-cause analysis, incident research and ticket resolution.

Analysed COBOL-DB2 modules as part of incident research and identified the issues and fixes.

Oversaw the code development and was responsible for all deliverables and interactions with the client.

Executed Easytrieve programs, sort utilities such as SyncSort, and IBM DB2 utilities using JCL, and framed the job flows for batch processing of application data.

Used Endeavor tool to maintain the COBOL modules, JCLs, parms and procs and for promoting the same to higher environments.

Used File-Aid for DB2, QMF and SPUFI for accessing the DB2 database. Created DCLGENs and used them in COBOL modules.

Used NDM and FTP as file transfer protocols between mainframe LPARs and between the mainframe and midrange servers, respectively.

Used CA-7 as the scheduler tool for all the batch flows inside the application.

Created new scripts using REXX and automated processes. Experienced in SPUFI, SDSF, QMF, File-AID, BMC Change Manager, BMC Catalog Manager, and Apptune. Rotational on-call production support.

Instance creation, upgrade, and version migration. Implemented a library of homegrown scripts to automate the scheduling and notification of periodic maintenance jobs, including REORG, BACKUP, RUNSTATS, LOAD, and RECOVERY.
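For illustration, one such scheduled maintenance job could be a DSNUPROC step driving REORG and RUNSTATS; the subsystem, database, and table space names are placeholders:

```jcl
//DBMAINT  JOB (ACCT),'DB2 MAINT',CLASS=A,MSGCLASS=X
//* Weekly reorg + statistics refresh for one table space (names illustrative)
//REORG    EXEC DSNUPROC,SYSTEM=DB2P,UID='REORGJB'
//SYSIN    DD *
  REORG TABLESPACE DBCLAIMS.TSCLAIMS
    SHRLEVEL REFERENCE
  RUNSTATS TABLESPACE DBCLAIMS.TSCLAIMS
    TABLE(ALL) INDEX(ALL)
/*
```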

Created batch jobs and procs to extract programs and move the code to production.

Coded programs that write data to XML files by reading DB2 tables.

Validated XML files against the schema. Provided technical guidance to the team.

Preparation of test plans, test cases, code reviews and delivery documents.

Client: Ameriprise Financial, Minnesota Jan 2011 – Apr 2014

Role: DB2 DBA


Created and managed production emergency fix process on the mainframes.

Executed DB2 reports/utilities for performance/database organization & database growth analysis.

Generated image copy gap reports, took backup copies of table spaces, and alerted the application DBA teams to fill the IC gaps.

Involved in re-engineering the application to identify the highly CPU consuming jobs and improved the performance of those applications.

Responsible for preparing the project estimates and high-level design for the identified requirements.

Prepared the low level design document by logically designing the job flow to satisfy all the requirements.

Created COBOL-DB2 modules wherever required to apply business rules and format data, insert/update or delete from DB.

Executed Easytrieve programs, sort utilities such as SyncSort, and IBM DB2 utilities using JCL, and framed the job flows for batch processing of application data.

Used Changeman tool to maintain the COBOL modules, JCLs, parms and procs and for promoting the same to higher environments.

Mentored the team and peer reviewed the code modules and had the ownership of the application at offshore.

Created native stored procedures that can be executed by web service APIs to fetch, insert, or update data in the DB.
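A minimal sketch of such a native SQL stored procedure; the schema, table, and column names are assumptions, not the actual objects:

```sql
-- Illustrative native SQL procedure: count a member's claims
CREATE PROCEDURE CLAIMS.GET_MEMBER_CLAIM_CNT
  (IN  p_member_id CHAR(18),
   OUT p_claim_cnt INTEGER)
LANGUAGE SQL
BEGIN
  SELECT COUNT(*)
    INTO p_claim_cnt
    FROM CLAIMS.CLAIM_TBL
   WHERE MEMBER_ID = p_member_id;
END
```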

Used BMC tooling with EXPLAIN to analyze queries and optimize them for performance.

Used DB entities like Views, Indexes and DB concepts like Triggers to handle and maintain the data.

Used IceTool, Join Keys concepts in Syncsort utility to compare data chunks in a more efficient manner.

Created and maintained code review checklists for MF part of the application.

Created Integration testing scripts and executed the tests and documented the test results using Quality Center tool.

Supported performance testing team on the data setup and oversaw the performance testing process and responsible for giving the sign off from the development team.

Creation of cards for scheduling the jobs on the CA-7 scheduler.

Responsible for production implementation and install verification, preparation of installation documents.

Expertise in database design, installation, upgrades, configuration, backup, recovery, database security, and query optimization.

Experienced in Performance tuning, Optimization, capacity planning.

Unloaded and loaded databases and collected statistics on them.

Client: Selective Insurance, Glastonbury, CT Nov 2007 – Jan 2011

Role: Mainframe Developer


Analyzed, gathered, and documented requirements by interacting directly with the client; performed physical database design and implementation.

Physical table and index design, referential integrity, check constraints based on application requirement. Test environment set-up, and production deployment

Defined stored procedures and created joins, views, triggers, etc. Experience in JCL and in application program performance monitoring and tuning.

Checked database status, monitored daily and monthly batch cycles, and resolved job abends in the production environment. Managed authorization and privileges for DB2 objects.

Created and altered database objects (tablespaces, tables, indexes, views, stored procedures). Created physical data modeling diagrams using ER/Studio.

Granted the appropriate privileges to the users on the database objects.

Prepared the housekeeping utility jobs (image copy, REORG, and RUNSTATS) for new database objects and released these housekeeping jobs into Control-M.
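One of these housekeeping jobs, sketched as a full image copy step; the data set, database, and table space names are placeholders:

```jcl
//HSKEEP   JOB (ACCT),'IMAGE COPY',CLASS=A,MSGCLASS=X
//* Full image copy of one table space to a new GDG generation (illustrative)
//IC       EXEC DSNUPROC,SYSTEM=DB2P,UID='COPYJOB'
//SYSCOPY  DD DSN=PROD.IC.TSCLAIMS(+1),DISP=(NEW,CATLG,DELETE),
//            UNIT=SYSDA,SPACE=(CYL,(50,10),RLSE)
//SYSIN    DD *
  COPY TABLESPACE DBCLAIMS.TSCLAIMS
    COPYDDN(SYSCOPY) FULL YES SHRLEVEL REFERENCE
/*
```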

Reverse-engineered the database objects into a physical data model using ER/Studio.

Involved in improving the performance of SQL queries.

Deployed the database objects from the test region to the system/production region using BMC Change Manager.

Automation of database maintenance activities in production like finding the image copy gaps and alerts for space threshold.

Handled and administered the large data volumes in production tables.

Storage management including space allocation, space enhancement based on the growth, re-distribution of data or re-designs of database objects based on existing growth pattern or anticipated growth.

Application development support on creation of collection id, package, plan creation, bind/rebind, dclgen.

Technical support to the programmers and users.

Debugged application programs; resolved database errors; supported application errors and poor application performance.

Supported performance issues of the database system.

Worked closely with the development teams and data architects.

Installed and scheduled backups, reorganization, recovery, and check point process.

Coordinated database changes in production to prevent outages, avoid application impacts, and smoothly execute modifications.

Provided 24-hour 7-day production system support to ensure each service level agreement is achieved.

Granting and Revoking Utilities access on Table Spaces.

Performance tuning of the Test and development databases using explain tools.

Scheduled and executed REORG/RUNSTATS on the database tables.
