Post Job Free

ETL Developer

Location:
Tampa, FL
Posted:
May 18, 2023


Resume:

Vijay Kumar Katneni

PROFESSIONAL SUMMARY

Over 16 years of experience in data warehousing (OLAP) using Teradata and Ascential DataStage / IBM Information Server (DataStage V8.0.1 and V8.1), decision support systems, and client/server business systems design, development, testing, and production support.

Excellent knowledge of and experience in the data warehouse development life cycle, dimensional modeling, repository management and administration, and implementation of star and snowflake schemas and slowly changing dimensions.

Strong experience in SAP integration with DataStage using SAP R/3 Pack stages.

Strong experience in Data Warehousing, Data Architecture, Data Modeling, Data Mining, Data Analysis, Decision Support System (DSS) and GUI applications.

Extensive knowledge of writing, testing, and implementing triggers, stored procedures, functions, and packages using PL/SQL.

Designed BO universe on Siebel CRM applications.

Good exposure to developing Server jobs, Parallel jobs, and Shared Containers.

Designed and developed jobs using Parallel Extender to split bulk data into subsets and dynamically distribute them to all available nodes for best job performance.

Experience in using various Server stages like Aggregator, Filter, Merge, Hash, Sort, Universe, Stored Procedure, DB2Load, Sybase Load, Folder, FTP, Pivot, and XML Input/Output.

Familiarity with OLAP reporting and business intelligence tools like Business Objects.

Performed debugging, troubleshooting, monitoring, and performance tuning using DataStage.

Experience with Parallel Extender for parallel processing to improve job performance while working with bulk data sources.

Extensively used SQL, PL/SQL to program Stored Procedures, Functions and database Triggers, Cursors to enforce complex integrity constraints and for auditing purposes.

Strong knowledge of UNIX shell scripting (Korn and C shells), C, and C++.

Exceptional verbal and written communication skills, excellent team player.

PROFESSIONAL EXPERIENCE

Bank Of America

Mar 2022 – Mar 2023

Sr. ETL Datastage Architect

Tampa, FL

Project - Core Customer Data Platform (CCDP) Applications.

Project Description:

As part of this project, worked on the FRE, CDI, and ECP applications.

FRE (Federated Relationship):

This application stores and maintains Customer Relationship Indicators and online document index.

CDI - Customer Data Integration which groups all customer information across all communication channels and lines of business.

ECP - This application stores and maintains customer preferences.

Responsibilities:

As ETL Architect, engaged in discussions with the client for requirements gathering and design of various integration projects involving heterogeneous data sources.

As ETL Architect, designed an Integration Specification document to capture business requirements, to be used as a standard for all EI projects. Also designed an ETL Solution Turnover document as a standard for describing the details of ETL code development.

Created an ETL Standards document including best practices to be followed while developing code.

Created reusable ETL objects, i.e., Shared Containers and multiple-instance jobs.

Achieved job-design reusability using RCP and multiple-instance jobs with schema files.

As ETL developer, used DataStage Designer v8.7 and 11.5 with stages such as Oracle and DB2 Connectors, IDoc Connectors, Data Set, Join, Merge, Lookup, Aggregator, Change Capture, Copy, Filter, Funnel, Remove Duplicates, Pivot, Modify, Sort, and Transformer to implement the ETL functionality.

Created DataStage jobs to connect with real-time processes such as web services and Java objects using the Web Services Transformer stage, Java Client stage, XML/JSON stages, and MQ Connector stage.

Used system variables such as @INROWNUM and @PARTITIONNUM and environment variables such as $APT_DUMP_SCORE and $OSH_PRINT_SCHEMAS, and applied the parallelism techniques (pipeline and partition).
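
A minimal sketch of how such reporting variables are commonly enabled (in a project's default environment or the engine dsenv file); the values shown are typical usage, not taken from this resume:

```shell
# Hypothetical environment setup for a DataStage job run.
# APT_DUMP_SCORE / OSH_PRINT_SCHEMAS are standard parallel-engine
# reporting variables; "True" is the conventional value.

# Dump the parallel-engine "score" (operator/partition layout) to the job log:
export APT_DUMP_SCORE=True

# Print the OSH record schemas for each stage's input/output datasets:
export OSH_PRINT_SCHEMAS=True

echo "APT_DUMP_SCORE=$APT_DUMP_SCORE OSH_PRINT_SCHEMAS=$OSH_PRINT_SCHEMAS"
```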

Extensively used user-defined SQL with complex joins to implement transformations.

New York State of Health (NYSOH)

Apr 2015 – Jan 2022

Sr. ETL Datastage Architect/Lead Developer

NY

The New York State of Health (NYSOH) Health Plan Marketplace, also known as the New York Health Benefit Exchange (NY-HBE), is an online exchange where eligible individuals and small-business employees shop for health insurance coverage. These plans range from the Insurance Affordability Programs (IAPs), which include the Medicaid Managed Care (MMC) and Child Health Plus (CHP) plans, to the Qualified Health Plans (QHPs).

Involved in installing DataStage server versions 9.1.2/11.3/11.5/11.7.

Worked with BAs and the client (NYSOH) to gather business requirements, convert them into technical specification documents, develop the DataStage code, deliver to SIT/UAT, and address defects identified by the test team.

Provided 24/7 production support.

Involved in the DataStage upgrade from V9.1.2 to V11.5.

Involved in DataStage migration when the OS was upgraded from RHEL 5 to RHEL 6.

Automated all DataStage interfaces.

Involved in configuration of Platform LSF 9.1 on Datastage GRID Environment.

Invoked Java code using the Java Client stage and Java Transformer stage.

Developed custom Java objects using the Java API stage.

Involved in QualityStage configuration.

Configured the IIS Admin console.

Involved in creating projects in DataStage, granting proper access to developers, and configuring DataStage to use LDAP/AD authentication.

Worked with IBM through PMRs when product support was needed.

Involved in project requirements gathering and actively participated in the JAD sessions.

Involved in gathering requirements to upgrade the DataStage version and migrate projects across versions.

Worked with IBM to resolve product issues by opening cases (PMRs).

Developed integration solutions to meet the requirements using components of the MuleSoft Anypoint Platform, including Design Center, Anypoint Studio, API Manager, DataWeave, Anypoint MQ, and assets from Anypoint Exchange.

Developed a DataStage process that creates MAEE reports on a weekly basis.

Worked on capturing audit details of all transactions.

Involved in DataStage performance tuning.

Involved in writing shell scripts to SFTP files from one server to other servers.
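
A sketch of the kind of SFTP push script described here; the host, directories, and file pattern are placeholders, not details from this project:

```shell
#!/bin/sh
# Hypothetical SFTP push: build a batch file so sftp can run
# non-interactively (key-based authentication assumed).

SRC_DIR=${SRC_DIR:-/tmp/outbound}                 # local staging directory
REMOTE_HOST=${REMOTE_HOST:-etl-target.example.com} # placeholder host
REMOTE_DIR=${REMOTE_DIR:-/inbound}                # placeholder remote path
BATCH=/tmp/sftp_batch.$$

mkdir -p "$SRC_DIR"

# Write one "put" per data file into the batch file.
{
  echo "cd $REMOTE_DIR"
  for f in "$SRC_DIR"/*.dat; do
    [ -e "$f" ] && echo "put $f"
  done
  echo "bye"
} > "$BATCH"

# In a real run the batch would then be executed:
#   sftp -b "$BATCH" "$REMOTE_HOST" || exit 1
cat "$BATCH"
```

In practice such a script is typically driven from a job sequence's Execute Command activity or from cron after the extract jobs finish.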

Involved in the complete software development life cycle (SDLC).

Involved in designing and developing ETL frame work and development.

Involved in troubleshooting ETL issues.

Involved in ETL testing.

Environment: IBM Information Server DataStage 9.1/8.7/11.5/11.7, IBM Platform LSF load balancing, Red Hat Linux 5/6/7, DB2 10.1/10.5, Oracle 11/12.1, Autosys, TWS, Java, MuleSoft

AT&T

Jan 2015 – Apr 2015

Sr. DataStage Architect/Lead/Admin

NJ

Responsibilities

Involved in the ETL design and development framework.

Upgraded IBM Information Server from 7.1 to 8.5 in a Solaris environment.

Involved in SAP R/3 Pack installation for both versions 6.0 and 7.5.

Designed SAP jobs using the IDoc, ABAP, and BAPI stages.

Involved in creating IDocs using the IDoc Connector stage.

Involved in fix pack installation for DataStage.

Involved in WAS upgrade.

Involved in ETL integration testing.

Configured high availability for DR.

Configured DataStage for LDAP authentication.

Created new DataStage projects and file systems and made changes to the DataStage configuration.

Assigned proper permissions to developers and testers.

Migrated projects from 7.1 to IS8.5 GRID.

Wrote UNIX scripts and routines so that project teams could count lines in a file in a trigger condition, call C programs in the Execute Command stage, and get parameters in user-defined stages.
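
The line-count helper feeding a sequence trigger condition can be sketched as follows (the file name is illustrative):

```shell
#!/bin/sh
# Hypothetical helper: count lines in a feed file so a job-sequence
# trigger expression can branch on the result (e.g. LINES > 0).

count_lines() {
    # Redirecting into wc -l avoids the filename in the output;
    # tr strips the leading spaces some platforms print.
    wc -l < "$1" | tr -d ' '
}

# Demo input: a three-line file.
printf 'a\nb\nc\n' > /tmp/feed_file.txt

LINES=$(count_lines /tmp/feed_file.txt)
echo "$LINES"
```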

Exposed DataStage services as web services (Java).

Involved in creating users, maintaining user groups and assigning roles.

Scheduled job runs using DataStage Director and UNIX crontab, and used Director for debugging and testing.
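
A sketch of a cron entry driving a DataStage job through the standard dsjob CLI; the project/job names and install path are placeholders:

```shell
#!/bin/sh
# Hypothetical cron scheduling of a DataStage job. dsjob is the
# standard engine command line; -run -wait blocks until the job ends.

DSHOME=${DSHOME:-/opt/IBM/InformationServer/Server/DSEngine}
PROJECT=MYPROJ     # placeholder project
JOB=DailyLoad      # placeholder job

# Run at 02:15 daily; source dsenv first so the engine environment is set.
ENTRY="15 2 * * * . $DSHOME/dsenv && $DSHOME/bin/dsjob -run -wait $PROJECT $JOB"

# Installing it would look like (commented out in this sketch):
#   ( crontab -l 2>/dev/null; echo "$ENTRY" ) | crontab -
echo "$ENTRY"
```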

Worked on the change management process within Guitar Center for changes to production. Involved in the disaster recovery tests within Guitar Center.

Debugged DataStage server/client connectivity problems and kept DataStage tuned for optimum performance.

Built Wrapped and Custom stages for project teams.

Monitored the logs for the Daily Exports and Backups.

Involved in scheduling the various process streams using TWS.

Involved in Parallel Extender node configuration for various projects

Made extensive use of DataStage Director for monitoring job logs to resolve issues.

Used the Performance Analyzer and Resource Estimator to estimate and monitor the performance of stages/jobs.

Created PMRs and worked with IBM SWG for product support.

Provided on-call support on a 24/7 basis.

Used impact analysis to identify which jobs/stages use a particular piece of metadata.

Environment: IBM Information Server DataStage 7.1/8.7, Sun Solaris, SAP R/3 Pack, DB2, Oracle, SQL Server

LLBean

Jan 2014 – Dec 2014

Sr ETL Admin/Developer

ME

Description:

LLBean is a leading brand in apparel and outdoor activities equipment. LLBean decided to bring its inventory and sales systems from legacy systems like the mainframe into current technologies. The goal of this project is to move their operations to SAP and JDA using middleware tools such as DataStage and WMB.

Datastage Admin Tasks

Maintained all ETL servers for all environments and provided a common framework for projects, reducing their effort in maintaining the environment.

Built projects for the teams so they could use already-structured directories and sub-directories. Also rebuilt corrupted projects by running a few steps and importing from the daily backups.

Migrated projects from 8.1 to IS8.7 GRID.

Wrote UNIX scripts and routines so that project teams could count lines in a file in a trigger condition, call C programs in the Execute Command stage, and get parameters in user-defined stages.

Involved in creating users, maintaining user groups, and assigning roles.

Scheduled job runs using DataStage Director, and used Director for debugging and testing.

Involved in WAS upgrade.

Configured high availability for DR.

Configured DataStage for LDAP authentication.

Created DataStage packages using Information Server Manager.

Worked on the change management process within Guitar Center for changes to production. Involved in the disaster recovery tests within Guitar Center.

Debugged DataStage server/client connectivity problems and kept DataStage tuned for optimum performance.

Built Wrapped and Custom stages for project teams.

Monitored the logs for the Daily Exports and Backups.

Involved in scheduling the various process streams using TWS.

Involved in writing shell scripts for FTP/SFTP file transfers.

Involved in executing jobs on demand at the request of business users.

Involved in creating master sequences.

Datastage Development Tasks

Participated in the design review meetings to come up with the proper methodology that is to be followed uniformly across the development.

Created IDocs using the IDoc Connector.

Designed SAP jobs using the IDoc, ABAP, and BAPI stages.

Collaborated with project team members in providing design and development guidance, mentoring, best practices.

Implemented data extraction and load processes in a parallel framework.

Created a schema file to dynamically pass metadata to the Sequential File stage.
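
A sketch of such a schema file; the field names and record properties are illustrative only, following the standard DataStage `record(...)` schema-file syntax that a Sequential File stage (with runtime column propagation) reads at run time:

```shell
#!/bin/sh
# Generate a hypothetical schema file for a comma-delimited feed.

cat > /tmp/customer.schema <<'EOF'
record
  {final_delim=end, delim=',', quote=double}
(
  cust_id: int32;
  cust_name: string[max=50];
  created_dt: nullable date;
)
EOF

cat /tmp/customer.schema
```

The job then points the stage's Schema File property at this path, so the same generic job can process different layouts without recompilation.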

Created Shared Container to encapsulate the logic that is commonly used across the project.

Created ETL/DI designs for various source and target databases.

Created a UNIX flow to unload data from Teradata and load it into a SQL Server database.

Experienced in working with Teradata Parallel Transporter (TPT).

Created the multi-instance Generic Load Parallel Job that is called in the Job Sequence to load the data into the Teradata Database.

Extracted data from IDocs and loaded IDocs into the SAP system using the IDoc Connector in the SAP R/3 Pack.

Extracted data from SAP Tables by using ABAP stage.

Worked with widely used stages like Flat File, Lookup, Join, Pivot, Transformer, Sort, Aggregator, Merge, Row Generator, and Column Generator, and also troubleshot the designed jobs and tested them for logical errors.

Created XML files using the new XML stage.

Environment: IBM Information Server DataStage 8.1/8.7/9.1, IBM Platform LSF load balancing, DB2 V8, SAP R/3/ABAP, Solaris, UNIX AIX 6, shell scripts, PUTTY, TOAD, TWS, MQ, SQL Server

IBM- Blue Harmony

Mar 2010 – Dec 2013

Sr. DataStage Admin/Developer

CT

Description:

IBM’s major transformation program to radically simplify the design and operation of three globally integrated support processes: Finance, Opportunity to Order, and Order to Cash. Blue Harmony is part of IBM's ongoing evolution toward the Globally Integrated Enterprise (GIE) and demonstrates IBM's strategy in action. A long journey on Blue Harmony gave me opportunities to sharpen my skills in various roles and responsibilities.

Datastage Admin Tasks

Upgraded IBM Information Server from 8.1 to 8.7 in an AIX environment.

Involved in SAP R/3 Pack installation for both versions 6.0 and 7.5.

Involved in fix pack installation for DataStage.

Involved in WAS upgrade.

Configured high availability for DR.

Configured DataStage for LDAP authentication.

Created new DataStage projects and file systems and made changes to the DataStage configuration.

Assigned proper permissions to developers and testers.

Migrated projects from 8.1 to IS8.7 GRID.

Wrote UNIX scripts and routines so that project teams could count lines in a file in a trigger condition, call C programs in the Execute Command stage, and get parameters in user-defined stages.

Involved in creating users, maintaining user groups and assigning roles.

Scheduled job runs using DataStage Director and UNIX crontab, and used Director for debugging and testing.

Worked on the change management process within Guitar Center for changes to production. Involved in the disaster recovery tests within Guitar Center.

Debugged DataStage server/client connectivity problems and kept DataStage tuned for optimum performance.

Built Wrapped and Custom stages for project teams.

Monitored the logs for the Daily Exports and Backups.

Involved in scheduling the various process streams using TWS.

Involved in Parallel Extender node configuration for various projects

Made extensive use of DataStage Director for monitoring job logs to resolve issues.

Used the Performance Analyzer and Resource Estimator to estimate and monitor the performance of stages/jobs.

Created PMRs and worked with IBM SWG for product support.

Provided on-call support on a 24/7 basis.

Used impact analysis to identify which jobs/stages use a particular piece of metadata.

Datastage Development Tasks

Involved in code promotion process.

Designed parallel jobs using stages such as Join, Merge, Lookup, Remove Duplicates, Copy, Filter, Funnel, Dataset, Lookup, Pivot, Sort, Surrogate key Generator, Change Data Capture, Modify, Row Generator and Aggregator.

Worked with the Business analysts and Subject Matter Experts to thoroughly understand the different business processes and data at various sources.

Documented ETL high level design and detail level design documents.

Involved in creating functional and scope documentation for data cleansing, conversion, and integration processes.

Participated in the review of technical and business transformation requirements documents. Developed jobs using QualityStage to strip out unwanted data.

Investigated and fixed duplicate data records making their way into the landing tables.

Involved in creating Message Queues to extract the data from Mainframes using MQ Connector stage.

Created IDocs using the IDoc Connector.

Designed SAP jobs using the IDoc, ABAP, and BAPI stages.

Involved in creating DataStage jobs to extract/load IDocs from/to SAP using the IDoc stage.

Involved in designing DataStage jobs that use the BAPI stage to determine whether an SAP process completed successfully.

Involved in scheduling the various process streams using TWS.

Involved in writing shell scripts for FTP/SFTP file transfers.

Involved in executing jobs on demand at the request of business users.

Involved in creating master sequences.

Environment: IBM Information Server DataStage 8.1/8.7, AIX, SAP R/3 Pack, DB2, ClearQuest, ClearCase, and Build Forge

Astellas Pharma Inc

Nov 2009 – Mar 2010

Sr.ETL Developer/Admin

IL

Responsibilities

Astellas is a Japan-based pharmaceutical company. The project, named ASPEN, is meant for aggregating the expense data reported to the federal government and health care professionals.

Datastage Development Responsibilities:

Used Parallel Extender for partition parallelism, in which the same job effectively runs simultaneously on several processing nodes, each handling a separate subset of the total data.

Designed several parallel jobs using Sequential File, Dataset, Join, Merge, Lookup, Remove duplicates, Funnel, Filter, Copy, Column Generator, Peek, Modify, Oracle Enterprise, Aggregator, Transformer Stages.

Responsible for FTPing the files produced to the respective consumer servers.

Analyzed the performance of the jobs and the project, and enhanced performance using standard techniques.

Created job sequences and schedules for automation.

Used DataStage to transform the data through multiple stages, and prepared documentation.

Documented the purpose of each mapping so the process could be understood and changes incorporated as needed, then sent it on to the testing and production environments.

Involved in the development and implementation of ad-hoc reporting system using Business Objects.

Used Join/Merge/Lookup Stages to replicate integration logic based on the volumes of the data.

Responsible for unit, system, and integration testing; developed test scripts, test plans, and test data.

Designed the logical and physical data warehouse schema. Analyzed source systems and created mappings to the target database schema.

Involved in writing shell scripts to FTP the files to different servers.

Involved in gathering the requirements.

Involved in writing functional and technical specifications.

Involved in SIT and UAT to check data consistency.

Datastage Admin Tasks

Involved in designing POCs.

Involved in converting server jobs into parallel jobs.

Involved in installing fix packs on the server side on IIS 8.

Upgraded IBM Information Server from 7.5.1A to 8.1 in a Solaris environment.

Created new DataStage projects and file systems and made changes to the DataStage configuration.

Assigned proper permissions to developers and testers.

Monitored and administered DataStage server performance and usage.

Implemented the various patches on the system to ensure its effective working.

Involved in Parallel Extender node configuration for various projects.

Made extensive use of DataStage Director for monitoring job logs to resolve issues.

Involved in creating users, maintaining user groups, and assigning roles.

Extensively used DataStage tools like InfoSphere DataStage Designer and Director for developing jobs and viewing log files for execution errors.

Used the Performance Analyzer and Resource Estimator to estimate and monitor the performance of stages/jobs.

Automated and maintained backups of DataStage projects and restored them in emergencies.

Used impact analysis to identify which jobs/stages use a particular piece of metadata.

Environment: IBM Information Server Datastage8.1, Siebel, Oracle10g, Linux, Shell Scripts, PUTTY, TOAD

HHSC (Health & Human Services commission)

May 2009 – Nov 2009

Sr.Datastage Developer/Admin

Austin, TX

Tiers (Texas Integrated Eligibility Redesign system)

TIERS is a Texas state government project designed to better serve senior citizens. The project was originally developed in Informatica and was converted to DataStage because Informatica could not handle the huge data volumes.

Datastage Development Responsibilities:

Involved in designing the POCs.

Helped the team convert Informatica mappings into DataStage jobs.

Involved in the DataStage V8.1 installation.

Involved in troubleshooting aborted DataStage jobs.

Involved in Data modeling.

Involved in writing the shell scripts.

Involved in writing documents, i.e., functional and technical specifications.

Involved in DataStage job performance tuning.

Involved in creating DataStage template jobs for identifying deltas.

Involved in installing/upgrading the Information Server product suite on different tiers.

Involved in creating/deleting projects and creating/modifying environment parameters, including multiple APT_CONFIG_FILEs.

Unlocked, listed, and removed DataStage jobs, removed VOC entries, and configured and verified environment variable settings in DataStage Administrator.

Imported and exported DataStage code and changed the compile trace mode option using DataStage Manager.
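
The routine dsjob administration commands these bullets refer to can be sketched as follows; project and job names are placeholders, and the commands are built as strings here rather than executed:

```shell
#!/bin/sh
# Hypothetical day-to-day dsjob administration commands.

DSHOME=${DSHOME:-/opt/IBM/InformationServer/Server/DSEngine}
PROJECT=MYPROJ   # placeholder project
JOB=DailyLoad    # placeholder job

LIST_JOBS="$DSHOME/bin/dsjob -ljobs $PROJECT"            # list jobs in a project
JOB_INFO="$DSHOME/bin/dsjob -jobinfo $PROJECT $JOB"      # status of one job
LOG_SUMMARY="$DSHOME/bin/dsjob -logsum $PROJECT $JOB"    # summarized job log

for c in "$LIST_JOBS" "$JOB_INFO" "$LOG_SUMMARY"; do
  echo "$c"
done
```

Unlocking jobs and clearing stale VOC entries, by contrast, is usually done from the engine shell (DS.TOOLS) rather than from dsjob.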

Used both pipeline parallelism and partition parallelism to improve performance.

Designed XML stages to read XML log files to capture DataStage job audit data.

Datastage Admin Tasks:

Involved in installing DataStage and changing license details.

Upgraded DataStage from version 7.5 to 8.1.

Used IBM ClearQuest and PVCS release management for DataStage code releases.

Managed the entire ETL process, involving the access, manipulation, analysis, interpretation, and presentation of information from both internal and secondary data sources to customers in the sales area.

Raised PMRs with IBM for several config/server issues.

Involved in the creation of Oracle tables, table partitions, and indexes. Involved in the development and testing of individual data marts and update processes.

Added, changed, and deleted DS user variables, and unlocked, listed, and removed DataStage jobs using DataStage Administrator.

Deployed and compiled DataStage jobs.

Shut down and brought up the DataStage and RTI servers.

Purged the log files.

Environment: IBM Information Server DataStage 8.1, Informatica PowerCenter V8.5.1, Oracle 10g, DB2 V8, PL/SQL, AIX 5.2, shell scripts, Erwin 5.5, PUTTY, TOAD, ClearQuest 7.0.1.0, and ClearCase 7.0.1.0

California State University, Office Of The Chancellor

Dec 2008 – Mar 2009

Solution Architect

Long Beach, CA

Responsibilities

Finance

Finance is California State University's internal project to create a centralized data warehouse for its 24 campuses. The campuses have different data sources, such as PeopleSoft and Oracle. DataStage is used to extract data from the PeopleSoft database and load it into an Oracle 10g database, the centralized data warehouse.

Datastage Development Responsibilities:

Created a template job for identifying deltas in an efficient way.

Involved in the DataStage V8.1 installation and in creating/deleting projects.

Helped the team convert server jobs into parallel jobs.

Load data into dimension tables, generating surrogate keys for dimension tables and load dimensional data into fact table with some business transformations.

Developed Data stage parallel jobs using various stages.

Exported and imported DataStage jobs across environments.

Migrated the project from version 7.5.1A to version 8.1.

Involved in writing functional and technical specifications.

Involved in creating job sequences.

Involved in writing shell scripts.

Involved in project planning.

Environment: IBM Information Server DataStage 8.1, Ascential DataStage 7.5.1A, Oracle 10g, DB2 V8, PL/SQL, PeopleSoft, Linux, shell scripts, Erwin 5.5, Golden Guddy, PUTTY, TOAD

Chevron

Jan 2008 – Dec 2008

Sr.DataStage Admin/Developer

San Ramon, CA

ORCA

Chevron is one of the major oil companies in the world. The purpose of the project is to automate the flow from POS (Point of Sale) to BOS (Back Office System). There are different interfaces that transfer data from the BOS to vendors and from vendors to the BOS. DataStage is used to integrate these interfaces and load data from the BOS to vendors and vice versa.

Datastage Development Responsibilities:

Implemented migration process from different systems.

Extensively used DataStage to load data from DB2 and mainframe COBOL files into Oracle.

Created a number of staging jobs to load the data into target tables.

Created a number of complex ETL jobs in this life cycle.

Involved in migration, error handling and reporting, and data quality issues.

Extensively used stages like DB2, Flat File, Lookup, Join, Pivot, Transformer, Sort, Aggregator, Merge, Copy, Row Generator, Column Generator, and CDC.

Troubleshot the designed jobs and tested them for logical errors.

Involved in code reviews.

Supported every available report using universes created in Business Objects XIR2.

Analyzed the data extraction from the source to check for data irregularities and identified dirty data so suitable transformations could be applied in the job mappings.

Designed, built, and maintained Business Objects universes.

Involved in preparing test cases and test scripts, followed by executing them.

Worked on the export and import of DataStage jobs.

Used different partitioning methods such as Auto, Hash, Same, and Entire.

Involved in unit testing and system testing to check data consistency.

Wrote Oracle PL/SQL queries for data validations and other transformations.

Prepared all documents necessary for knowledge transfer, such as ETL strategy, ETL development standards, and ETL processes.

Developed SQL scripts to augment the ETL process and to check the final target data.

Involved in program specification document preparation.

Developed DataStage parallel jobs using various stages.

Involved in the creation of sequencers using DataStage Designer.

Involved in the development of DataStage parallel jobs using various stages and sequencers, and prepared scripts to run them.

Involved in writing shell scripts to extract or load files on different servers.

Worked on importing and cleansing data from various sources such as DB2, Oracle, and flat files onto SQL Server 2005 with high data volumes.

Designed ETL and ELT jobs based on the business requirements using DataStage Designer.

Prepared and reviewed the program specification document; developed, reviewed, and tested the ETL; and performance-tuned the system.

Extensively involved in data extraction, transformation, and loading (the ETL process) from source to target systems using DataStage Parallel Extender.

Created jobs to extract data from different source files and transform it using stages such as the Transformer stage; the Aggregator stage, to classify rows from a single input link into groups; the Remove Duplicates stage, to remove duplicate rows; the Sort stage, to sort the data into a particular order; the Copy stage, to copy data onto more links; the Modify stage, to alter the record schema of its input data set; the Filter stage, to filter the data; and the Join stage; and then load the data warehouse.

Datastage Admin Tasks:

Created projects using DSAdmin.

Involved in configuring the DataStage server for I/O usage.

Created file systems for the projects and assigned proper permissions to developers and testers.

Involved in converting Perl scripts into shell scripts and writing new shell scripts.

Migrated/exported DataStage code using DataStage Manager.

Ran, stopped, and scheduled DataStage jobs, viewed their logs and saved them to the local machine using DataStage Director, and listed, unlocked, and deleted jobs using DSAdmin.

Created the APT configuration file for the project.
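
A minimal sketch of a two-node APT configuration file of the kind described; the hostname and resource paths are placeholders:

```shell
#!/bin/sh
# Write a hypothetical 2-node parallel-engine configuration file.

cat > /tmp/2node.apt <<'EOF'
{
  node "node1"
  {
    fastname "etlhost"
    pools ""
    resource disk "/data/ds/disk1" {pools ""}
    resource scratchdisk "/data/ds/scratch1" {pools ""}
  }
  node "node2"
  {
    fastname "etlhost"
    pools ""
    resource disk "/data/ds/disk2" {pools ""}
    resource scratchdisk "/data/ds/scratch2" {pools ""}
  }
}
EOF

cat /tmp/2node.apt
```

A project (or an individual job) then selects this file through the $APT_CONFIG_FILE parameter, which is how degree of parallelism is tuned without changing job designs.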

Worked with the team in setting up new projects.

Used DataStage Administrator to assign privileges to users or user groups (to control which DataStage client applications or jobs they can see or run); to move, rename, or delete projects; and to manage the publishing of jobs from development to production status.

Environment: Ascential DataStage 7.5.1A, Business Objects XIR2, Oracle 10g, PL/SQL, Mainframes, COBOL, JCL, DB2, SQL Server 2005, Linux, Cognos, shell scripts, Perl, crontab, WS_FTP, PUTTY, TOAD, OLAP, Visual Studio, manual testing, Microsoft SQL Server Management Studio, MS Access, and VSS

Southern California Edison (SCE)

June 2007 to Dec 2007

Sr. Datastage Developer/Administrator

Rosemead, CA

Market Redesign and Technology Upgrade (MRTU)

Southern California Edison (SCE) intends to procure and commission new market applications for the Power Procurement Business Unit (PPBU) in preparation for the California Independent System Operator (CAISO) Market Redesign & Technology Upgrade (MRTU) program. SCE has determined that a number of its existing applications or services need to be replaced or modified to operate under MRTU. Additionally, in order to reduce PPBU operational cost and to enhance PPBU's ability to comply with standards of quality and performance, the integration technologies around these applications need to be improved.

Finally, in order to allow PPBU to continue to operate at lowest cost and meet standards and to be able to continuously improve its performance, the applications and integration technologies need to support planned and potential future business process changes and adaptation to new market protocols in a cost effective manner.

Datastage Development Responsibilities:

Worked closely with business analysts in requirements gathering, reviewing business rules and identifying data sources.

Generated HLD, LLD, Source to Target mapping, test cases and miscellaneous documents.

Performed analysis and designed the business flow and mapping jobs.

Created jobs to interact end-to-end with the given data sources.

Created ETL jobs and deployed them as web services in order to communicate with the internal systems.

Defined FTP scripts for the file-moving process from mainframes.

Moved files with FTP utilities and archived them on a UNIX server.

Responsible for Extracting,


