
Ab Initio Lead/Architect

Location:
Dallas, TX
Posted:
November 09, 2020

Contact this candidate

Resume:

RAJIB SARKAR

469-***-**** / adhn0h@r.postjobfree.com

SUMMARY

Over 23 years of total IT experience in analysis, architecture, design, development, testing, implementation, maintenance and documentation of various applications in all phases of project life cycle.

16+ years of experience in the field of Data warehousing/Data Integration implementing ETL based solutions.

18+ years of hands-on experience with the Ab Initio ETL tool, including data mapping and extracting, transforming and loading data in complex, high-volume environments.

Profile and analyze simple-to-complex data usage using the Ab Initio Data Quality Environment (DQE) or Data Profiler.

Hands-on experience with the Ab Initio DQE, ACE, BRE, Metadata Hub, Express>It and Query>It tools.

3+ years of hands-on experience with the IBM DataStage ETL tool.

Expertise in UNIX shell scripting and Oracle PL/SQL.

1+ year of hands-on experience with the Informatica ETL tool and Microsoft SSIS.

7+ years of experience in team management leading teams of various sizes (largest team size 20) from requirement gathering through to implementation.

Visualizing data using Tableau.

Performance tuning of complex Ab Initio graphs, following the Ab Initio best-practice approach to implement ETL processes.

Configured graph parameters, sandbox parameters, environment variables and the EME metadata repository environment for production/development, and performance-tuned load graph processes.

Involved in data migration programs between databases such as Oracle, SQL Server, DB2 and Teradata, as well as PC file formats, and added enhancements to existing applications to meet regulatory requirements.

Proficient with various Ab Initio parallelism techniques and implemented Ab Initio Graphs using Data parallelism and MFS techniques.

Experienced with job scheduling tools such as Control-M, Autosys, Dollar Universe and the Ab Initio Operational Console.

Experienced in writing complex SQL queries to perform back-end and database testing of Oracle databases.

Experience and proficiency with tools like JIRA/Confluence (for Asset Repository and Incident Management), SharePoint and Remedy (for Change Management).

Experience in testing RDBMS database applications on Oracle, DB2, Teradata, SQL Server, Netezza and MS Access.

In-depth knowledge of Big Data technologies, including Hadoop query languages such as Pig and Hive, along with HDFS-based computing frameworks such as Spark.

Good Communication Skills and Strong interpersonal skills to deal effectively with a broad range of contacts from technical staff, to clients, to management.

Exceptional focus, follow-through and coordination skills. Proven ability to manage project schedules, and known for working well with cross-functional teams to achieve on-time project completion.

Strong problem-solver who can design solutions and assist developers with issues.

Quick learner. Can pick up new concepts and technology very quickly and apply them effectively at work.

Experience working as a Technical Lead and Business Analyst in Telecom, Banking and Travel areas.

Strong Interpersonal and communication skills, ability to work in a team as well as independently with minimal supervision

Very methodical and well organized, with a strong focus on results.

Sound experience in system analysis, design and development using Waterfall and Agile methodologies.

Good understanding of design patterns

Motivated, punctual and trustworthy, with strong analytical skills; a capable and resourceful team member.

EDUCATION

B.E. (Bachelor of Engineering), Bengal Engineering College (Sibpur), India, 1997

SKILLS

ETL Tools

Ab Initio, Informatica, DataStage

Scripting languages

Korn shell, PL/SQL

Databases

Oracle, DB2 UDB, Teradata, SQL Server, Netezza

Languages

SQL, Visual Basic, Visual C# (C Sharp)

Packages

Oracle Forms and Reports, SQL*Loader, PL/SQL Developer

Operating Systems

UNIX, AIX, Sun Solaris, Windows 98/NT 4.0/2000/XP

Other Software

Tableau, Microsoft Office, Microsoft Visio, Microsoft Project, Siebel CRM

Domain Knowledge

Telecom, Banking, Retail, Insurance and Aviation Maintenance

EXPERIENCE

April 2019 – Present Technical Lead, Southwest Airlines, Dallas, TX, USA

The main goal of the Maintenance and Engineering (M&E) Mosaic Data Migration project is to migrate data correctly from the legacy systems, TRAX and WIZARD, to IFS Maintenix™, an industry-leading aviation maintenance and engineering system. The business benefits of the project include:

Ability to order and manage materials and tools, build and maintain the Southwest aircraft configuration, customize and manage the maintenance program, and author task cards to perform maintenance on the aircraft fleet using the new application.

Consolidate Airlines’ M&E applications.

Standardize the M&E application to satisfy government standards for new aircraft.

Allow airlines to decommission their legacy M&E systems (e.g. TRAX, WIZARD).

Responsibilities:

Carry out a system ‘deep-dive’ review and analysis to develop source-to-target mapping documents, which are used to build the data migration code.

Profile and analyze Source data from multiple strategic sources using the Ab Initio ETL Data Quality Environment (DQE) tool

Identify data migration themes and quality patterns present in the data, and provide a summary supporting the empirical findings using Tableau.

Implement business and data quality rules using Express>It framework.

Communicate and engage with Client Business and Client Vendor(s) Technical resources.

Define ETL Strategy

Lead the team technically

Develop Ab Initio code and required scripts that will extract data from legacy source systems, apply the business rules, transform into target format and load them into target systems.

Environment:

Ab Initio (GDE 3.3.4 Co>Operating System 3.3.4), Acquire>IT, Conduct>IT, MSSQL 9.0, UNIX Shell Script

October 2018 – March 2019 Technical Architect/Lead, Wells Fargo, Fremont, CA, USA

The primary purpose of the project is to ingest data from different source systems into a Hadoop enterprise data lake through Ab Initio data control tools: a set of processes, controls and disciplines that enable optimum and effective management, consumption and distribution of data, with the objective of ensuring that quality, secure, “fit for purpose” data is available for business decisions, analytics and reporting needs across Wells Fargo. The data control tool consists of two key components: a) Acquire>IT and b) wrapper plans.

Acquire>IT

Provides a GUI interface with pre-built templates for developers/analysts to configure data sources for ingestion into the data lake/targets.

Acquire>IT also has a Business Rule Environment where custom mapping/business rules can be written.

Wrapper Plans

Ab Initio Conduct>IT plans, which provide a workflow to run the underlying Acquire>IT configurations and other graphs/scripts.

Responsibilities:

Develop code using Ab Initio GDE and Acquire>IT to ingest data from different sources into the Hadoop data lake.

Develop Wrapper Plans using Ab Initio Conduct>IT

Write SQL queries to extract data from SQL server

Develop generic Ab Initio graphs which will collect and publish both operational and technical metadata.

Profile and analyze source data from multiple strategic sources.

Develop UNIX shell wrapper scripts and support the testing effort.

Lead the team technically.

Interact closely with architects, data modelers and business analysts to understand the requirements.

Environment:

Ab Initio (GDE 3.2.5.1 Co>Operating System 3.2.5), Oracle (Version 11g), UNIX Shell Script

February 2017 – September 2018 Technical Lead, Southwest Airlines, Dallas, TX, USA

The main goal of the Maintenance and Engineering (M&E) Mosaic Data Migration project is to migrate data correctly from the legacy systems to the Maintenix target application and to set up the new application. Maintenix gives Southwest Airlines the ability to order and manage materials and tools, build and maintain the Southwest aircraft configuration, customize and manage the maintenance program, and author task cards to perform maintenance on the aircraft fleet.

Responsibilities:

Carry out a system ‘deep-dive’ review and analysis to develop source-to-target mapping documents, which are used to build the data migration code.

Profile and analyze Source data from multiple strategic sources using the Ab Initio ETL Data Quality Environment (DQE) tool

Identify data migration themes and quality patterns present in the data, and provide a summary supporting the empirical findings using Tableau.

Implement business and data quality rules using Express>It framework.

Communicate and engage with Client Business and Client Vendor(s) Technical resources.

Define ETL Strategy

Lead the team technically

Develop Ab Initio code and required scripts that will extract data from legacy source systems, apply the business rules, transform into target format and load them into target systems.

Environment:

Ab Initio (GDE 3.3.4 Co>Operating System 3.3.4), Acquire>IT, Conduct>IT, MSSQL 9.0, UNIX Shell Script

March 2014 – January 2017 ETL Lead, Heartland Payment Systems, Plano, TX, USA

Project #1

The project aims to build an Operational Data Store (ODS), designed to integrate data from multiple sources for additional operations on the data, such as reporting and data analysis. Data from the different sources is replicated into a single location using an ETL process. The ETL programs have the following main functionalities:

Be able to extract data from different kinds of sources: structured data such as RDBMS tables, or unstructured data such as logs, recordings, documents, etc.

Can be configured to replicate data on a near-real-time basis (e.g. every 5 minutes) or in batch mode (e.g. every 4 hours).

Should not lose any data while replicating
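The two cadences above (near-real-time polling vs. a single batch run) can be sketched as a small driver script. This is a hypothetical illustration, not project code: `run_extract` stands in for the real extract/replicate step, and all names and intervals are assumptions.

```shell
#!/bin/sh
# Hypothetical sketch of a configurable replication driver (not project
# code). MODE=realtime polls on a short interval; MODE=batch runs once.
MODE=${1:-batch}        # "realtime" or "batch"
INTERVAL=${2:-300}      # 300 s = the 5-minute near-real-time cadence

run_extract() {
    # Stand-in for the real extract/replicate step (an Ab Initio graph
    # or database unload in the actual project).
    echo "replicating at $(date +%H:%M:%S)"
}

if [ "$MODE" = "realtime" ]; then
    while :; do
        run_extract || exit 1   # stop on failure so no data is silently lost
        sleep "$INTERVAL"
    done
else
    run_extract                 # single batch run
fi
```

Stopping on the first failed extract, rather than skipping ahead, is one simple way to honor the "must not lose data" requirement: the scheduler sees a non-zero exit and no window of data is silently dropped.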

Project #2

The project builds the ETL process that feeds “Fractals”, the fraud-scoring platform used by the company’s Loss Prevention department. Fractals and this ETL process are the company’s primary line of defense against transferring monies in fraudulent transactions; an outage of this ETL process would immediately cripple Fractals’ ability to identify and act on suspicious payment activity, exposing the company to the risk of severe financial losses.

This ETL process provides Fractals with specific details of each payment card transaction serviced by the company, in near-real-time. This data stream includes all facts Fractals requires to judge the legitimacy of every transaction, including payment card particulars, dollar amounts, dates and times, terminal and batch information, processing and result codes, etc.

Responsibilities:

Analysis, Design and development of business requirements

Interact closely with Architects, Data Modelers and Business Analysts during design sessions

Develop custom specific and generic graphs to load and unload data from Oracle/Netezza databases

Design, code and test complex ETL application programs using the DataStage and Ab Initio tools

Performance-tune Ab Initio graphs and processes, and implement Ab Initio best practices throughout the development phase

Develop UNIX shell wrapper scripts and support the testing effort

Test and enhance the application by developing automated test scripts, product deployment, methodologies, application support and bug fixing.

Work with production support team to execute and monitor the production jobs through scheduler

On-call support

Guide and help other employees of the organization learn Ab Initio and its best practices.
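A generic wrapper of the kind referred to in the responsibilities above might look like the following sketch. Every name and path here is illustrative, not from the actual project; the real wrappers invoked deployed Ab Initio graph scripts and were driven by the enterprise scheduler.

```shell
#!/bin/sh
# Illustrative UNIX wrapper-script sketch (names/paths hypothetical):
# log the start and end of a job, run it, capture the exit status, and
# propagate failure so the scheduler (Control-M/Autosys) can alert.
run_job() {
    job=$1
    log=/tmp/${job}.log
    echo "$(date '+%Y-%m-%d %H:%M:%S') START $job" >> "$log"
    true                       # stand-in for the deployed graph's .ksh script
    rc=$?
    echo "$(date '+%Y-%m-%d %H:%M:%S') END $job rc=$rc" >> "$log"
    return $rc
}

run_job sample_graph           # hypothetical job name
```

Returning the job's own exit status (rather than always exiting 0) is the detail that lets the scheduler distinguish a clean run from a failure and page on-call support.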

Environment:

DataStage (version 8.5), Ab Initio (GDE 3.2.4.2 Co>Operating System 3.2.4), MSSQL 9.0, Oracle (Version 11g), UNIX Shell Script

June ‘12 – March ‘14 ETL Lead, Kaiser Permanente, Lake Oswego, OR, USA

This project aims to develop a new interface to build claims data from CDW (the Claim Data Warehouse) to NPS (National Pricing Solutions) that is more timely and auditable and can be leveraged for all regions. The high-level tasks for these goals are:

Extract claims data from CDW and load into staging environment.

Transform claims into NPS claim headers with NPS specific structure and attributes.

Develop a process to integrate claims from existing utilization data sources not yet available in CDW.

Load claims from CDW-NPS Interface into the NPS Analytics parallel environment and compare with the current utilization being submitted to NPS.

Integrate the CDW-NPS claims file with NPS Membership, Benefits and Risk Score data.

Responsibilities:

Understand the requirements and develop code using Ab Initio, Oracle and PL/SQL in a data warehouse project

Lead the team technically and functionally in an offshore-onshore model

Test the code and support system and user acceptance testing

Implement the software and provide post implementation support

Provide inputs in designing the software

Performance tuning

Environment:

Ab Initio (GDE 1.14 Co>Operating System 2.14), Oracle (Version 11g), UNIX Shell Script, PL/SQL

April ‘11 – May ‘12 Senior Consultant, Wells Fargo, Minneapolis, MN, USA

SHARE

The objective of the SHARE project is to provide access to timely and complete internal customer data related to high-risk accounts across all LOBs. This data has the potential to be used by all areas of the credit cycle (acquisition, account management, collections) to help reduce credit losses and proactively prevent booking non-profitable accounts.

Responsibilities:

Developing ETL code using IBM Datastage

Mainly involved in converting the existing DataStage code into Ab Initio

Lead the team technically (team size of 10)

Talk to the business people to understand the requirements and map them to functional and technical designs.

Performance tuning

Environment:

Ab Initio (GDE 3.0.3, Co>Operating System 3.0.2.0), Teradata (Version 13.0), UNIX AIX, Autosys, Clearcase, Harvest, DataStage (Version 8.1)

Dec ‘08 – March ‘11 Technical Architect, American Express, Phoenix, AZ, USA & Sydney, Australia

Global Fee Generation & Client Reporting

American Express Business Travel currently uses different finance systems, Fee Allocator, IBS and CS3, to calculate client fees. Each system maintains its own client billing profile which results in reconciliation issues as the profiles do not consistently align. There is a history of instability and inflexibility with some of these systems causing revenue leaks which significantly impacts the business.

To address these issues, this project will provide:

A single global client profile which reflects the product fees and services fees as laid out in the client contract.

An event driven fee generation engine that will generate a fee according to the client profile for each event.

A global client profile to drive fee generation for all products and services that are not initiated by an event.

A monthly set of both summary and fully itemized client reports detailing the fees generated.

Responsibilities:

Lead the team technically and functionally in an offshore-onshore model

Talk to the business people to understand the requirements and map them to functional and technical designs.

Performance tuning

Lead developer of the back-end fee-calculation job in Ab Initio (continuous-flow mode) and several import/export jobs (batch mode)

Involved in every phase, from requirement gathering through implementation and production support

Environment:

Ab Initio (GDE 1.15.7, Co>Operating System 2.15.1), DB2-udb (version 8.2.9), UNIX AIX, .Net, Visual C#

May ‘07 – Nov ‘08 Technical Lead, British Telecom Plc, Glasgow, UK

TSR Migration

This is a huge migration project aiming to migrate a total of 30 million BT customers from the legacy stack to new-stack systems. The scope of the migration covers BT Retail data (it does not cover Openreach, Global Services or Wholesale) from the legacy CSS system onto the Matrix OSS new stack (Siebel CRM, Geneva billing), and conversion of the accounts to a standard WLR3 product on the EOI Openreach system.

Responsibilities:

Understand the business requirements and provide the best solution design to meet them. The whole program uses an agile model at every stage; the E2E design ends up as several user stories for the components required to deliver the functionality.

Provide the E2E approach for migration: where to find the source data, which systems are impacted by the migration, what the interfaces between the different systems should be, etc.

Provide the required design inputs to the Migration Controller ETL, including the product mapping between the old-stack and new-stack systems

Also lead one component (a part of the whole migration program) that loads data into one of the target systems using Ab Initio, with a team of 10 members based offshore in India

Provide technical help, including performance tuning, on Ab Initio

Also perform ad-hoc data analysis, e.g. how many customers on the old stack have PSTN and Broadband but no Broadband Talk

Environment:

Ab Initio (GDE 1.14, Co>Operating System 2.14), Oracle 10g, UNIX AIX, Siebel CRM

June ‘04 – April ‘07 Technical Lead, Deployment Manager and Development Manager, British Telecom Plc, Glasgow, UK

Single Information Model

The project aims at preparing a Single Information Model (SIM) for British Telecom Retail's customer data, as part of a Master Data Management (MDM) system. The repository includes data for domestic consumers and business customers, fetched from several sources in BT, and comprises the full range of entities: billing accounts, addresses, sites, assets, contacts, consumers, etc. The data loaded onto SIM then feeds two databases, OneSiebel and OneView, served by Siebel GUIs; these contain data for business and domestic customers respectively.

Responsibilities:

Involved in this project from team build-up through production implementation, in the following roles:

June, 2004 – March 2006: Team building, lead designer, lead developer, technical lead

Started this project from scratch offshore and built the team

Provided solution design for different entities like contact, consumer, billing account and address

Developed complex code in Ab Initio, PL/SQL and UNIX shell script for a process called contact matching: records for a contact person are received from different sources, and the code attempts to match them using a set of rules and create a single contact

Led a team of 20 developers

Provided technical help on Ab Initio and PL/SQL
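The contact-matching idea described above — records for the same person arriving from different source systems, grouped by a rule into a single surviving contact — can be illustrated with a toy example. The real implementation was Ab Initio/PL-SQL with a far richer rule set; the single rule, the data and the pipe-delimited format here are all invented for illustration.

```shell
#!/bin/sh
# Toy illustration (invented data): group contact records from two
# source systems by a single rule (case-insensitive e-mail match) and
# emit one surviving contact per matched group.
printf '%s\n' \
    'CSS|John Smith|JSMITH@EXAMPLE.COM' \
    'COSMOSS|J. Smith|jsmith@example.com' \
    'CSS|Mary Jones|mjones@example.com' |
awk -F'|' '{
    key = tolower($3)          # matching rule: same e-mail => same person
    if (!(key in seen)) {      # first record of each matched group survives
        seen[key] = 1
        print $2 "|" key
    }
}'
```

With this sample input the two records sharing an e-mail address collapse into one contact, so two surviving contacts are emitted (John Smith and Mary Jones).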

March, 2006 – January 2007: Deployment Manager

Responsible for the successful, on-schedule implementation of each release of the project in production

Plan all deployment activities in advance, including the back-out plan in case a revert is needed

Handle all sorts of production issues after deployment

Lead the ASG team and the deployment team

February, 2007 – April 2007: Development Assurance

Lead the offshore development team of 30 people across the whole program

Ensure quality of codes delivered by the development team

Help the team from both design and technical point of view

Environment:

Ab Initio (GDE 1.13, Co>Operating System 2.13), Oracle 9i, UNIX AIX, shell scripting, pl/sql, Siebel CRM

July ‘02 – May ‘04 Lead Developer, Designer, Tester, British Telecom Plc, Glasgow, UK

Customer Data Management Centre

The CDMC (Customer Data Management Centre) system provides four main services:

Data Cleanse: CDMC performs data cleansing of customer structures on CSS and COSMOSS on behalf of the Global Joint Venture (Concert), CFB (Customer Focused Billing) and Adder (customer commitment discount) programs. Each of these programs also uses the CDMC cleanse activity to provide a complete identification of the customer for evaluation purposes.

Customer Reference Set: CDMC maps legacy system Customer Identifying Data Instances (currently CSS billing accounts and COSMOSS customers) to the owning Legal Entity. These data maps incorporating the LE and organizational hierarchies from PartyID and C-CAT are known as the reference set.

Data Extracts: CDMC augments feeds of data from the legacy systems with the Customer Reference Set and provides nightly data extracts to various BT systems. The Siebel Adaptor process is an enhanced version of this extract process: it extracts data from the CDMC database, transforms it into Siebel EIM format, loads it into the EIM tables and calls the EIM process to load that data into the Siebel base tables.

Web applications: CDMC provides three Web tools for browsing data held on CDMC: CDMC reports, Showbiz & IV.

Responsibilities:

Started as a tester at the client location: wrote the test specifications, raised bugs found during testing and documented the test results

Then developed several components in Ab Initio, PL/SQL and UNIX shell script, becoming lead developer

Provided designs for different components and then the high-level design onshore (coding done from offshore India)

Environment:

Ab Initio (GDE 1.12, Co>Operating System 2.12), Oracle 9i, UNIX AIX, pl/sql, Java

March ‘01 – June ‘02 Senior Developer, McGraw-Hill, New Jersey, USA & Kolkata, India

Horizon

McGraw-Hill Companies is a global publishing, financial, information and media services company with 16,500 employees located in 400 offices in more than 32 countries. The company provides information via various media platforms through books, magazines and newsletters; online over the internet and electronic networks; via television, satellite and FM side band broadcast; and through software, videotape and CD-ROM products.

The McGraw-Hill Construction Information Group has identified two projects to be executed by TCS. These are:

Development of the “Order Entry and Fulfillment” system, which effectively replaces the existing DCIS application (legacy VAX/VMS, PL/1).

In continuation of the Order Entry and Fulfillment development, McGraw-Hill needs the entire suite of applications in DCIS evaluated and broken into its component pieces.

The project has been customized and deployed on a web server architecture.

Responsibilities:

Developed code using Oracle Forms, Reports and PL/SQL (functions and procedures)

Designed components

IQA testing (Unit and integration)

Prototype Development

Led one module in the project

Environment:

Developer-2000 (Forms 6I, Reports 6I), Oracle PL/SQL, SQL*PLUS, Solaris, Windows NT/2K

Oct ‘00 – Feb ‘01 Senior Developer, Prime Response, Kolkata, India

Prime Response

This project deals with the Unicode conversion of a CRM called (adhn0h@r.postjobfree.com), developed by Prime Response, UK. Their existing software supports European languages such as German, French, Spanish and Italian apart from English. Prime Response approached TCS to formulate a strategy for the internationalization of their CRM/marketing automation suite (adhn0h@r.postjobfree.com). The project architecture involves a front end (in VB), a middle tier (an RPC app server written in C and running on AIX/HP-UX/Solaris/NT) and a database; a Web Decision Engine (WDE), written in Java, provides support for B2B/WAP and SMS. The project's activities include writing conversion routines, making the necessary changes in a SQL parser to handle Unicode data (using Lex/Yacc), changing the front-end code for proper display of data, etc. Special attention has been given to localization issues (support for Japanese data formats, currency and other regional settings).

Responsibilities:

Involved in developing code to convert the existing source code in VB, VC++, C and Java to make it Unicode-enabled according to the strategy

Unit testing

Environment:

Visual Basic, C, Java, Solaris

Aug ‘00 – Sep ‘00 Lead Developer, Newspaper Industry of Dubai, Kolkata, India

Newspaper Circulation System

The Newspaper Circulation System is a web-enabled system for a reputed newspaper company of Dubai, used for maintaining the accounts for newspaper circulation to subscribers, retail sellers and street sellers. The main objective is to account properly for the newspapers distributed and the amounts involved, and to prepare invoices automatically.

Responsibilities:

Lead a team of 5 developers

Lead developer

Database design in SQL Server

Unit Testing

Environment:

Java 1.2, MS SQL Server 6.5

Jun ‘99 – Jul ‘00 Lead Developer, Coates of India Limited, Kolkata, India

Employee Management System

The Employee Management System encompasses all areas related to human resources and payroll. The system captures the complete information pertaining to an employee at the time of joining (personnel data and salary data); thereafter, the complete history of the employee during his/her tenure is recorded by the system and made available online. Salary and income tax are also processed by the system.

Responsibilities:

Code development

Database Maintenance

Unit Testing

Implementation

Environment:

Visual Basic 6.0, MS SQL Server 6.5, Windows NT

May ‘98 – May ‘99 Developer, Coates of India Limited, Kolkata, India

Sales Management System

The Sales Management System encompasses all areas related to sales activity. It collects, maintains and processes different types of data and documents, such as orders, indents, collections, excise and octroi. It generates all the statutory reports, such as debtors outstanding and the excise register, and various MIS reports.

The SMS system comprises the following functions:

Maintenance of Customer and Product Master

Order / Indent Maintenance

Generation of Invoice / Debit Note / Credit Note

Collection Tracking

Profitability Analysis

Responsibilities:

Code development

Database Maintenance

Unit Testing

Implementation

User Training and User Support

Environment:

Visual Basic 5.0, MS SQL Server 6.5, Windows NT

Jun ‘97 – Apr ‘98 Developer, Coates of India Limited, Kolkata, India

Inventory Management System

The Inventory Management System is designed for efficient inventory management with the aid of its interfaces to the PPC and SMS systems. It aims to maintain inventory at an optimum level so as to eliminate unnecessary inventory overheads. The module handles purchases and other related transactions and generates various MIS reports. The system comprises the following functions:

Maintenance of Vendor / Product Information

Indigenous Purchase

Bill Processing

Product Costing

Responsibilities:

Code development

Database Maintenance

Unit Testing

Implementation

User Training and User Support

Environment:

Visual Basic 5.0, MS SQL Server 6.5, Windows NT


