
Niraj Kumar Pandey
Sr. Ab Initio Developer / ETL Lead
Wheeling, IL | 469-***-**** | adcvnv@r.postjobfree.com

Professional Summary:

10+ Years of IT experience specializing in Analysis, Design and Development of ETL processes in all phases of the Data Warehousing life cycle.

9+ Years of relevant experience in Data warehousing with strong emphasis on Ab Initio.

Extensive experience in Ab Initio and the Co>Operating System, with EME as the metadata environment and Tivoli (Maestro), AutoSys, CA7, and Control-M as schedulers.

4+ years of experience in Teradata.

Efficient in all phases of the development lifecycle, including data cleansing, data conversion, performance tuning, and system testing.

Thorough Understanding of Ab Initio ACE and BRE.

Thorough understanding of Teradata architecture and Teradata-specific SQL.

Proficient in incorporating various data sources such as Teradata, Oracle, SQL Server, and flat files into the staging area.

Involved in coordinating calls with source systems, clients, and other business partners.

Good experience in maintaining the versioning and deployment instruction documents while moving code to various environments using HPSM.

Extensively worked with the production support team to resolve issues, and extracted data from database tables and flat files.

Performance tuning in Ab Initio using phasing, appropriately placed filters and reformats, joins, and lookups.

Hands-on experience in Teradata, UNIX, shell scripting, SQL Server, data relationships, and data extraction and validation.

Good exposure to development, testing, debugging, implementation, documentation, user training & production support.

Resolved data issues, completed unit testing, and produced complete system documentation for ETL processes.

Experience in creating detailed project outlines and application design specifications.

Involved extensively in GUI, System and Regression Testing.

Proven track record in troubleshooting of Ab Initio jobs and addressing production issues like performance tuning and enhancement.

Excellent knowledge of studying data dependencies using metadata stored in the EME (Enterprise Meta>Environment) repository.

Strong understanding of business processes with excellent writing and documentation skills for the management and development.

Excellent track record as a leader and team player with effective communication skills.

Substantial ability to employ knowledge-based solutions for industry problems.

Working with Agile methodology and participating in all ceremonies.

Received accolades from clients and customers for continuous commitment and dedication in working on complex stories, completing them within short timelines, coming up with strong ideas for solving problems, and encouraging others in the team.

Exposure to Talend DI as part of a PoC.

Good understanding of Cloud Computing, Big Data, AWS, and NoSQL databases.

Technical Skills:

Operating Systems: Windows, DOS, z/OS, Unix, Linux

Languages: C, C++, Unix shell scripting, SQL, Java

Databases: DB2, SQL Server, Oracle, Netezza, Teradata, Snowflake, Hive, MongoDB

Tools: Ab Initio, Talend, Big Data, Hadoop, AWS, UDL, Kibana

Scheduling Tools: CA7, Control-M, AutoSys, Maestro (Tivoli)

Task Tracking Tools: JIRA, Confluence, VersionOne, Rally

Education:

Bachelor of Engineering in Computer Science & Engineering, Visvesvaraya Technological University, 2004 - 2008. Aggregate: 72%.

Projects:

Client: Discover, Riverwoods, USA Aug’19 – Present

Role: Ab Initio Technical Lead

Project Description: FODS Migration

FODS Migration is a data warehousing project to migrate the existing DataStage process to an Ab Initio process and load the data into the cloud. The data is stored in Teradata tables and, via AWS S3 and UDL utilities, loaded into Snowflake. End users use these data for analysis and reporting, to get the transactional details associated with Discover customers.

Responsibilities:

Requirement analysis and design discussions.

High-level and detailed-level design documentation.

Involved in creating detailed data flows with source-to-target mappings and converting data requirements into low-level design templates.

Understanding the existing process and rebuilding it to run as an Ab Initio process.

Validating the files generated by the new and existing processes, comparing the data at record level to understand and validate the differences.

Promoting code from DEV to PA/PROD using the live queue.

Creating datastores and their associated triggers in the Datahub.

Involved in UAT and Prod Parallel run data validation with the Business.

Involved in performing unit testing in DEV and integration testing in the PA environment; prepared the unit test case document, code review document, and ETL design document.

Responsible for daily production job monitoring.

Involved in identification and resolution of issues by coordinating with various project teams.

Leading the team and proposing improved solutions.

Extensively involved in Ab Initio Graph Design, development and Performance tuning.

Involved in building the tables in UDL and building utilities to load AWS S3 files into Snowflake (a minimal load sketch follows this list).

Designing Maestro jobs to schedule the new process.

Working closely with associates to maintain timely deliveries, and interacting with source teams to provide them with requirements.
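
A minimal sketch of the kind of S3-to-Snowflake load utility described above, assuming the SnowSQL CLI and an external S3 stage; the connection, stage, and table names (fods_conn, fods_s3_stage, FODS.TXN_DETAIL) are hypothetical.

    #!/bin/ksh
    # Hypothetical sketch: load one day's S3 extract into Snowflake via SnowSQL.
    # Connection name, stage, and table are illustrative, not the project's actual names.
    RUN_DATE=$1

    snowsql -c fods_conn -q "
      COPY INTO FODS.TXN_DETAIL
      FROM @fods_s3_stage/txn/${RUN_DATE}/
      FILE_FORMAT = (TYPE = CSV FIELD_DELIMITER = '|' SKIP_HEADER = 1)
      ON_ERROR = 'ABORT_STATEMENT';" || exit 1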

Environment: Ab Initio GDE, Oracle, Unix, Maestro, AWS S3, UDL, Kibana, Snowflake.

Client: Travelers, USA Jan’19 – July’19

Role: Ab Initio / Talend Technical Lead

Project Description: Travelers Customer 360

The Customer 360 Program is an enterprise-level initiative to provide a holistic view of the single customer, shared through the Customer 360 UI dashboard, built on AngularJS and accessing data from MongoDB. The prime users of this dashboard are Travelers IRs serving Travelers customers. Being customer-centric, utmost care in planning and development is taken.

Responsibilities:

Requirement analysis and design discussions.

High-level and detailed-level design documentation.

Involved in creating detailed data flows with source-to-target mappings and converting data requirements into low-level design templates.

Understanding business requirements, gathering requirements, and implementing them using Ab Initio/Talend, Teradata, Oracle, SQL Server, MongoDB, and Unix shell scripts.

Gathering knowledge of existing operational sources for future enhancements and performance optimization of graphs.

Involved in performing unit testing in DEV and integration testing in the CAT environment; prepared the unit test case document, code review document, and ETL design document.

Responsible for daily production job monitoring, production support, and maintenance.

Responsible for validating data loads from source to the staging area, and for the transformation logic that transforms data and loads it from staging to the target area.

Involved in identification and resolution of defects by coordinating with various project teams.

Leading the team and proposing improved solutions.

Extensively involved in Ab Initio/Talend graph design, development, and performance tuning.

Migrated scripts from DEV to the CAT and UAT environments to test and validate data for testers.

Designing AutoSys jobs to schedule the jobs based on business requirements.

Working closely with associates to maintain timely deliveries, and interacting with source teams to provide them with requirements.

Environment: Ab Initio GDE, Talend, Oracle, Teradata, Unix, SQL Server, AutoSys, MongoDB

Client: First Data, USA Feb’17 – Dec’18

Role: Ab Initio Technical Lead

Project Description: First Data (Data Vault/IDW/IRAP)

The First Data IDW team supports the Data Warehouse projects. IDW, the Integrated Data Warehouse, is the single largest data warehouse at First Data; it feeds reporting systems as well as extracts for alliances and vendors.

The IDW team supports different merchants and their outlets, keeping details of all customers and providing detailed data to the business as needed. The team shares data with different sources and provides usable information to various vendors.

IRAP (Integrated Reporting and Analysis Platform) provides the level of information and views required to effectively support business needs. Its purpose is to build a shared reporting and analytics environment that provides self-service reporting and analytic capabilities to business users. Supported sources include Salesforce, TeleCheck, Prepaid Open Loop, Prepaid Closed Loop, InfoLease, EMAX, BAMS, CABS, POS, NETMAN, Manual Billing, Omaha Buy Rate, TASQ, and MAPS.

Responsibilities:

Requirement analysis and design discussions.

High-level and detailed-level design documentation.

Involved in creating detailed data flows with source-to-target mappings and converting data requirements into low-level design templates.

Understanding business requirements, gathering requirements, and implementing them using Ab Initio, DB2, Netezza, and Unix shell scripts.

Gathering knowledge of existing operational sources for future enhancements and performance optimization of graphs.

Involved in performing unit testing in DEV and integration testing in the CAT environment; prepared the unit test case document, code review document, and ETL design document.

Responsible for daily production job monitoring, production support, and maintenance.

Responsible for validating data loads from source to the staging area, and for the transformation logic that transforms data and loads it from staging to the target area.

Involved in identification and resolution of defects by coordinating with various project teams.

Leading the team and proposing improved solutions.

Extensively involved in Ab Initio graph design, development, and performance tuning.

Migrated scripts from DEV to the CAT and UAT environments to test and validate data for testers.

Designing CA7 jobs to schedule the jobs based on business requirements.

Working closely with associates to maintain timely deliveries, and interacting with source teams to provide them with requirements.

Environment: Ab Initio GDE, Co>Operating System, Oracle, Netezza, Unix, SQL Server, CA7, DB2

Client: Wells Fargo (MCL), USA Sep’16 – Feb’17

Role: Ab Initio Technical Lead

Project Description:

MCL (Multi Channel Leads) is a centralized offer rules engine and channel router that intelligently matches leads, offers, and sales channels. Different LOBs provide the leads which are to be processed through the MCL and delivered to the specified channel.

Lead lists consist of two files: a lead file and a control file.

These lead lists can be created for –

Processing new leads

Refreshing existing leads in MCL repository

Closing the specified leads

Responsibilities:

Requirement analysis and design discussions.

High-level and detailed-level design documentation.

Involved in creating detailed data flows with source-to-target mappings and converting data requirements into low-level design templates.

Developing graphs, psets, xfrs, and DMLs using Ab Initio according to the DLD and HLD.

Developed highly generic graphs and plans to serve ad hoc requests from the business.

Gathering knowledge of existing operational sources for future enhancements and performance optimization of graphs.

Extensively involved in Ab Initio Graph Design, development and Performance tuning.

Migrated scripts from DEV to the SIT and UAT environments to test and validate data for testers.

Designing AutoSys jobs to schedule the jobs based on business requirements (a monitoring sketch follows this list).

Leading the team and providing better solutions.

Attended advanced Ab Initio training organized by Wells Fargo and delivered by Ab Initio Corp.
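
A minimal sketch of the kind of shell helper used alongside AutoSys scheduling, using the standard autorep and sendevent commands; the job name MCL_LEAD_LOAD is hypothetical.

    #!/bin/ksh
    # Hypothetical sketch: inspect an AutoSys job and force-start it on demand.
    JOB=${1:-MCL_LEAD_LOAD}   # job name is illustrative

    # Show current status (last start/end, state) for the job.
    autorep -J "$JOB"

    # Force-start the job if the caller passes "restart" as the second argument.
    if [ "$2" = "restart" ]; then
        sendevent -E FORCE_STARTJOB -J "$JOB"
    fi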

Environment: Ab Initio GDE, Co>Operating System, Oracle, Teradata, Unix, SQL Server, AutoSys.

Client: Capital One, Plano TX Jan’13 – Jun’16

Role: Sr. Developer / ETL Lead

Project Description:

To populate transaction data daily from the auto finance department into the data warehouse, loading data from different source servers with proper business meaning to provide better service to end customers for analytical, operational, and legal purposes. Maintaining the existing process and deriving new processes to add new functionality for the business.

Provides various presentation layers that surface existing loan, dealer, offer, activity, event, and mortgage details to the front desk in a single window for each individual, helping staff resolve issues and enabling the respective managers to make decisions that improve the business. Data from different third-party vendors gives a good view of business status.

As part of the Auto Data Warehousing project, resolving incoming data issues and providing better resolutions accordingly, giving business managers a better understanding of auto loan status. A single customer view leads to better customer relationships.

Responsibilities:

Requirement analysis and design discussions.

High-level and detailed-level design documentation.

Involved in creating detailed data flows with source-to-target mappings and converting data requirements into low-level design templates.

Developing scripts, graphs, psets, xfrs, and DMLs using Ab Initio according to the DLD and HLD.

Developed highly generic graphs and plans to serve ad hoc requests from the business.

Developed reconciliation scripts to mail job status to the business (a minimal sketch follows this list).

Gathering knowledge of existing operational sources for future enhancements and performance optimization of graphs.

Worked with Dataset, Lookup, Merge, Join, Filter, Transform, Partition, Rollup, and Database components, among others.

Extensively involved in Ab Initio graph design, development, and performance tuning.

Sat with the business to discuss various requirements and provided appropriate solutions.

Migrated scripts from DEV to the SIT and UAT environments to test and validate data for testers.

Designing Control-M jobs to schedule the jobs based on business requirements.

Worked with the infrastructure team to write custom shell scripts serving daily needs such as collecting logs and cleaning up data.

Used ACE and BRE on some stories to meet business requirements.

Developed many reusable Unix scripts and Ab Initio graphs to reduce development effort.

Documentation for Production Implementation and Design review with L3/IPS team.

Led the offshore team from onsite.

Involved in knowledge transfer of the project to new team members and new hires.
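
A minimal sketch of such a reconciliation script, assuming count files produced by the load jobs and mailx for delivery; the paths and distribution list are hypothetical.

    #!/bin/ksh
    # Hypothetical sketch: compare source/target counts and mail the result.
    # File paths and the distribution list are illustrative.
    SRC_CNT=$(cat /data/recon/src_count.dat)
    TGT_CNT=$(cat /data/recon/tgt_count.dat)

    if [ "$SRC_CNT" -eq "$TGT_CNT" ]; then
        STATUS="SUCCESS: counts match ($SRC_CNT)"
    else
        STATUS="MISMATCH: source=$SRC_CNT target=$TGT_CNT"
    fi

    echo "$STATUS" | mailx -s "Daily load reconciliation - $(date +%Y-%m-%d)" business-team@example.com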

Environment: Ab Initio GDE, Co>Operating System, Oracle, Teradata, Unix, SQL Server, Control-M Scheduler.

Client: KeyBank, USA Oct’11 – Sept’12

Role: Sr. Ab Initio Developer

Project Description:

The objective of the project is to create a set of tools able to provide meaningful, depersonalized test data to application test environments. It utilizes the Ab Initio Co>Operating System and is configured to disguise, at minimum, the following types of input/output: mainframe GDG/VSAM, Unix files, Teradata, and SQL Server databases.

Responsibilities:

Generated graphs and created psets, xfrs, and DMLs (conditional and unconditional) to handle fixed-length and variable-length mainframe (MVS and VSAM) files based on the copybook provided in the data gathering sheet.

Generated graphs and created psets, xfrs, and masking function sheets to perform data disguisement.

Created new graphs and modified existing ones based on requirements.

Responsible for creating generic psets and wrapper scripts.

Worked with Dataset, Lookup, Merge, Join, Filter, Transform, Partition, Rollup, and Database components, among others.

Responsible for creating and purging the flat and MFS files developed in the process.

Created the wrapper to execute the masking function sheet and apply the masking functions, and the wrapper to call psets to execute the graphs (a minimal wrapper sketch follows this list).

Prepared Unit test cases and UAT test cases.

Involved in the creation of mapping documents.

Created TWS scripts on the mainframe to run the jobs.

Designed the implementation plan for SIT & UIT.

Interacted with customers, user groups, corporate testing groups, and business people.
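
A minimal sketch of such a pset wrapper, assuming the standard air sandbox run command for executing psets; the sandbox path, pset name, and RUN_DATE parameter are hypothetical.

    #!/bin/ksh
    # Hypothetical sketch: wrapper that runs a graph through its pset.
    # Sandbox path, pset name, and parameter are illustrative.
    SANDBOX=/apps/tdm/sandboxes/masking
    PSET=$SANDBOX/pset/mask_customer.pset
    RUN_DATE=$1

    cd "$SANDBOX" || exit 1

    # Execute the graph via its pset, overriding a run-date parameter.
    air sandbox run "$PSET" -RUN_DATE "$RUN_DATE"
    RC=$?

    if [ $RC -ne 0 ]; then
        echo "mask_customer failed with return code $RC" >&2
        exit $RC
    fi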

Environment: Ab Initio, UNIX, WinSCP, Mainframe, Teradata, Tivoli Workload Scheduler

Client: Vodafone UK (Customer Centricity) Jan’11 – Oct’11

Role: Ab Initio Consultant

Project Description:

The Vodafone Business Intelligence Customer Centricity Project is a multi-phase delivery project to enhance the capability of the Vodafone UK Enterprise Data Warehouse (EDW) in order to provide the business with improved insight and reporting on Vodafone’s customers. It also aims to reduce Opex costs (estimated at £400K a year) by consolidating the current set of suppliers delivering data cleansing and matching.

The key objectives of the project are:

Enable the business to have improved Customer insight using an agnostic approach to modeling the customer within the EDW.

Replace the current supplier (Acxiom) with Experian as the new supplier for name and address cleansing of Consumer data

Replace current supplier (InfoArts) with pH Group (working closely together with Experian) for name and address cleansing of Enterprise data

To integrate cleansed customer data with downstream interfaces.

Consolidate separate Customer Data models (ADS, Customer Key SCM EDW) into an agnostic scalable data model that integrates with the Core EDW Analysis.

Responsibilities:

• Involved in developing a fully metadata-driven architecture.

• Involved in creating detailed data flows with source-to-target mappings and converting data requirements into low-level design templates.

• Developed highly generic graphs to serve ad hoc requests from the business.

• Worked in a highly parallelized (MPP) Ab Initio environment to process 1+ terabytes of data daily.

• Developed complicated graphs using various Ab Initio components such as Join, Rollup, Lookup, Partition by Key, Partition by Round-robin, Gather, Merge, Dedup Sorted, Scan, and Validate.

• Worked with the infrastructure team to write custom shell scripts serving daily needs such as collecting logs and cleaning up data (a housekeeping sketch follows this list).

• Migrated scripts from DEV to the SIT and UAT environments to test and validate data.

• Deployed and test-ran graphs as executable Korn shell scripts on the application system.

• Worked with the project team on optimizing and tuning SQL statements, and used phases/checkpoints to avoid deadlocks and improve efficiency.

• Gathered knowledge of existing operational sources for future enhancements and performance optimization of graphs.

• Extensively involved in Ab Initio graph design, development, and performance tuning.

• Worked with crontabs to schedule the reports.

• Involved in writing processes to continuously capture data from different servers across the country.

• Used UNIX environment variables in all the Ab Initio graphs to specify the locations of the files used as sources and targets.

• Used sandbox parameters to check graphs in and out of the repository system.
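
A minimal sketch of the kind of housekeeping script referenced above, scheduled via cron; the directories and the 7-day retention window are hypothetical.

    #!/bin/ksh
    # Hypothetical sketch: nightly housekeeping - compress logs, purge old data.
    # Paths and the 7-day retention window are illustrative.
    LOG_DIR=/apps/edw/logs
    WORK_DIR=/apps/edw/work

    # Compress logs older than a day.
    find "$LOG_DIR" -name '*.log' -mtime +0 -exec gzip -f {} +

    # Remove compressed logs and work files older than 7 days.
    find "$LOG_DIR" -name '*.log.gz' -mtime +7 -exec rm -f {} +
    find "$WORK_DIR" -type f -mtime +7 -exec rm -f {} +

A script like this would be scheduled with a crontab entry such as 0 2 * * * /apps/edw/bin/housekeeping.ksh (path illustrative), running at 02:00 daily.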

Environment: Ab Initio, Teradata (BTEQ), Unix, Tivoli Workload Scheduler

Client: Vodafone UK (Accord Chordiant) Feb’10 – Jan’11

Role: Ab Initio Developer

Project Description:

The main objective of Vodafone UK is to offer deals (special tariffs) to its customers based on their usage. The basic process extracts a driving list of MSISDNs and Registrations from the BAG databases and then uses that list to drive extracts from other areas of the EDW. The data is persisted into intermediate tables with one-for-one base views over them, and is exposed to the Chordiant transfer process via another set of views that permit derived data to be included from within the tables.

BAG (Business Aggregates) was productionised prior to the implementation of this project. BAG is an existing database holding details of customers’ usage summaries, and its tables were the source for Chordiant. Aggregates are applied to the BAG tables to obtain each customer’s usage summary for voice, data, internet, roaming, etc. over the last 3, 6, and 12 months; based on a customer’s usage and revenue generation over a period of time, Vodafone UK would offer a special tariff.

The initial base extract ensures that both Registration and MSISDN are unique, so either can therefore be used to identify a single row. Strictly speaking, the data is presented at Registration level, but Registration and MSISDN are synonymous at a given point in time, which in this case is the extract run date. Chordiant phase 2 was launched to produce this aggregated usage summary.
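
A minimal BTEQ sketch, run from a shell wrapper, of the kind of 3-month rollup described above; the TDPID, logon, and all database, table, and column names (BAG_DB.USAGE_SUMMARY, CHORDIANT_STG.USAGE_3M) are hypothetical.

    #!/bin/ksh
    # Hypothetical sketch: build a 3-month usage rollup with BTEQ.
    # TDPID, logon, and all object/column names are illustrative.
    bteq <<EOF
    .LOGON tdprod/etl_user,password;

    INSERT INTO CHORDIANT_STG.USAGE_3M (registration, msisdn, voice_mins, data_mb, roaming_calls)
    SELECT registration,
           msisdn,
           SUM(voice_mins),
           SUM(data_mb),
           SUM(roaming_calls)
    FROM   BAG_DB.USAGE_SUMMARY
    WHERE  usage_month >= ADD_MONTHS(CURRENT_DATE, -3)
    GROUP BY registration, msisdn;

    .IF ERRORCODE <> 0 THEN .QUIT 8;
    .QUIT 0;
    EOF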

Responsibilities:

• Involved in developing a fully metadata-driven architecture.

• Involved in creating detailed data flows with source-to-target mappings and converting data requirements into low-level design templates.

• Developed highly generic graphs to serve ad hoc requests from the business.

• Worked in a highly parallelized (MPP) Ab Initio environment to process 1+ terabytes of data daily.

• Developed complicated graphs using various Ab Initio components such as Join, Rollup, Lookup, Partition by Key, Partition by Round-robin, Gather, Merge, Dedup Sorted, Scan, and Validate.

• Worked with the infrastructure team to write custom shell scripts serving daily needs such as collecting logs and cleaning up data.

• Migrated scripts from DEV to the SIT and UAT environments to test and validate data.

• Deployed and test-ran graphs as executable Korn shell scripts on the application system.

• Worked with the project team on optimizing and tuning SQL statements, and used phases/checkpoints to avoid deadlocks and improve efficiency.

• Gathered knowledge of existing operational sources for future enhancements and performance optimization of graphs.

• Extensively involved in Ab Initio graph design, development, and performance tuning.

• Worked with crontabs to schedule the reports.

• Involved in writing processes to continuously capture data from different servers across the country.

• Used UNIX environment variables in all the Ab Initio graphs to specify the locations of the files used as sources and targets.

• Used sandbox parameters to check graphs in and out of the repository system.

Environment: Ab Initio, Teradata (BTEQ), Unix, Tivoli Workload Scheduler


