Sougata Khatua

Teradata Developer, Informatica/Ab Initio ETL Developer, Data Analyst

Location: Hyderabad, Telangana, India

Contact: 984******* / 912-***-****

Email: ac3xfx@r.postjobfree.com

Experience Summary

6+ years of professional experience in the IT industry, catering to various clients.

Experienced in data analysis, design, Informatica and Teradata development, performance tuning and upgrades, including requirement gathering, analysis, configuration, migration and deployment of data warehouse projects using fact tables, dimension tables, and star and snowflake schema modeling.

Adequate knowledge of the ETL process, performance tuning, defect analysis and root cause analysis for data, design and code issues, including resolution with a permanent fix.

Experienced in implementing Slowly Changing Dimensions (Type 1 and Type 2); see the Teradata SQL sketch at the end of this summary.

Experienced in basic UNIX and Shell scripting.

Sound knowledge of Teradata architecture and utilities (BTEQ, FastLoad, MultiLoad, FastExport, TPT).

Knowledge of best practices for Informatica PDO (Pushdown Optimization) with Teradata.

Experienced in creating mapping data flows using transformations such as Expression, Sorter, Filter, Joiner, connected and unconnected Lookup, Update Strategy and XML Parser.

Experienced in implementing JSON file reads using the Expression transformation.

Sound knowledge of IBM’s BDW (Banking Data Warehouse) model.

Exposure to data migration projects involving analysis, reverse engineering of the as-is model and building of the to-be model.

Used Control-M automation (CA8) extensively to run, schedule, monitor, debug and test ETL jobs in the development, QA and UAT environments and to obtain performance statistics.

Experienced in using the ITSM (IT Service Management) tool for code migration to higher environments.

Experienced in using JIRA for issue tracking.

Experienced in working in an Agile delivery model.

Effective in working with business stakeholders, Support Managers, Senior Project Managers, Project Managers and Solution Architects.

Currently working on servicing Online & Mobile (OLM) transaction and operation details for the OLM data mart.
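
For reference, a minimal Teradata SQL sketch of the Type 2 Slowly Changing Dimension pattern mentioned above. All table and column names (customer_dim, customer_stg and their columns) are hypothetical placeholders, not taken from any client system: the current row is end-dated when a tracked attribute changes, and a fresh current row is inserted for changed and new keys.

-- Step 1: close out current rows whose tracked attributes changed in the latest feed
UPDATE customer_dim
SET eff_end_dt = CURRENT_DATE - 1,
    curr_ind   = 'N'
WHERE curr_ind = 'Y'
  AND EXISTS (SELECT 1
              FROM customer_stg s
              WHERE s.customer_id = customer_dim.customer_id
                AND (s.cust_name    <> customer_dim.cust_name
                  OR s.cust_segment <> customer_dim.cust_segment));

-- Step 2: insert a new current version; because Step 1 ran first, this picks up
-- both brand-new customers and customers whose current row was just closed
INSERT INTO customer_dim
    (customer_id, cust_name, cust_segment, eff_start_dt, eff_end_dt, curr_ind)
SELECT s.customer_id, s.cust_name, s.cust_segment,
       CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM customer_stg s
LEFT JOIN customer_dim d
       ON d.customer_id = s.customer_id
      AND d.curr_ind    = 'Y'
WHERE d.customer_id IS NULL;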

Educational Qualifications

Title of the Degree with Branch: Master of Science in Computer Science

College/University: VIT University, Vellore, Tamil Nadu, India

Technical Skills

ETL Tool: Informatica 9.1, Ab Initio (Trained)

Schedulers: Control-M

Database Tools: Teradata 13 and 14

Scripting: Shell, Basic Perl

Version Control: CVS and SVN

Migration Tool: ITSM (IT Service Management)

Relevant Project Experience

Project # 1

Project Name: Channel - OLM

Duration: 42 months

Technology: Informatica, Control-M scheduler, Teradata 14, Unix Shell

Project Abstract

Channel – OLM (Online & Mobile)

The mart follows a three-layer architecture: staging, integration and semantic. Files from the Digital (formerly CIG, Chase Internet Group) system of record (SOR) are loaded directly into the load tables of the staging layer. From staging, data moves to the integration layer with transformation logic applied and additional audit fields added. Integration data then moves to the semantic layer with customized transformation logic applied according to business needs. The OLM mart maintains all transactions performed through chase.com or the Chase mobile app.
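
As an illustration of the staging-to-integration movement described above, the sketch below shows the kind of INSERT ... SELECT that could run in a BTEQ step or be generated through Informatica pushdown. Every table, column and literal (stg_olm_txn, int_olm_txn, the batch id, the channel codes) is a hypothetical placeholder, not the actual OLM model.

-- Move one day's staged transactions into the integration layer,
-- applying light transformation and stamping audit columns
INSERT INTO int_olm_txn
    (txn_id, cust_id, channel_cd, txn_amt, src_system_cd, load_batch_id, load_ts)
SELECT
    CAST(stg.txn_id AS BIGINT),
    stg.cust_id,
    CASE WHEN stg.channel = 'MOB' THEN 'MOBILE' ELSE 'ONLINE' END,
    CAST(stg.txn_amt AS DECIMAL(18,2)),
    'CIG',              -- source system code (illustrative)
    1001,               -- batch id would normally come from a parameter file / Control-M
    CURRENT_TIMESTAMP(0)
FROM stg_olm_txn stg
WHERE stg.txn_dt = CURRENT_DATE - 1;   -- hypothetical daily delta filter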

Role: Senior ETL Developer and Data Analyst

Responsibilities:

Analysis and understanding of functional requirement specification documents, the existing business model and customer requirements.

Enhancement and development of Informatica workflows, mappings, and sessions based on ETL Specification

Involved in requirement analysis, effort estimation, impact analysis, and the design, development and deployment of ETL components

Preparation of test plans and test cases, raising issues/challenges, defect tracking, reviewing the status of deliverables and migrating objects from lower to higher environments

Preparation of the Technical Specification Document (TSD)

Production break-fix and end-user defect analysis and resolution

Coordinating with the onsite team to validate the migrated attributes on the ICDW platform.

Performed regression testing, mismatch analysis and defect analysis to find the root causes of issues.

Developing new jobs/job groups in the Control-M scheduler.

Responsible for implementation and deployment using the ITSM process

Responsible for tracking issues in JIRA and resolving them on time.

Reporting project and incident status to the client management group

Involved in discussions of environment challenges and the approach to setting up new environments

Project # 2

Project Name: Integrated Delivery Network’s (IDN) ETL Migration

Duration: 9 months

Technology: Teradata 14, Event Engine, MDF (Metadata Driven Framework), DDC (Data Development Checklist)

Project Abstract

IDN ETL Migration is a migration of an Ab Initio and Sybase based platform to a Teradata based platform. The ETL layer is built with Amex’s proprietary Metadata Driven Framework (MDF) on top of Teradata to achieve the primary goal of “Feed Per Week”.

Files are SFTPed from the mainframe system and loaded into the data landing zone; the data is then transformed and loaded into the mart layer.
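
A minimal sketch of the landing-zone-to-mart step described above, written as Teradata SQL. The table and column names (lz_idn_acct, mart_idn_acct) are assumptions for illustration only; the QUALIFY clause keeps one record per business key before the mart load.

-- Deduplicate the landed feed (latest record per key) and load it into the mart layer
INSERT INTO mart_idn_acct (acct_id, acct_status_cd, balance_amt, feed_dt)
SELECT acct_id, acct_status_cd, balance_amt, feed_dt
FROM lz_idn_acct
QUALIFY ROW_NUMBER() OVER (PARTITION BY acct_id
                           ORDER BY feed_dt DESC) = 1;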

Role: ETL Developer

Responsibilities:

Source File Analysis

Ab Initio Code Analysis

Attribute Mapping

Peer Review

Coding (Stored Procedures in Teradata 14 and BTEQ Scripts in Unix)

Data Analysis and profiling

Performance Tuning

Ensure adherence to quality processes

Involved in requirement analysis, effort estimation, impact analysis, and the design, development and deployment of ETL components

Preparation of test plans and test cases, raising issues/challenges, defect tracking, reviewing the status of deliverables and migrating objects from lower to higher environments

Performed regression testing, mismatch analysis and defect analysis to find the root causes of issues.

Responsible for complete implementation and deployment

Reporting project and incident status to the client management group

Involved in discussions of environment challenges and the approach to setting up new environments

Project # 3

Project Name: Integrated Delivery Network’s (IDN) Data Governance Scorecard

Duration: 6 months

Technology: Teradata 14

Project Abstract

IDN Data Governance Scorecard is a data governance project that builds scorecards for the last month and the last 3, 6 and 12 months, covering data quality, the number of jobs failed in production, Amex financial and vendor data, and platform uptime and downtime.
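
A simplified Teradata SQL sketch of one scorecard metric (failed production jobs) rolled up into the 1-, 3-, 6- and 12-month windows mentioned above; job_run_log and its columns are hypothetical names used only to illustrate the rolling-window aggregation.

SELECT
    SUM(CASE WHEN job_status = 'FAILED'
              AND run_dt >= ADD_MONTHS(CURRENT_DATE, -1)  THEN 1 ELSE 0 END) AS failed_last_1m,
    SUM(CASE WHEN job_status = 'FAILED'
              AND run_dt >= ADD_MONTHS(CURRENT_DATE, -3)  THEN 1 ELSE 0 END) AS failed_last_3m,
    SUM(CASE WHEN job_status = 'FAILED'
              AND run_dt >= ADD_MONTHS(CURRENT_DATE, -6)  THEN 1 ELSE 0 END) AS failed_last_6m,
    SUM(CASE WHEN job_status = 'FAILED'
              AND run_dt >= ADD_MONTHS(CURRENT_DATE, -12) THEN 1 ELSE 0 END) AS failed_last_12m
FROM job_run_log
WHERE run_dt >= ADD_MONTHS(CURRENT_DATE, -12);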

Role: ETL Developer

Responsibilities:

Coding (Stored Procedures in Teradata 14 and BTEQ Scripts in Unix)

Data Analysis and profiling

Performance Tuning

Ensure adherence to quality processes

Involved in requirement analysis, effort estimation, impact analysis, and the design, development and deployment of ETL components

Preparation of test plans and test cases, raising issues/challenges, defect tracking, reviewing the status of deliverables and migrating objects from lower to higher environments

Performed regression testing, mismatch analysis and defect analysis to find the root causes of issues.

Responsible for complete implementation and deployment

Reporting project and incident status to the client management group

Project # 4

Project Name: Clinical Integration Platform (CIP)

Duration: 13 months

Technology: Teradata 13, Informatica 9, Unix Shell Script, Work Load Manager as scheduler

Project Abstract

Clinical Integration Platform is a data migration project whose objective is to migrate data from different regional warehouses to one centralized enterprise data warehouse, “EdWARD” (Enterprise Data Warehouse And Research Depot), in order to maintain a single version of truth.

Data is pulled from the regional legacy data warehouses using the Teradata FastExport utility and written to a load-ready file; all transformations and data cleansing activities are performed while writing the file. From the load-ready file, data is loaded into the data landing zone using the Teradata FastLoad utility, and from there it is loaded into the mart layer through an Informatica ODBC connection.
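
To illustrate the extract-time cleansing described above, the sketch below shows the kind of SELECT that would sit inside the FastExport job so that the output file is load-ready. regional_dw.patient_visit and its columns are illustrative assumptions, not the actual EdWARD or regional models.

-- Extract query embedded in the FastExport script: trimming, defaulting and
-- type conversion are done here so the flat file needs no further cleansing
SELECT
    TRIM(p.patient_id)                                AS patient_id,
    COALESCE(TRIM(p.gender_cd), 'U')                  AS gender_cd,
    CAST(p.birth_dt_txt AS DATE FORMAT 'YYYY-MM-DD')  AS birth_dt,    -- source stores the date as text (assumed)
    CASE WHEN p.discharge_dt < p.admit_dt
         THEN NULL ELSE p.discharge_dt END            AS discharge_dt
FROM regional_dw.patient_visit p
WHERE CAST(p.updt_ts AS DATE) >= CURRENT_DATE - 7;    -- hypothetical weekly delta window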

Role: ETL Developer

Responsibilities:

Coding (queries in Teradata; FastLoad, FastExport and BTEQ scripts in Unix)

Data Analysis

Performance Tuning

Ensure adherence to quality processes

Involved in requirement analysis, effort estimation, impact analysis, and the design, development and deployment of ETL components

Preparation of test plans and test cases, raising issues/challenges, defect tracking, reviewing the status of deliverables and migrating objects from lower to higher environments

Performed regression testing, mismatch analysis and defect analysis to find the root causes of issues.

Responsible for complete implementation and deployment

Reporting project and incident status to the client management group

Involved in discussions of environment challenges and the approach to setting up new environments

AWARD / CERTIFICATION / TRAINING COMPLETED

No.  Name of Institution   Award / Certificate / Training              Year
1    Cognizant Academy     Teradata 13                                 2011
2    Cognizant Academy     Data Warehouse Best Practices               2011
3    Cognizant Academy     Unix Shell and Perl Scripting               2011
4    Cognizant Academy     Informatica 9                               2013
5    Cognizant Academy     Cognizant Certified Teradata Professional   2016
6    Cognizant Academy     Ab Initio (GDE 3.2)                         2017


