Resume


Data ETL

Location:
Atlanta, GA
Posted:
April 28, 2020

Contact this candidate


Moumita Dutta
AWS Certification badge: https://www.certmetrics.com/amazon/public/badge.aspx?i=1&t=c&d=2019-11-23&ci=AWS01021701

Email : adczvu@r.postjobfree.com

Ph: 424-***-****

Professional Summary

ETL and Data Warehouse Technical Lead with more than 13 years of experience in Data Warehouse and Business Intelligence platforms for multiple clients, utilizing Waterfall, Agile/Scrum, and Lean SDLC methodologies.

Highlights include:

Extensive experience in requirement analysis, design, development, and end-to-end implementation of data warehousing projects utilizing Ab Initio, Apache Hadoop, and SAP BODS.

Extensive hands-on experience integrating Ab Initio with Hadoop to build a data lake for SunTrust Bank Inc.

Good experience delivering software services in the Banking and Capital Markets domain. Conceptual knowledge and practical experience in end-to-end data warehousing application development using Ab Initio, including ETL and reporting logical design implemented through Ab Initio to support analytical data.

Extensive experience leading teams and handling projects in an onsite-offshore global delivery model.

Designed and developed reusable parameterized graphs and components used across multiple projects, reducing overall development and testing effort and increasing productivity.

Good experience in Parameter Definition Language (PDL).

Good knowledge in AbInitio Data Profiler.

Extensive experience in writing SQL queries to implement data warehouse projects.

Strong experience in Unix shell scripting. Developed various reusable scripts for project purposes.
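
As an illustration of the kind of reusable shell helper described above, here is a minimal, hypothetical sketch (the function name, paths, and naming convention are examples, not from any client environment):

```shell
#!/bin/sh
# Hypothetical sketch of a reusable archival helper: moves a processed
# file into a dated archive directory so batch scripts can share one
# tested routine instead of duplicating mv/mkdir logic.

archive_file() {
    src="$1"                                 # file to archive
    archive_dir="$2/$(date +%Y%m%d)"         # dated subdirectory
    mkdir -p "$archive_dir"
    mv "$src" "$archive_dir/" && echo "archived: $src -> $archive_dir"
}

# Example usage with a throwaway file:
tmp=$(mktemp)
archive_file "$tmp" /tmp/etl_archive
```

In practice such helpers would live in a shared library script sourced by each batch job.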

Hands-on experience migrating data from legacy systems to a newly built data warehouse.

Strong experience managing project support issues.

Sound domain knowledge for Consumer Banking and Private Wealth Management.

Strong experience in project estimation.

Played the role of Ab Initio training anchor; participated as an active member of defect prevention and code review activities at the program level.

Technical Skills

ETL Tool: AbInitio ETL 3.3.4.2, Enterprise Meta>Environment (EME), Graphical Development Environment (GDE), Conduct>It, Metadata-Hub, Informatica PowerCenter 9.1

Data Modeling Tools: MySQL Workbench, SAP PowerDesigner

Operating System: Windows NT/XP, Windows 2000 Server, Unix

Languages: C, SQL, Unix Shell scripting

Databases: MS SQL Server 2000, Oracle, Teradata, DB2 PDOA, Netezza

Software Tools: Visual SourceSafe, Putty, MS Visio, JIRA

Technologies: Data Warehousing, Apache Hadoop, Hadoop Distributed File System (HDFS), Apache Hive

Domain: Banking and Financial

Applications: ControlM, CA7, FSecure, HP Service Manager, IBM PureData System for Operational Analytics (IBM PDOA), IBM SmartCloud Control Desk, Version 7.5.1

Experience

Client: SunTrust Bank Apr 2019 – Present

Role: ETL Technology Lead, L3 Support

Project: ATLAS Data Lake, Enterprise Data Office

Environment: AbInitio 3.3.4.2, Apache Hadoop, HDFS, SQL, ORACLE, DB2 PDOA, SQL Server, Unix, Shell Scripting, Ca7 scheduler

Project Description:

The project supports 280+ Business Intelligence data warehousing applications across all lines of business in the bank, covering over 22,000 batch jobs using ETL (Extract, Transform, and Load) Ab Initio code on the Data Lake environment, Cognos, and QlikView. These applications help business analysts and users with data analysis and forecasting of bad loans, and also provide the credit risk management process, credit decision making, and related infrastructure needed to identify, assess, and monitor risks to achieve superior performance and mitigate losses.

Responsibilities:

Primary responsibilities include providing technical solutions through redevelopment and code fixes for production incidents using Ab Initio, big data Hadoop technologies, Unix, IBM DB2, CA7, and Hive.

Proactively work to ensure that daily, weekly, monthly, and yearly applications under the Marketing and Wholesale lines of business complete on time with correct data. Acknowledge issues reported by users and work end to end to provide solutions with business justification.
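
A batch-completion check of the sort described above might look like the following hedged sketch; the trigger-file convention, function name, and path are hypothetical:

```shell
#!/bin/sh
# Hypothetical sketch of a batch-completion check: many batch setups drop
# a "done" trigger file when a load finishes, so a monitoring script can
# poll for it and alert when it is missing.

check_done_file() {
    done_file="$1"
    if [ -f "$done_file" ]; then
        echo "OK: $done_file present"
    else
        echo "ALERT: $done_file missing" >&2
        return 1
    fi
}

touch /tmp/daily_load.done        # simulate a batch that has completed
check_done_file /tmp/daily_load.done
```

A real check would also validate row counts or control totals, not just file presence.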

Working on reported incidents within the SLA and based on priority. Identifying the issue and engaging different teams if needed based on requirement. Making sure all the teams of the impacted areas are informed and updated for every incident with proper resolution comments.

Managing Problems in case any incident happens frequently which is impacting the business. Providing ideas and technical solutions to resolve the issues and engaging members from different teams to take care of their part in order to place the fix while adhering to process. Providing short term solutions to business in case the actual solution is time consuming and (or) expensive to client.

Reporting to client or internal management on the continuous improvement activities and conducting audits to ensure adherence to quality metrics. Coordinating and conducting periodic reviews with stakeholders to track overall schedule adherence.

Set up implementation mechanisms in order to ensure that defined process is followed by the project team. Coordinate with track owners, seek inputs, compile and construct scope document including hard and soft commitments.

Secure sign off from all the relevant stakeholders. Establish downstream and upstream traceability. Ensuring effective risk management by logging, characterizing qualitative and quantitative risks and proposing mitigation plans.

Managing and supervising work of onsite and offsite teams and providing feedback on deliverables.

Anticipate customer needs and provide inputs for solution. Contributing to the growth and vitality of the organization by submitting relevant project collaterals and thereby build the Intellectual Property of the organization.

Performing business planning and analysis and providing relevant inputs to leadership teams.

Analyze and constantly improve systems, processes and nature of service being provided to ensure customer delight.

Raise unresolved issues with the client, understand stakeholder views, and resolve the issues.

Client: SunTrust Bank Jan 2017 – Mar 2019

Role: ETL Technical Lead

Project: ATLAS Data Lake, Enterprise Data Office

Environment: AbInitio 3.3, SQL, ORACLE, DB2, SQL Server, Unix, Shell Scripting, Ca7 scheduler

Project Description:

The Enterprise Data Office mission is to reduce risk, reduce cost, and enable growth by providing the most accurate data in a timely manner to meet increased regulatory demands, cost efficiency, and speed to market. To accomplish this, a robust, multi-pronged program, the Atlas Data Lake, has been launched: a Big Data solution that allows creation of an analytical environment to support the various analytical communities. The data lake is a storage repository that holds a vast amount of raw data in its native format.

Management Information Group (MIG) Reporting – This application will retire the legacy SQL Server based reporting application and bring a robust, automated process by providing integrated and cleansed data from the data lake, adhering to the data governance processes by ensuring the data is accurate and pulled from the definitive data source.

Digital Client Experiences – By implementing this application, SunTrust will be able to offer clients immediate (overnight) graduation to a higher segment based on their wealth and investments. This segmentation process will drive multiple client services, e.g., credit card reward points, mobile deposit limits, real-time offers, financial planning offerings, and a differentiated client experience.

Responsibilities:

Working as an ETL Technical Team Lead to deliver end-to-end implementation for the program, including requirement analysis, high-level and detailed design, development, unit testing, reviews, user interaction, and production implementation.

Design, implement, and deploy high-performance custom applications at scale on Hadoop and Ab Initio ETL (Extract, Transform, and Load), following best practices to handle large data volumes by distributing data through Ab Initio's parallelism facilities and the Hadoop Distributed File System (HDFS).

Coordination between the clients and the other project stakeholders for any requirements changes, clarifications and approvals.

Involved in creating the logical and physical designs describing how the ETL framework would be set up to meet project requirements while adhering to the SunTrust environment architecture.

Designing generic processes for data extraction, data quality checks, and related transformations.

Performance tuning and review of all deliverables from the Dev environment to QA. Also responsible for delivering defect-free, quality code and project documentation.

Teaming with test leads to understand the testing process and ensuring the capture of all test scenarios by using Traceability Matrix.

Responsible for production implementation and warranty support activities.

Client: Intercontinental Exchange Oct 2015 – Dec 2016

Role: Data Warehouse ETL Lead

Project: ETL Project to accommodate continuous updates in Trading and Clearing system

Environment: SAP BODS, AbInitio 3.1, SQL, ORACLE, Netezza, Unix, Shell Scripting, Tidal Scheduler

Responsibilities:

Working as a Business Analyst and ETL developer to deliver end-to-end implementation for the program, including requirement gathering and analysis, high-level design, development, reviews, user interaction, and production implementation.

Analyzing Business, Functional and Technical requirements for ETL process and creating System Requirement Specification to map with business requirements.

Collaborate with data architects for data model management and Conduct data model reviews with project team members.

Capture technical metadata through data modeling tools.

Enforce standards and best practices around data modeling efforts.

Ensure data warehouse and data mart designs efficiently support BI and end users.

Design Ab Initio and BODS ETL jobs and write complex SQL queries to load data into Netezza from the Oracle source system and then apply transformations on the same.

Performance tuning and reviews of all deliverables from Dev environment to QA.

Teaming with test leads to understand the testing process and ensuring the capture of all test scenarios by using Traceability Matrix.

Teaming with reporting team to understand their requirements and expectation on ETL’s final product and work accordingly

Conduct Production Implementation and Warranty support activities

Client: SunTrust Bank Inc. (May 2015 – Oct 2015)

Role: ETL Tech Lead and Business Analyst

Project: DMO-Data Management Operations

The DMO project deals with building and managing a metadata repository that will provide a single-source, 360-degree view of the organization's metadata.

Responsibilities:

Focus on data quality and metadata to attain high quality data assets.

Improve operational efficiency across Suntrust Bank by improving strategic planning and decision making.

Building the framework using MHUB, data quality, and reusable components for profiling and building data lineage.

Import of business metadata such as business terms and business rules; linking terms to physical data elements; establishing lineage; and importing physical and logical data models, metadata from the meta environment, and business relational metadata into the metadata portal to better understand the business framework.

Environment: Ab Initio 3.1, MHUB, DB2, MSSQL, Unix, Shell Scripting, CA7 Scheduler

Client: SunTrust Bank Inc. (Jul 2014 – May 2015)

Role: ETL Tech Lead and Business Analyst

Environment: Ab Initio 3.1, DB2, MSSQL, Unix, Shell Scripting, Salesforce.com, Mainframe CA7 Scheduler

Project: Delta Debit Re-launch

As part of the Delta Debit Re-launch there is a requirement to segregate clients into different tiers based on a predefined set of conditions. Debit and credit information will be pulled from DIME and the mainframes, and the ETL transform process will create five client tiers, ranging from 0 to 4. Debit and credit clients will be considered in this process to get assigned a tier value.

In this Project the bank wants to build a process to collect the individual consumer credit and debit related data and segregate them into multiple tiers. This will enable the bank to analyze rewards and discount processing on a tier level instead of at individual consumer level. The Delta Debit Re-launch Solution shall support providing relationship-based fee discounts for Delta Debit cardholders and tier identifiers for Consumer Credit cardholders.

Responsibilities:

Working as a Business Analyst and ETL Lead to deliver end-to-end implementation for the program, including requirement gathering and analysis, high-level design, development, reviews, user interaction, and production implementation.

Created entity relationship diagrams and multidimensional data models and diagrams (star schemas)

Developed logical data models and physical data models using PowerDesigner.

Generated SQL scripts and assisted DBAs in the development of the physical database.

Analyzing Business, Functional and Technical requirements for ETL process and creating System Requirement Specification to map with business requirements.

Design Ab Initio Graphs and plans for data transform and loads on UNIX platform.

Performance tuning and reviews of all deliverables from Dev environment to QA.

Teaming with test leads to understand the testing process and ensuring the capture of all test scenarios by using Traceability Matrix.

Conduct Production Implementation and Warranty support activities

Client: SunTrust Bank Inc. (Aug 2013 – Jun 2014)

Role: ETL Tech Lead and Business Analyst

Project: Credit Bureau Provisioning

Environment: Ab Initio 3.1, DB2, MSSQL, Unix, Shell Scripting, Salesforce.com, Mainframe CA7 Scheduler

Credit dispute processing is centralized within the Mortgage segment for both Consumer and Mortgage, but the existing technology solution for managing those disputes requires significant manual processing to keep data synchronized between the dispute tracking tool, the credit reporting agencies/e-OSCAR, and the various systems of record (MSP, ALS, FDR, RMS, etc.). The level of manual effort involved in processing disputes increases the risk of repetitive incorrect reporting even after research of a dispute is completed.

This project deals with end-to-end implementation of the Credit Disputes Data Mart and automation of dispute processing for all direct and indirect disputes.

Responsibilities:

Working as an ETL Lead and Business Analyst in delivering end to end implementation to the program including Requirement gathering and analysis, High level Design, development, Reviews, User interaction and Production implementation.

Analyzing Business, Functional and Technical requirements for ETL process and creating System Requirement Specification to map with business requirements.

Teaming with the data modeler and external vendors such as e-OSCAR and Innovis to perform analysis on the data warehouse schema, and creating high-level and detailed designs to implement an end-to-end data mart for Credit Disputes.

Design Ab Initio Graphs and plans for data transform and loads on UNIX platform.

Performance tuning and reviews of all deliverables from Dev environment to QA.

Teaming with test leads to understand the testing process and ensuring the capture of all test scenarios by using Traceability Matrix.

Conduct Production Implementation and Warranty support activities

Client: SunTrust Bank Inc. (Jun 2012 – Jul 2013)

Role: ETL Tech Lead

Project: Complaint Management Initiative

Environment: Ab Initio 3.1, DB2, MSSQL, Unix, Shell Scripting, Salesforce.com, Mainframe CA7 Scheduler

The CMI project builds a data mart to provide an effective and efficient means for complaint data analytics and reporting. The solution addresses the core operational bottlenecks faced by the end-user-facing team in gathering the required complaint resolution data, which is spread across federated sources, in a timely and effective manner. The CMI data mart extracts client problem data from the Salesforce cloud system and related data from other SunTrust in-house sources, integrates them, and loads them into a data mart that can be leveraged to build the reports and dashboards needed for data analytics and presentation to the operational team, as well as to assist problem management across SunTrust. The solution provides a data-driven platform to measure client satisfaction and the efficiency of the team handling customer resolution.

Responsibilities:

Analyze the business requirements and Data Model to create end to end mapping sheet of new requirement.

Teaming with data modeler, perform analysis on the data warehouse schema and Creating high level and detail level design to implement end to end data mart for complaint management.

Design and develop Ab Initio Graphs and Plans for data transform and loads on UNIX platform.

Design and develop Ab Initio common components for parallel processing & error handling

Direct Client interaction for design review and for any changes in requirement.

Efficient task allocation within the team, code review activities, and defect prevention (DP) activities.

Maintaining various trackers for analysis of production defects and providing recommendations to avoid the same defects in the future.

Conducting DP activities in team to enhance team knowledge and prevent future defects.

Client: Capital One Financial Corporation (Feb 2010 – Jun 2012)

Role: ETL Technology analyst

Project: Mortgage Data Warehouse (MDW) and BAU

Environment: Abinitio 2.15, Teradata, Unix, Shell Scripting, ControlM Scheduler, HP Service Manager

The Project deals with-

Delivering business-critical functional subject-area data marts (Collateral, REO, Foreclosure, Collections & Recoveries, Applications, and Originations).

Transforming MoRe (the existing DW) into the Mortgage Data Warehouse.

Integrating key external data sources.

Development of Mortgage Universal Dataset (MUDS) data mart to develop a business data mart that satisfies at least 80% of the non-operational reporting and analytics data needs of associates in Home Loans.

Responsibilities:

Requirement and Business Process analysis

High level and detail level design of complex ETL strategies

Development and enhancement of different subject areas such as Collateral, REO, and Foreclosure in the MDW data warehouse

Building Conduct>It plans and reusable components for extraction, archival, and handling complex CDC logic
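
File-based CDC of the kind mentioned above can be sketched with standard Unix tools; the data and file names below are purely illustrative (real CDC in Ab Initio would use graph components, not `comm`):

```shell
#!/bin/sh
# Illustrative change-data-capture (CDC) comparison between two extracts,
# assuming both files are sorted on the key. comm(1) splits the records
# into "only in previous", "only in current", and "common".

printf 'id1|a\nid2|b\nid3|c\n' > /tmp/prev.dat   # yesterday's extract
printf 'id2|b\nid3|x\nid4|d\n' > /tmp/curr.dat   # today's extract

# Records only in the current extract: inserts plus new versions of updates.
comm -13 /tmp/prev.dat /tmp/curr.dat > /tmp/delta_new.dat
# Records only in the previous extract: deletes plus old versions of updates.
comm -23 /tmp/prev.dat /tmp/curr.dat > /tmp/delta_gone.dat

cat /tmp/delta_new.dat   # id3|x and id4|d
```

Classifying a key that appears in both delta files as an update (rather than a delete plus insert) would be the next step in a full CDC routine.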

Development and Enhancement of MoRe Data warehouse

Provide L3 Support to PMORE and MDW ETL Batch jobs

Data Analysis and Defect/data fixes for production defects

Client: BNP Paribas (Jul 2008 – Jan 2010)

Role: Senior Software Engineer

Project: Bogart Migration

The client is a European banking group. The project deals with:

Successful migration of all archived universes and related reports from Business Objects 5.1.8 to Business Objects XI R2

Converting all Business Objects full-client documents in 5.1.8 to Web Intelligence documents in XI R2

Testing and impact analysis of the migration from Business Objects version 5.1.8 to BO XI R2

Understanding the various risks and complexities involved in the migration, enabling a standard migration process and an effective timeline for all future migrations

To provide a cost effective and comprehensive onsite offshore solution

Responsibilities:

Importing reports from Paris through a BIAR file using the Import Wizard and saving them in a local folder.

Converting reports from BO 6.5 to WebI (XI R2) using report conversion tools.

Testing the converted reports using Test Director and redeveloping reports that could not be converted.

Exporting the converted and tested WebI reports to Paris using a BIAR file.

Discussion of issues with onsite and offshore colleagues

Project Documentation.

Client: Bunge (Dec 2006 – Jun 2008)

Role: Software Engineer

Project: Bunge Trade Statistical Center

The client is an integrated global agribusiness and food company operating in the farm-to-consumer food chain. It is the largest soybean processor in the USA and one of the world's largest exporters of soy products. The project deals with creation of a central repository from a collection of large unstructured data. This repository is made available to Bunge trading analysts for analysis and decision-making.

Responsibilities:

Design universes per new requirements and export them to the central repository

Design universes and Reports based on existing applications present

Resolving technical and design issues

Performing independent unit testing

Environment: Business Objects 6.5, Oracle 9i

Education

Bachelor of Technology in Computer Science and Engineering, 2006, DGPA 7.81

West Bengal University of Technology, India

Higher Secondary Education (12th Grade), 2002 (77%)

West Bengal Council of Higher Secondary Education, India

Secondary Education (10th Grade), 2000 (91%)

West Bengal Board of Secondary Education, India


