
IDMC / ETL / Informatica / BI / Data Engineering

Location:
New York City, NY
Salary:
180000
Posted:
February 07, 2025


Resume:

PRASOON NASA

***********@*****.***

203-***-****

Tech Lead, Santander Bank

Data Engineering & Warehousing Expert

Experience: 20+ years

Core Expertise:

Data Integration: Extensive experience in Informatica PowerCenter, Informatica IDMC, and Snowflake, ensuring seamless ETL workflows.

Database Technologies: Proficient in Snowflake, Oracle, SQL Server, and performance optimization for large-scale data warehouses.

Scripting & Automation: Skilled in Shell Scripting on Unix/Linux platforms, enabling efficient batch processing and system automation.

Scheduling Tools: Expertise in Control-M and Autosys for job scheduling and workflow orchestration.

Data Warehousing: Comprehensive knowledge of data modeling, data pipeline development, and enterprise data architecture.

Cloud Platforms: Hands-on experience with modern cloud-based data platforms and integrations, leveraging Snowflake for scalable, cloud-native solutions.

Key Strengths:

End-to-end data lifecycle management.

Problem-solving in complex data ecosystems.

Building and optimizing high-performing, scalable data solutions.

Certifications:

Certified in Informatica PowerCenter 7.1.1

EDUCATION & TRAINING

Bachelor of Engineering from Delhi College of Engineering, Delhi University, New Delhi.

Software Induction Training at TATA Consultancy Services.

PERSONAL ATTRIBUTES:

Excellent communication and interpersonal skills; collaborates with customer teams to identify and remedy quality issues.

COMPUTER SKILL SUMMARY

ETL Tools: IDMC / Informatica PowerCenter 10.4/9.6/9.1/8.6.1/7.1.1/6.x/5.x / IDQ 9.6.1

Database: Snowflake / Oracle 11g/9i/8.x, SQL Server 2008, Teradata 13.11, Sybase

Languages: PL/SQL (8.X), SQL*Plus, Pro*C and UNIX Shell Scripting

Reporting Tools: Business Objects

Other tools: Rapid SQL, MS Office, MS Project/Visio, MS PowerPoint, Putty

Operating Systems: UNIX, Windows 7 / XP / NT

Version Control: Perforce, ClearCase, Visual SourceSafe

Scheduling Tools: Autosys, Control-M

WORK EXPERIENCE:

Project Corporate Investment Banking Enterprise Data Warehouse

Employer Santander Bank, 120 Crawford Corner Rd, Holmdel NJ

Duration Nov ’19 – Present

Role ETL/Informatica Lead for Enterprise Data Warehousing

Team Size 25

Responsibilities:

Led the team developing ETL applications that load Finance and Loans data into the enterprise data warehouse.

Responsible for code migration from Informatica PowerCenter 10.5 to IDMC (with Snowflake) and for testing across all applications; this included redesign, dependency mapping, feasibility checks, functional testing, manual adjustments, and other work required for IDMC.

Used Informatica PowerCenter 10.5 for application development and enhancements, moving data from various OLTP databases and other applications to the data mart and downstream applications.

Executed the full re-engineering life cycle (analysis, problem detection, and problem resolution).

Worked on star schema design using data warehouse concepts such as fact and dimension tables.

Analyzed complex Informatica mappings with extensive use of Lookup, Aggregator, Union, Filter, Router, Normalizer, Joiner, and Sequence Generator transformations.

Created and used parameter files to run different load processes through the same mapping logic (a sample parameter file is sketched after this list).

Performed unit testing to validate mappings and populate the database.

Defined the Target Load Order Plan and constraint-based loading to load data appropriately into multiple target tables.

Used a range of workflow tasks (Session, Assignment, Command, Decision, Email, Event-Raise, Event-Wait, Control).

Developed UNIX shell scripts to automate the scheduling process.

Performed unit, integration, user acceptance, and production parallel testing.

Documented the changes for reference.
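
A minimal sketch of the parameter-file approach referenced above, assuming a hypothetical folder FIN_DW, workflow wf_load_loans, and session s_m_load_loans; connection names, mapping parameters, and paths are illustrative only, not the project's actual values:

[FIN_DW.WF:wf_load_loans.ST:s_m_load_loans]
$DBConnection_Source=LOANS_OLTP
$DBConnection_Target=EDW_SNOWFLAKE
$InputFile_Loans=/data/inbound/loans_daily.csv
$$LOAD_DATE=2025-01-31
$$REGION_CODE=NE

Swapping in a different parameter file at run time (for example, one per region or per load date) lets the same mapping and workflow drive several load processes without code changes.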

Project DBFeeds Application Data Center

Employer Deutsche Bank, 60 Wall St. NYC

Duration Oct ’17 – Nov ’19

Role ETL/Informatica Lead for Shared Services (Assistant Vice President)

Team Size 35

Responsibilities:

Led the team supporting the DB Feeds applications, which primarily cover Finance, TLM, and platform support.

Provided shared-services support for 40+ applications.

Used Informatica PowerCenter 9.6 for bug fixes across feeds from various OLTP databases and other applications to the data mart and downstream applications.

Executed the full re-engineering life cycle (analysis, problem detection, and problem resolution).

Worked on star schema design using data warehouse concepts such as fact and dimension tables.

Analyzed complex Informatica mappings with extensive use of Lookup, Aggregator, Union, Filter, Router, Normalizer, Joiner, and Sequence Generator transformations.

Created and used parameter files to run different load processes through the same mapping logic.

Performed unit testing to validate mappings and populate the database.

Defined the Target Load Order Plan and constraint-based loading to load data appropriately into multiple target tables.

Used a range of workflow tasks (Session, Assignment, Command, Decision, Email, Event-Raise, Event-Wait, Control).

Developed UNIX shell scripts to automate the scheduling process.

Performed unit, integration, user acceptance, and production parallel testing.

Documented the changes for reference.

Project Business Intelligence; Sales and Finance Data

Client Automatic Data Processing, Inc. (ADP, NJ)

Duration Mar ’16 – Oct ’17

Role ETL/Informatica Development

Team Size 30

Responsibilities:

Requirement gathering, analysis, and high-level/low-level system design.

Used Informatica PowerCenter 9.6 for migrating data from various OLTP databases and other applications to the data mart.

Executed the full re-engineering life cycle (analysis, problem detection, and problem resolution).

Worked with different sources such as relational databases, mainframe (COBOL), XML, and flat files (CSV).

Created star schema designs using data warehouse concepts such as fact and dimension tables (a DDL sketch appears after this list).

Created complex Informatica mappings with extensive use of Lookup, Aggregator, Union, Filter, Router, Normalizer, Joiner, and Sequence Generator transformations.

Created and used parameter files to run different load processes through the same mapping logic.

Performed unit testing to validate mappings and populate the database.

Defined the Target Load Order Plan and constraint-based loading to load data appropriately into multiple target tables.

Used a range of workflow tasks (Session, Assignment, Command, Decision, Email, Event-Raise, Event-Wait, Control).

Exposure to Master Data Management concepts and methodologies.

Developed UNIX shell scripts to automate the scheduling process.

Performed unit, integration, user acceptance, and production parallel testing.
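
A minimal sketch of the fact/dimension structure referred to above, in generic Oracle-style SQL; the table and column names are illustrative only, not taken from the ADP data mart:

-- Illustrative dimension table: one row per product, keyed by a surrogate key
CREATE TABLE dim_product (
    product_key   NUMBER        PRIMARY KEY,
    product_code  VARCHAR2(20),               -- natural/business key
    product_name  VARCHAR2(100),
    product_line  VARCHAR2(50)
);

-- Illustrative fact table: one row per sale; the foreign key is also why
-- constraint-based loading must populate dimensions before the fact table
CREATE TABLE fact_sales (
    sale_id       NUMBER        PRIMARY KEY,
    product_key   NUMBER        REFERENCES dim_product (product_key),
    date_key      NUMBER,                     -- would reference a date dimension
    sales_amount  NUMBER(12,2),
    units_sold    NUMBER
);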

Project Data Masking Management System

Client Morgan Stanley (Manhattan, NY)

Duration Aug ’15 – Mar ’16

Role ETL/Informatica Analyst

Team Size 27

Responsibilities:

Masked highly sensitive data when copying it from production to non-production environments (a simplified masking sketch appears after this list).

Requirement gathering, analysis, and high-level/low-level system design.

Used Informatica PowerCenter 9.6 for migrating data from various OLTP databases and other applications to the data mart.

Executed the full re-engineering life cycle (analysis, problem detection, and problem resolution).

Worked with different sources such as relational databases, mainframe (COBOL), XML, and flat files (CSV).

Created star schema designs using data warehouse concepts such as fact and dimension tables.

Created complex Informatica mappings with extensive use of Lookup, Aggregator, Union, Filter, Router, Normalizer, Joiner, and Sequence Generator transformations.

Created and used parameter files to run different load processes through the same mapping logic.

Performed unit testing to validate mappings and populate the database.

Defined the Target Load Order Plan and constraint-based loading to load data appropriately into multiple target tables.

Developed UNIX shell scripts to automate the scheduling process.

Performed unit, integration, user acceptance, and production parallel testing.

Attended weekly/daily scrum calls with the client and with groups across multiple locations.
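
A simplified illustration of the kind of static masking applied to non-production copies; in the project this was driven by Informatica masking rules, and the table and column names here are hypothetical:

-- Hypothetical example: scramble identifiers and redact card numbers
-- in a non-production copy of a customer table (Oracle SQL).
UPDATE customer_nonprod
   SET ssn         = LPAD(TRUNC(DBMS_RANDOM.VALUE(0, 999999999)), 9, '0'),
       email       = 'user' || customer_id || '@example.com',
       card_number = '************' || SUBSTR(card_number, -4);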

Project Control Room Trade Surveillance

Client Barclays Capital (Whippany, NJ)

Duration Oct ’14 – Aug ’15

Role ETL/Informatica Analyst

Team Size 35

Responsibilities:

Requirement gathering, analysis, and high-level/low-level system design.

Involved in the design, development, and implementation of the Convertible Bonds on-boarding feeds for trade surveillance.

Worked on a proof of concept for using MongoDB with Informatica.

Used Informatica PowerCenter 9.6.1 for migrating data from various OLTP databases and other applications to the data mart.

Executed the full re-engineering life cycle (analysis, problem detection, and problem resolution).

Worked with different sources such as relational databases, mainframe (COBOL), XML, and flat files (CSV).

Created star schema designs using data warehouse concepts such as fact and dimension tables.

Applied Ralph Kimball methodology, star schema modeling, and snowflake modeling to design fact and dimension tables (a snowflaked-dimension sketch appears after this list).

Created complex Informatica mappings with extensive use of Lookup, Aggregator, Union, Filter, Router, Normalizer, Joiner, and Sequence Generator transformations.

Created and used parameter files to run different load processes through the same mapping logic.

Performed unit testing to validate mappings and populate the database.

Defined the Target Load Order Plan and constraint-based loading to load data appropriately into multiple target tables.

Used a range of workflow tasks (Session, Assignment, Command, Decision, Email, Event-Raise, Event-Wait, Control).

Developed UNIX shell scripts to automate the scheduling process.

Performed unit, integration, user acceptance, and production parallel testing.

Attended weekly/daily scrum calls with the client and with groups across multiple locations.

Documented the changes for reference.
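
For the snowflake modeling mentioned above, a dimension is normalized into related tables rather than flattened into one wide star dimension; a hypothetical Oracle-style sketch with illustrative names:

-- Hypothetical snowflaked dimension: instrument attributes are split into
-- a separate category table instead of being denormalized into one row.
CREATE TABLE dim_category (
    category_key   NUMBER       PRIMARY KEY,
    category_name  VARCHAR2(50)
);

CREATE TABLE dim_instrument (
    instrument_key  NUMBER      PRIMARY KEY,
    instrument_name VARCHAR2(100),
    category_key    NUMBER      REFERENCES dim_category (category_key)
);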

Project MFX Strategic Insurance Platform

Client MFX FAIRFAX (Morristown, NJ)

Duration Dec ‘07 – Sep ‘14

Role Lead ETL/Informatica Analyst

Team Size 12

Responsibilities:

Worked with the TPA client/users and the business analyst team to gather requirements and specifications for the client's insurance claims and policies data.

Analyzed the functional aspects and translated them into mapping design documents between the client and MRAM interfaces, adhering to the MRACIS standard.

Data modeling and database planning and design.

Created packages, stored procedures and functions, triggers, and views to perform error handling and validation (a PL/SQL sketch appears after this list).

Designed and developed Informatica mappings and workflows, Oracle stored functions, and Oracle views to integrate multiple heterogeneous sources, both external and internal to the organization.

Automated workflows, created pre- and post-session commands, and ran scheduled jobs using UNIX shell scripting.

Created complex Informatica mappings with extensive use of Aggregator, Union, Filter, Router, Normalizer, Joiner, and Sequence Generator transformations.

Extensively used PL/SQL for the creation of stored procedures.

Performed performance tuning of targets, sources, mappings, and sessions, and applied pipeline partitioning.

Performed unit testing to validate mappings and populate the database.

Defined the Target Load Order Plan and constraint-based loading to load data appropriately into multiple target tables.

Performed unit, integration, user acceptance, and production parallel testing.
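
A minimal sketch of the error-handling pattern such packages and procedures typically follow; the table and procedure names are illustrative, not taken from the MFX codebase:

-- Hypothetical procedure: validate and load one claim row, routing
-- failures to an error table instead of aborting the whole load.
CREATE OR REPLACE PROCEDURE load_claim (p_claim_id IN NUMBER) AS
BEGIN
    INSERT INTO claims_dm (claim_id, policy_id, claim_amount)
    SELECT claim_id, policy_id, claim_amount
      FROM claims_stg
     WHERE claim_id = p_claim_id;
EXCEPTION
    WHEN OTHERS THEN
        INSERT INTO claims_load_errors (claim_id, error_msg, error_ts)
        VALUES (p_claim_id, SQLERRM, SYSTIMESTAMP);
END load_claim;
/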

Project Financial Total Return

Client GE Asset Management (Stamford, CT)

Duration Apr ’05 – Nov ’07, Stamford, CT

Role Lead ETL/Informatica Analyst

Team Size 8

Responsibilities:

Requirement gathering, analysis, and high-level/low-level system design.

Involved in the design, development, and implementation of Performance and Attribution (P and A) and in building the Total Return data mart.

Used Informatica PowerCenter 7.1.1 for migrating data from various OLTP databases and other applications to the data mart.

Executed the full re-engineering life cycle (analysis, problem detection, and problem resolution).

Applied Ralph Kimball methodology, star schema modeling, and snowflake modeling to design fact and dimension tables.

Created complex Informatica mappings with extensive use of Aggregator, Union, Filter, Router, Normalizer, Joiner, and Sequence Generator transformations.

Extensively used PL/SQL for the creation of stored procedures.

Performed performance tuning of targets, sources, mappings, and sessions, and applied pipeline partitioning.

Performed unit testing to validate mappings and populate the database.

Defined the Target Load Order Plan and constraint-based loading to load data appropriately into multiple target tables.

Developed UNIX scripts to automate the scheduling process.

Performed unit, integration, user acceptance, and production parallel testing.

Project Enterprise Data Warehouse Center of Excellence

Client GE Mid Marketing Finance (Danbury, CT)

Duration Jan ’04 – March ’05

Role ETL Analyst

Responsibilities:

Analyzed the data movement (ETL) process and procedures for manufacturing data. Identified and assessed external data sources as well as internal and external data interfaces.

Developed Informatica mappings to move data from various databases to Data warehouse (Reporting Database).

Implemented Join, Expression, Aggregate, Rank, Lookup, Update Strategy, Filter and Router Transformations, and Mapplets in Mappings.

Designed, developed, and maintained PL/SQL procedures and functions that were used along with Informatica mappings to populate Oracle tables.

Used TOAD to work with the database.

Extensively worked on tuning the mappings and workflows.

Responsible for user defined partitioning to improve the session performance.

Transferred files via FTP directly to the Informatica server and staged them in a local directory for the session run.

Project iPROCUREMENT

Client GE Plastics, Pittsfield

Duration Jan ’02 – Dec ’03

Role ETL Analyst

Responsibilities:

Developed strategies on ETL designing for data modifications and flow.

Developed mappings in Informatica to create data flow logic.

Coded Pro*C batches to implement data flow logic.

Created shell scripts to automate jobs and other processes.

Developed Stored procedures and other SQL queries required for the project.

Client P&O NL (UK, Netherlands)

Duration Nov ’00 – Dec ’01

Role Software Developer

Responsibilities:

Wrote Pro*C batches for the interface to ensure smooth data flow between the different layers of the system.

Project Enterprise Data Warehouse Centre of Excellence

Client GE Mid Marketing Finance (Danbury, CT)

Duration Jan ’00 – Oct ’00

Role Software Developer

Responsibilities:

Created shell scripts to automate jobs and other processes.

Developed stored procedures and other SQL queries required for the project.

Developed Pro*C batches for data flow between the different interfaces.
