ETL Informatica Developer

Location:
Irvine, CA, 92606
Posted:
May 17, 2022


Md Mizanur Rahman

ETL – Informatica Developer

Los Angeles, CA 90020

M +1-213-***-****

adq282@r.postjobfree.com

LinkedIn id: https://www.linkedin.com/in/md-rahman-585121121

Skype id: mdrahman.dev

Citizenship Status: US Citizen

EDUCATION:

Bachelor's in Computer Science

Southeast University

Dhaka, Bangladesh

Year: 2008

CGPA: 3.06 out of 4

SKILL SET:

ETL Tools:

Informatica Power Center 10.4.1/10.2/10.1/9.6.1

Operating System:

Windows 8/7/XP/2003, UNIX, Linux

RDBMS:

Oracle 11g/10g/9i, SQL Server

Languages & Scripting:

SQL, UNIX shell scripting

Web Technologies:

HTML, XML, JavaScript

Scheduling Tools:

Autosys

Other Tools:

TOAD, PuTTY, WinSCP, MS Office, ITSM BMC Remedy

SUMMARY:

Over 7 years of experience in development, implementation, and production support of data warehousing with Informatica Power Center 10.4.1, UNIX, Linux, and Autosys.

Expertise in RDBMS such as Oracle 11g/10g, SQL Server, and MS Access, using SQL and TOAD.

Developed and implemented complex business rules by creating mappings, mapplets, sessions, workflows, worklets, and reusable and non-reusable objects for data loads, data migration, and production support.

Knowledge of snowflake and star schemas.

Knowledge of the complete SDLC, including requirements gathering and analysis, development, testing, and end-to-end implementation.

Strong experience executing and scheduling workflows with the Informatica default scheduler and Autosys to load data from source to target.

Followed Agile methodology for the SDLC and used Scrum meetings for creative and productive work.

Extensively used various transformations such as Source Qualifier, Filter, Expression, Update Strategy, Lookup, Sequence Generator, Joiner, Router, Aggregator, Normalizer, Transaction Control, and Union.

Expertise in the properties and configuration of Informatica clients and utilities: Source Analyzer, Target Designer, Mapping Designer, Workflow Manager, Workflow Monitor, Mapplet, Worklet, and Transformation Developer.

Designed and developed complex mappings such as Slowly Changing Dimension Type 2 (SCD Type 2).

Good knowledge of performance tuning using partitioning, pushdown optimization, and CDC.

Experienced in requirements gathering, document analysis, planning, build, and maintenance.

Ability to meet deadlines and handle pressure while coordinating multiple tasks in a project environment; excellent communication, interpersonal, and analytical skills.

Experience coordinating with and leading offshore teams.

Perform network/system troubleshooting and transaction research to analyze and resolve endpoint problems in both test and production environments.

Self-directed in performing day-to-day tasks and providing administrative application support.

PROFESSIONAL EXPERIENCE:

Bank of America

Plano, TX

ETL/Informatica Developer: November 2021 – May 2022

Bank of America Corporation is an American bank holding company specializing in credit cards, home loans, auto loans, banking, and savings products. Measured by total assets and deposits, it is the second-largest bank holding company in the United States, with about 4,300 branches and 16,000 ATMs.

Responsibilities:

Designed and developed complex mappings using Lookup, Expression, Sequence Generator, Update Strategy, Aggregator, and Router transformations to implement complex logic.

Developed mappings and workflows using Informatica to load data between homogeneous and heterogeneous sources and targets such as relational tables, flat files, and Excel files.

Developed mappings, transformations, and mapplets using the Mapping Designer, Transformation Developer, and Mapplet Designer in Informatica Power Center.

Contributed actively to user story review meetings within an Agile Scrum environment.

Extensively worked on SQL overrides in Source Qualifier, Lookup, and Aggregator transformations for better performance.

Hands-on experience with query tools such as TOAD and with SQL Server.

Designed and developed stored procedures and tuned SQL queries for better performance.

Implemented Slowly Changing Dimension (SCD Type 2) logic in various mappings (see the SQL sketch at the end of this section).

Used the incremental aggregation technique to load data into aggregation tables for improved performance.

Created directories, parameter files, and shell scripts to interact with internal and external environments.

Created and maintained job groups, jobs, and job activity in Autosys to schedule workflows and provided Informatica job support.

Worked on different tasks in Workflow Manager such as Session, Event Raise, Event Wait, Decision, E-mail, Command, Assignment, and Timer, as well as workflow scheduling.

Experience writing UNIX shell scripts and automating ETL processes with them.

Environment: Informatica Power Center 9.6.1 (Repository Manager, Designer, Workflow Monitor, Workflow Manager), Oracle 11g/10g, SQL, UNIX, Erwin, Putty, WinSCP, Windows XP, Autosys.
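The SCD Type 2 loads above were built in Informatica with Lookup, Router, and Update Strategy transformations; purely as a rough sketch of the equivalent load logic, the SQL below shows a Type 2 load against hypothetical CUSTOMER_STG and CUSTOMER_DIM tables (all table, column, and sequence names are assumptions, not the actual project objects).

-- Step 1: expire the current dimension row when a tracked attribute changed in staging.
UPDATE customer_dim d
SET    d.eff_end_date = SYSDATE,
       d.current_flag = 'N'
WHERE  d.current_flag = 'Y'
AND    EXISTS (SELECT 1
               FROM   customer_stg s
               WHERE  s.customer_id = d.customer_id
               AND    (s.address <> d.address OR s.status <> d.status));

-- Step 2: insert a new current version for changed customers and for brand-new customers.
INSERT INTO customer_dim
       (customer_key, customer_id, address, status,
        eff_start_date, eff_end_date, current_flag)
SELECT customer_dim_seq.NEXTVAL, s.customer_id, s.address, s.status,
       SYSDATE, DATE '9999-12-31', 'Y'
FROM   customer_stg s
WHERE  NOT EXISTS (SELECT 1
                   FROM   customer_dim d
                   WHERE  d.customer_id = s.customer_id
                   AND    d.current_flag = 'Y'
                   AND    s.address = d.address
                   AND    s.status  = d.status);

In the actual mappings this logic is expressed through a lookup on the current dimension row, a router that splits new versus changed records, and insert/update strategies on the target.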

Deloitte

State of Florida

ETL/Informatica Developer: September 2021 – Present

Deloitte is a multinational professional services network with offices in over 150 countries and territories around the world. During this project I worked closely with the ETL team, customers, business analysts and other colleagues in the IT department to analyze operational data sources, determine data availability, define the data warehouse schema and develop ETL processes for the creation, maintenance and overall support of the data warehouse.

Responsibilities:

Offered Informatica expertise in designing mapping documents with the data architect, project lead, and manager.

Integrated data across homogeneous and heterogeneous source and target systems such as relational databases and flat files.

Extracted data from various sources such as flat files and Oracle and loaded it into target systems using Informatica Power Center.

Extensively worked on Informatica clients such as Source Analyzer, Target Designer, Transformation Developer, Mapplet Designer, and Mapping Designer to design, develop, and test complex mappings and mapplets that load data from external flat files and RDBMS sources.

Extensively used various transformations such as Source Qualifier, Joiner, Aggregator, Connected and Unconnected Lookup, Filter, Router, Expression, Rank, Union, Normalizer, XML, Update Strategy, and Sequence Generator.

Implemented restart strategy, debugging, root cause analysis, and error handling techniques to recover failed sessions.

Worked on the Autosys scheduler to create job groups and jobs and to schedule workflows.

Created UNIX shell scripts and parameter files to build user-friendly, maintainable, and reusable objects.

Environment: Informatica Power Center 10.5.1, XML, SQL Server, UNIX, Putty, Windows, Autosys.

Conduent

Florham Park, NJ

ETL Informatica Developer/Production Support Specialist: July 2021 – September 2021

Conduent is dedicated to making business decisions that reflect its commitment to improving the health and well-being of its members, associates, and the communities it serves. During this project I worked closely with the ETL team, customers, business analysts, and other colleagues in the IT department to analyze operational data sources, determine data availability, define the data warehouse schema, and develop ETL processes for the creation, maintenance, and overall support of the data warehouse.

Responsibilities:

Integrated data across homogeneous and heterogeneous source and target systems such as relational databases and flat files.

Extracted data from various sources such as flat files, XML, and Oracle and loaded it into target systems using Informatica Power Center.

Extensively worked on Informatica clients such as Source Analyzer, Target Designer, Transformation Developer, Mapplet Designer, and Mapping Designer to design, develop, and test complex mappings and mapplets that load data from external flat files and RDBMS sources.

Extensively worked on Informatica Repository Manager, Designer, Workflow Manager, and Workflow Monitor.

Imported and exported tables and Informatica objects between the database, Informatica, and the desktop.

Performed code migration from DEV to TEST in both the Informatica and UNIX environments and created migration documents.

Created ODBC and relational connections and file paths.

Extensively worked on performance tuning using Informatica best practices.

Implemented restart strategy, debugging, root cause analysis, and error handling techniques to recover failed sessions.

Integrated data from source systems into the stage database, load layer, and data warehouse using Informatica clients and processes.

Extensively used various transformations such as Source Qualifier, Joiner, Aggregator, Connected and Unconnected Lookup, Filter, Router, Expression, Rank, Union, Normalizer, Update Strategy, and Sequence Generator.

Wrote SQL overrides and performed data validation in the data warehouse and data mart (see the SQL sketch at the end of this section).

Worked on Autosys to create job groups and jobs, scheduled workflows, and proactively provided job support when required.

Created parameter files to build user-friendly, maintainable, and reusable objects.

Environment: Informatica Power Center 10.4.1 (Repository Manager, Designer, Workflow Monitor, Workflow Manager), Oracle SQL Developer, UNIX, Putty, WinSCP, Autosys, ITSM BMC Remedy, Windows 10.
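A typical data validation check behind the SQL override and validation work above reconciles the warehouse against the mart after a load; the query below is a minimal sketch using hypothetical DW_TRANSACTIONS and MART_TRANSACTIONS tables (table and column names are illustrative assumptions, not the project's actual objects).

-- Compare row counts and a summed measure per load date between the warehouse and the mart;
-- any row returned is a day whose totals do not match and needs investigation.
SELECT w.load_date,
       w.row_count    AS dw_rows,
       m.row_count    AS mart_rows,
       w.total_amount AS dw_amount,
       m.total_amount AS mart_amount
FROM   (SELECT load_date, COUNT(*) AS row_count, SUM(txn_amount) AS total_amount
        FROM   dw_transactions
        GROUP BY load_date) w
FULL OUTER JOIN
       (SELECT load_date, COUNT(*) AS row_count, SUM(txn_amount) AS total_amount
        FROM   mart_transactions
        GROUP BY load_date) m
  ON   w.load_date = m.load_date
WHERE  NVL(w.row_count, -1)    <> NVL(m.row_count, -1)
   OR  NVL(w.total_amount, -1) <> NVL(m.total_amount, -1);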

Wells Fargo

Fremont, CA

ETL Informatica Developer/Production Support Specialist: June 2020 – June 2021

Wells Fargo is one of the largest financial organizations focusing on consumers and commercial clients. This data mart is intended to help the Sales and Marketing department analyze sales and categorize customers based on a significant portfolio of services, including checking accounts, savings accounts, personal loans, and geographical area. Using different ad-hoc analyses, the warehouse is expected to support defining a strategy for each customer category.

Responsibilities:

Interacted with the Business Users in gathering and analyzing the Business requirements.

Worked on design, development, and testing of workflows and worklets according to the business process flow.

Created mappings using various transformations such as Update Strategy, Lookup, Stored Procedure, Router, Joiner, Sequence Generator, and Expression.

Extensively involved in writing ETL specifications for development and conversion projects.

Created technical design specifications for data extraction, transformation, and loading (ETL).

Extensively worked on change data capture and incremental loading of SCD Type I/II dimensions (see the SQL sketch at the end of this section).

Designed the ETL process and scheduled the stage and mart loads for the data mart.

Worked on Informatica utilities (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer) to define sources and targets, and coded the process from source systems to the data warehouse.

Extensively used transformations such as Stored Procedure, Connected and Unconnected Lookup, Update Strategy, Filter, and Joiner to implement business logic.

Extensively created reusable transformations and mapplets to standardize business logic.

Extracted data from various flat files, loaded it into the data warehouse environment, and wrote UNIX shell scripts to move the files across servers.

Used Informatica Workflow Manager to create workflows, database connections, sessions and batches to run the mappings.

Used Variables and Parameters in the mappings to pass the values between mappings and sessions.

Used Unix Shell Scripts to automate pre-session and post-session processes.

Created shortcuts for reusable source/target definitions, Reusable Transformations, mapplets in Shared folder.

Coordinated with the testing team to help them understand the business and transformation rules used throughout the ETL process.

Environment: Informatica Power Center 10.4.1, Oracle 11g, SQL, Windows, UNIX, IP Switch, Putty, WinSCP, Autosys, GitHub, ITSM BMC Remedy.
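In the mappings, change data capture and incremental loads were handled with Informatica features such as lookups, mapping variables, and SCD logic; as a simplified sketch of the general watermark pattern, the SQL below shows an incremental extract driven by a hypothetical ETL_CONTROL table (the table, column, and mapping names are assumptions for illustration only).

-- Pull only source rows changed since the previous successful run, using a stored watermark.
SELECT s.order_id,
       s.customer_id,
       s.order_amount,
       s.last_update_ts
FROM   src_orders s
WHERE  s.last_update_ts > (SELECT c.last_extract_ts
                           FROM   etl_control c
                           WHERE  c.mapping_name = 'm_load_orders');

-- After a successful load, advance the watermark so the next run starts from this point.
UPDATE etl_control
SET    last_extract_ts = SYSDATE
WHERE  mapping_name = 'm_load_orders';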

Farmers Insurance

Woodland Hills, CA

Informatica Developer/Production Support: May 2019 – March 2020

Farmers Insurance is one of the largest insurance firms in the world. The primary objective of the project was to construct an aggregated data warehouse (DWH) for analytics and reports used by the business, risk management, and asset management groups.

Responsibilities:

Designed, transformed, and loaded data from legacy systems into enterprise data warehouse tables using Informatica, loading the targets through scheduled ETL workflows.

Implemented appropriate error handling logic and data load methods, capturing invalid data from the source system for further data cleanup.

Developed shared objects in Informatica for source, target, and lookup transformations, as well as complex mappings, sessions, workflows, worklets, and database connections.

Entered metadata descriptions at both the transformation and port level in complex mappings.

Designed the ETL process and scheduled the stage and mart loads for the data mart.

Debugged and troubleshot Informatica mappings.

Developed wrapper shell scripts to call Informatica workflows using the pmcmd command and created shell scripts to fine-tune the ETL flow of the Informatica workflows.

Tuned SQL queries using Explain Plan, optimizer hints, and indexes (see the SQL sketch at the end of this section).

Analyzed the specifications and was involved in identifying the source data to be moved to the data warehouse.

Worked closely with the requirements manager and business analysts on the requirements collection process.

Assisted the tester in developing test cases and reviewed them.

Loaded data into snowflake and star schemas and migrated data from lower to upper environments.

Environment: Informatica Power Center 10.1.1 (Repository Manager, Designer, Workflow Monitor, Workflow Manager), Oracle SQL Developer 17.3, UNIX, Putty, WinSCP, Autosys, ITSM BMC Remedy, Windows 10.
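The query tuning noted above followed the usual Oracle workflow of reading the execution plan, adding supporting indexes, and applying hints only where needed; the snippet below is a generic sketch with hypothetical POLICY and CUSTOMER tables (all object names are illustrative, not the project's actual schema).

-- Inspect the execution plan for a slow lookup query.
EXPLAIN PLAN FOR
SELECT p.policy_id, p.premium_amt
FROM   policy p
JOIN   customer c ON c.customer_id = p.customer_id
WHERE  c.region_code = 'WEST';

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

-- A covering index on the filter and join columns often removes a full table scan.
CREATE INDEX ix_customer_region ON customer (region_code, customer_id);

-- As a last resort, a hint can force the intended access path.
SELECT /*+ INDEX(c ix_customer_region) */
       p.policy_id, p.premium_amt
FROM   policy p
JOIN   customer c ON c.customer_id = p.customer_id
WHERE  c.region_code = 'WEST';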

Kaiser Permanente

Pleasanton, CA

ETL/Informatica Developer: November 2018 – May 2019

Kaiser Permanente is an innovative leader in the health and well-being industry, dedicated to making business decisions that reflect its commitment to improving the health and well-being of its members, associates, and the communities it serves. During this project I worked closely with the ETL team, customers, business analysts, and other colleagues in the IT department to analyze operational data sources, determine data availability, define the data warehouse schema, and develop ETL processes for the creation, maintenance, administration, and overall support of the data warehouse.

Responsibilities:

Offered Informatica expertise in designing mapping documents with the data architect, project lead, and manager.

Integrated data across homogeneous and heterogeneous source and target systems such as relational databases and flat files.

Extracted data from various sources such as flat files and Oracle and loaded it into target systems using Informatica Power Center.

Extensively worked on Informatica clients such as Source Analyzer, Target Designer, Transformation Developer, Mapplet Designer, and Mapping Designer to design, develop, and test complex mappings and mapplets that load data from external flat files and RDBMS sources.

Extensively worked on Informatica Repository Manager, Designer, Workflow Manager, and Workflow Monitor.

Imported and exported tables and Informatica objects between the database, Informatica, and the desktop.

Performed code migration from DEV to TEST in both the Informatica and UNIX environments and created migration documents.

Created ODBC and relational connections and file paths.

Implemented restart strategy, debugging, root cause analysis, and error handling techniques to recover failed sessions.

Integrated data from source systems into the stage database, load layer, and data warehouse using Informatica clients and processes.

Extensively used various transformations such as Source Qualifier, Joiner, Aggregator, Connected and Unconnected Lookup, Filter, Router, Expression, Rank, Union, Normalizer, Update Strategy, and Sequence Generator.

Worked on the Tivoli scheduler to create job groups and jobs, scheduled workflows, and proactively provided job support when required.

Created parameter files to build user-friendly, maintainable, and reusable objects.

Loaded data into snowflake and star schemas and migrated data from lower to upper environments.

Environment: Informatica Power Center 10.2 (Repository Manager, Designer, Workflow Monitor, Workflow Manager), Oracle SQL Developer 17.3, UNIX, Putty, WinSCP, Autosys, Windows 10.

Capital One Financial

Madison, WI

ETL/Informatica Developer: March 2016 – November 2018

Capital One Financial Corporation is an American bank holding company specializing in credit cards, home loans, auto loans, banking, and savings products. Measured by total assets and deposits, Capital One is the eighth-largest bank holding company in the United States. The bank has 812 branches, including 10 café-style locations for its Capital One 360 brand, and 2,000 ATMs. Capital One Financial is ranked #112 on the Fortune 500 and also conducts business in Canada and the United Kingdom. The company helped pioneer the mass marketing of credit cards in the 1990s.

Responsibilities:

Involved in understanding business requirements, discussing them with business analysts, analyzing them, and helping the architect prepare business rules.

Designed and developed complex mappings using Lookup, Expression, Sequence Generator, Update Strategy, Aggregator, and Router transformations to implement complex logic.

Developed mappings and workflows using Informatica to load data between homogeneous and heterogeneous sources and targets such as relational tables, flat files, and Excel files.

Developed mappings, transformations, and mapplets using the Mapping Designer, Transformation Developer, and Mapplet Designer in Informatica Power Center.

Contributed actively to user story review meetings within an Agile Scrum environment.

Extensively worked on SQL overrides in Source Qualifier, Lookup, and Aggregator transformations for better performance.

Hands-on experience using SQL and query tools such as TOAD.

Designed and developed stored procedures and tuned SQL queries for better performance.

Implemented Slowly Changing Dimension (SCD Type 1 and 2) logic in various mappings.

Used the incremental aggregation technique to load data into aggregation tables for improved performance (see the SQL sketch at the end of this section).

Created directories, parameter files, and shell scripts to interact with internal and external environments.

Created and maintained job groups, jobs, and job activity in Autosys to schedule workflows and provided Informatica job support.

Worked on different tasks in Workflow Manager such as Session, Event Raise, Event Wait, Decision, E-mail, Command, Assignment, and Timer, as well as workflow scheduling.

Experience writing UNIX shell scripts and automating ETL processes with them.

Prepared ETL standards and naming conventions and wrote ETL flow documentation.

Environment: Informatica Power Center 9.6.1 (Repository Manager, Designer, Workflow Monitor, Workflow Manager), Oracle 11g/10g, SQL, UNIX, Erwin, Putty, WinSCP, Windows XP, Autosys.
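Incremental aggregation in Informatica is a session-level feature that caches aggregate results between runs so only new data is re-aggregated; expressed as plain SQL, the idea is roughly the MERGE below, which refreshes only the newly loaded date in a hypothetical DAILY_SALES_AGG summary table (table and column names, and the &load_date placeholder, are assumptions for illustration).

-- Merge aggregates computed only for the newly loaded business date into the summary table,
-- instead of rebuilding the whole aggregate.
MERGE INTO daily_sales_agg a
USING (SELECT sale_date,
              store_id,
              SUM(sale_amount) AS total_amount,
              COUNT(*)         AS txn_count
       FROM   sales_fact
       WHERE  sale_date = TO_DATE('&load_date', 'YYYY-MM-DD')
       GROUP BY sale_date, store_id) s
ON (a.sale_date = s.sale_date AND a.store_id = s.store_id)
WHEN MATCHED THEN
  UPDATE SET a.total_amount = s.total_amount,
             a.txn_count    = s.txn_count
WHEN NOT MATCHED THEN
  INSERT (sale_date, store_id, total_amount, txn_count)
  VALUES (s.sale_date, s.store_id, s.total_amount, s.txn_count);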

Humana Health Care

Los Angeles, CA

ETL/Informatica Developer: January 2015 – January 2016

Humana Health Care Group is an innovative leader in the health and well-being industry, dedicated to making business decisions that reflect its commitment to improving the health and well-being of its members, associates, and the communities it serves. During this project I worked closely with the ETL team, customers, business analysts, and other colleagues in the IT department to analyze operational data sources, determine data availability, define the data warehouse schema, and develop ETL processes for the creation, maintenance, administration, and overall support of the data warehouse.

Responsibilities:

Offered Informatica expertise in designing mapping documents with the data architect, project lead, and manager.

Integrated data across homogeneous and heterogeneous source and target systems such as relational databases and flat files.

Extracted data from various sources such as flat files and Oracle and loaded it into target systems using Informatica Power Center.

Extensively worked on Informatica clients such as Source Analyzer, Target Designer, Transformation Developer, Mapplet Designer, and Mapping Designer to design, develop, and test complex mappings and mapplets that load data from external flat files and RDBMS sources.

Integrated data from source systems into the stage database, load layer, and data warehouse using Informatica clients and processes.

Created shortcuts for reusable source/target definitions, reusable transformations, and mapplets in the shared folder.

Extensively used various transformations such as Source Qualifier, Joiner, Aggregator, Connected and Unconnected Lookup, Filter, Router, Expression, Rank, Union, Normalizer, XML, Update Strategy, and Sequence Generator.

Implemented restart strategy, debugging, root cause analysis, and error handling techniques to recover failed sessions.

Worked on the Autosys scheduler to create job groups and jobs, scheduled workflows, and proactively provided job support when required.

Created UNIX shell scripts and parameter files to build user-friendly, maintainable, and reusable objects.

Environment: Informatica Power Center 9.6.1, Mainframe, XML, SQL, Erwin 4.0, UNIX, Putty, Windows.

Jackson National

Lansing, MI

Informatica Developer: November 2013 – November 2014

Jackson National is one of the largest insurance firms in the world. The primary objective of the project was to construct an aggregated data warehouse (DWH) for analytics and reports used by business, risk management, asset management groups.

Responsibilities:

Designed and deployed the overall ETL strategy, including SCDs, partition management, materialized views, and other complex mapping logic (see the SQL sketch at the end of this section).

Transformed and loaded data from legacy systems into enterprise data warehouse tables using Informatica, loading the targets through scheduled ETL workflows.

Implemented appropriate error handling logic and data load methods, capturing invalid data from the source system for further data cleanup.

Developed shared objects in Informatica for source, target, and lookup transformations, as well as complex mappings, sessions, workflows, worklets, and database connections.

Entered metadata descriptions at both the transformation and port level in complex mappings.

Designed the ETL process and scheduled the stage and mart loads for the data mart.

Debugged and troubleshot Informatica mappings.

Developed wrapper shell scripts to call Informatica workflows using the pmcmd command and created shell scripts to fine-tune the ETL flow of the Informatica workflows.

Tuned SQL queries using Explain Plan, optimizer hints, and indexes.

Analyzed the specifications and was involved in identifying the source data to be moved to the data warehouse.

Assisted the tester in developing test cases and reviewed them.

Loaded data into snowflake and star schemas and migrated data from lower to upper environments.

Environment: Informatica Power Center 8.1, Oracle 9i, UNIX shell scripts, WinSCP, Putty.
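The materialized views mentioned above were used to pre-aggregate reporting data so reports do not recompute the same summaries on every query; the statement below is a generic Oracle-style sketch with a hypothetical CLAIMS_FACT source and MV_CLAIMS_BY_REGION view (all names are illustrative, not the project's actual objects).

-- Pre-aggregated reporting view, refreshed after each warehouse load rather than
-- recomputed by every report query.
CREATE MATERIALIZED VIEW mv_claims_by_region
BUILD IMMEDIATE
REFRESH COMPLETE ON DEMAND
AS
SELECT region_code,
       claim_year,
       SUM(claim_amount) AS total_claims,
       COUNT(*)          AS claim_count
FROM   claims_fact
GROUP BY region_code, claim_year;

-- Refreshed from the post-load step, for example via:
-- EXEC DBMS_MVIEW.REFRESH('MV_CLAIMS_BY_REGION', 'C');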


