
Informatica ETL

Location:
Morganville, NJ
Salary:
10
Posted:
April 27, 2021


Resume:

PROFESSIONAL SUMMARY:

Business Intelligence professional with around 8 years of experience across domains such as Insurance, Banking, and Manufacturing, with expertise in Informatica 10.2.0, 9.6.1, 9.1, and 8.6.x.

Strong working experience in Informatica PowerCenter 10.2.0, 9.6, 9.1, and 8.6.x, using transformations such as Lookup, Joiner, Expression, Aggregator, Transaction Control, Update Strategy, and Filter.

Experience in designing and implementing scalable, distributed ETL solutions on AWS and GCP (Amazon Web Services / Google Cloud Platform), leveraging technologies such as Informatica and MongoDB 4.0.

Designed ETL solutions, built with Informatica, to load the data warehouse from various source systems, creating a business-centric view that supports the business decision process.

Worked in Agile methodology across the entire software development life cycle (SDLC), including analysis, design, build, testing, and documentation.

Highly proficient in SQL, developing complex stored procedures, triggers, tables, views, user-defined functions, and joins, along with relational database models and data-integrity constraints.

Created indexes and indexed views in line with business rules, along with effective functions and appropriate triggers, to support efficient data manipulation and data consistency.

Executed data warehousing / OLTP database projects through the full software life cycle, from requirements gathering, data profiling, data modeling, and technical specifications to design, development, and implementation.

Strong knowledge of data warehousing methodologies and concepts, including star schemas, snowflake schemas, dimensional modeling, and reporting tools; worked with high-volume databases with partitioned data.

Extensive experience writing UNIX shell scripts to automate ETL processes; used the Connect:Direct MFT utility for file transfers from source to destination servers.
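As an illustration of the kind of shell automation described above (a hypothetical sketch; the directory layout and file names are invented, not taken from any actual project):

```shell
#!/bin/sh
# Hypothetical sketch of one ETL file-handoff step: archive each arrived
# source extract with a timestamp suffix so reruns never overwrite a
# prior day's file. Paths and file names are illustrative only.

STAGE_DIR=$(mktemp -d)              # landing area for incoming extracts
ARCHIVE_DIR="$STAGE_DIR/archive"
mkdir -p "$ARCHIVE_DIR"

# Simulate an incoming source extract delivered by the MFT transfer.
echo "id|amount" > "$STAGE_DIR/claims_20210427.dat"

for f in "$STAGE_DIR"/*.dat; do
    [ -f "$f" ] || continue                     # skip if no files matched
    mv "$f" "$ARCHIVE_DIR/$(basename "$f").$(date +%Y%m%d%H%M%S)"
done
```

In production such a step would typically run after the Connect:Direct transfer delivers files into the landing area and before the load workflow is triggered.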

Supported various ETL processes in production, performance-tuning Informatica mappings and sessions for optimal performance.

Experienced in creating SIT, UAT, and performance-testing and migration plans.

SKILLS:

Cloud Platform Tools: AWS batch, GCP Dataproc, AWS EMR, GCP Cloud Scheduler

Software Framework: Spark 2.4.5

Data Schema: Star Schema, Snowflake

O/S: Ubuntu, AIX Unix, Windows

Languages: Unix Shell Scripting, PL/SQL

Databases: Cloud Oracle 12c, Cloud MS SQL Server 15.0, DB2 v9.7, Hive 2.3.6, Cloud MySQL Database

Tools: Informatica 10.2/9.6/9.1.0/8.6.x, AutoSys, ER/Studio V9, Cisco Tidal Enterprise Scheduler, cron jobs, Control-M, CD MFT

Other: JIRA, Dimensional Data Modeling, PL/SQL, Bill Inmon and Ralph Kimball methodologies

Software: Visio 2003, HP Quality Center, MFT, crontab, Data Flow Diagrams, SharePoint, ServiceNow, Bitbucket

Development Model: Agile Development, Waterfall

Clusters: Spark Cluster

1199SEIU Funds, New York City, NY 09/2019 – Present

Informatica ETL Lead Developer

Description:

The 1199SEIU Funds are among the strongest and largest labor-management funds in the nation, providing a range of comprehensive benefits to 400,000 working and retired healthcare industry workers and their families.

Responsibilities:

Managed complex end-to-end data migration requirements, conversion, and data modeling for migrating the Mycroft data warehouse.

Worked closely with internal teams, partners, and customers to identify and define data migration requirements.

Developed the data migration plan for the Mycroft EDW, moving data from a MS SQL Server database to Oracle on Google Cloud Platform/AWS.

Completed data flow and interface diagrams on GCP depicting all upstream and downstream systems involved, in order to deliver the technical solution for the data migration requirements.

Designed and implemented DWH migration solutions to move data from a star schema to an Oracle NoSQL database. Source transactions are extracted from the V3 Eligibility System and loaded into Oracle using Informatica and Spark workflows.

Implemented various Informatica transformations, including Aggregator, Update Strategy, Lookup, Joiner, and Transaction Control, as well as mapplets and reusable transformations.

Prepared mapping documents for the Eligibility, Member Coverage, Person, COB, Address, Dependents, and Rx Claims tables to migrate data from V3 to Oracle.

Reviewed primary-flag requirements with the business team, clarifying questions on each scenario to prepare the logic for designing and implementing the primary flag.

Validated data field by field between the existing tables and the newly migrated tables using workflows, covering tables such as Eligibility, Member Coverage, Rx Claims, and lookup tables.
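A minimal sketch of the field-by-field reconciliation idea, using sorted flat-file extracts (the sample rows and file handling here are invented for illustration; the actual validation ran inside Informatica workflows):

```shell
#!/bin/sh
# Compare an extract of the existing table with an extract of the
# migrated table. Sorting first makes row order irrelevant; an empty
# diff means the two extracts reconcile exactly, field by field.

old=$(mktemp) ; new=$(mktemp)

# Sample pipe-delimited rows standing in for real table extracts.
printf 'M001|A|2021-01-01\nM002|B|2021-02-01\n' > "$old"
printf 'M002|B|2021-02-01\nM001|A|2021-01-01\n' > "$new"   # same rows, shuffled

sort "$old" > "$old.sorted"
sort "$new" > "$new.sorted"

if diff -q "$old.sorted" "$new.sorted" >/dev/null; then
    status=MATCH          # every row and every field agrees
else
    status=MISMATCH       # diff output would pinpoint the offending rows
fi
echo "Reconciliation: $status"
```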

Developed migration test plans; conducted unit testing of migrated data by comparing existing data against newly migrated data. Supported the QA team in regression and end-to-end testing scenarios.

Supported various ETL processes in production, performance-tuning Informatica mappings and sessions for optimal performance.

Prepared run books and scheduled jobs with dependencies in the Tidal Scheduler; also invoked scripts and workflows from Tidal jobs.
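A Tidal job typically calls a small wrapper script of this shape. The `pmcmd startworkflow` flags are Informatica's standard ones, but the service, folder, and workflow names below are hypothetical, and `pmcmd` itself is stubbed so the sketch runs without a PowerCenter installation:

```shell
#!/bin/sh
# Wrapper a scheduler job can call to start an Informatica workflow.
# The real pmcmd also needs credentials (-u/-p or a password variable);
# here pmcmd is a stub function so the script is runnable anywhere.

pmcmd() {
    echo "pmcmd $*"     # stand-in for the real binary
    return 0            # real pmcmd exits non-zero when the workflow fails
}

run_workflow() {
    # -wait blocks until the workflow completes, so the exit code reflects
    # the actual run status, which is what the scheduler keys off.
    pmcmd startworkflow -sv INT_SVC -d DOM_DEV -f FND_ELIG \
        -wait wf_load_eligibility
    rc=$?
    if [ "$rc" -eq 0 ]; then
        echo "wf_load_eligibility succeeded"
    else
        echo "wf_load_eligibility failed rc=$rc"
    fi
    return "$rc"
}

out=$(run_workflow)
```

Because the wrapper propagates the workflow's exit code, the scheduler can mark the job failed and raise its alert without any extra plumbing.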

Performed DR activities as part of production support on ETL systems and participated in DR planning and execution phases.

Worked with the support help desk to create Tivoli alerts for Tidal job failures and, as needed, scheduled meetings to discuss batch-failure causes, solutions, and workarounds to complete the batch on time.

Worked data-issue and batch-issue tickets opened by users in the ServiceNow tool.

Conducted knowledge transitions with the ETL development team on new enhancements and deployments, and prepared a system support document for quick reference.

Environment: Informatica PowerCenter 10.2, Spark 2.4.5, Oracle 12c, MS SQL Server, AWS Batch Scheduler, AWS EMR, GCP Dataproc, GCP Cloud Scheduler, cron jobs

Bank of Tokyo Inc (MUFG), Jersey City, NJ 01/2017 – 09/2019

Informatica Developer

Description:

MUFG Bank is Japan's largest bank and one of the world's largest, with offices throughout Japan and in 40 other countries. The bank builds long-term relationships with its customers, promotes real economic growth, and contributes to orderly capital markets that serve society.

Responsibilities:

Responsible for the solution design and development of Informatica efforts to build the ETL process for loading Trade Finance data from the T360 system into the DWH.

Designed and implemented the DWH integration architecture to source data from upstream systems and feed downstream systems. More than 20 interfaces were designed and implemented to bring Trade Finance data into the DWH. Transactions flow from T360 to MQ and then feed downstream systems, interacting with the GPP, OVS, and BESS systems for SWIFT/FED messages.

Designed and built the ETL processes to load the Core Banking Hub from various source systems, performing data cleansing and transformations using Informatica transformations such as Aggregator, Update Strategy, XML Parser, Lookup, Joiner, and Transaction Control, along with mapplets and reusable transformations.

Worked closely with business analysts to define conceptual and logical models of the DWH design, following the star-schema methodology of building the data warehouse.

Involved in the logical and physical data modeling process using the adopted modeling tools.

Designed and developed data flow and interface architecture diagrams depicting all upstream and downstream systems involved, in order to deliver the technical solution for the data requirements.

Built ETL to the recon data mart to create reconciliation reports for Trade Finance, Loan, IRIS, Oracle GL, OVS Accounts, OFSA, CMS accounts, AMH, check processing, account balances, etc.

Developed test cases and test plans; conducted unit testing of new and modified ETL programs. Supported the QA team in regression and end-to-end testing scenarios.

Environment: Informatica 10.2.0, Oracle 12c, Microsoft Project, Embarcadero ER/Studio 8.0, PuTTY, AIX UNIX, Microsoft SharePoint, Tidal Scheduler 6.3.8, shell scripts, Quest Toad 4.3, PL/SQL, CD MFT.

John Deere, Moline, IL 07/2012 – 12/2016

Informatica Developer

Description:

John Deere is the brand name of Deere & Company, an American corporation that manufactures agricultural, construction, and forestry machinery, diesel engines, drivetrains used in heavy equipment, and lawn care equipment.

Responsibilities:

Analyzed business requirements to design, architect, develop, and implement highly efficient and scalable ETL processes for inventory data.

Prepared, maintained, and published ETL documentation, including source-to-target mappings and business-driven transformation rules.

Designed and developed the data hub for EIM & DIT reporting requirements, supporting business users in the business decision process.

Architected, designed, and developed enterprise data warehouse components, including databases and ETL, using Informatica.

Designed and implemented a product data vault to extract, transform, and load product data with a history of changes to product attributes.

Supported the project teams during user acceptance testing activities and assisted in developing the post-deployment data maintenance plan.

Developed the data integration specifications for ETL and web, including data cleansing and data transformation rules.

Defined data/information architecture standards, policies, and procedures for the organization, structure, attributes, and nomenclature of data elements, and applied accepted data content standards to technology projects.

Environment: PowerDesigner 15, Informatica 7.1, AIX, shell scripts, Windows 8.1, WinSQL 3.5, Microsoft Visio 2003, ESP, SAS, COBOL, DB2, AutoSys.

EDUCATION:

Master of Computer Science


