
Suraj Marmal

Phone: +*732-***-**** Email: adln7i@r.postjobfree.com

Informatica PowerCenter, Informatica Intelligent Cloud Services (IICS) & Informatica Big Data Management (BDM)

Summary of Expertise:

13+ years of experience in IT with a strong background in Data Warehousing, Business Intelligence, and ETL processes using Informatica PowerCenter 10.2, Informatica BDM 10.2.1, Informatica Intelligent Cloud Services (IICS), and Oracle SQL and PL/SQL

Experience in Big Data solutions using Hadoop components such as Hive, Sqoop, and HDFS

Experience creating data mappings and workflows with Informatica PowerCenter to extract, transform, and load data into target databases

Worked extensively on the Data Integration and Application Integration components of IICS

Working experience with AWS cloud technologies, including S3 and Redshift, and with Parquet and JSON file formats

Extracted data from legacy systems and loaded it into AWS Redshift

Extensively used Informatica PowerCenter Designer, Workflow Manager, Repository Manager, and Workflow Monitor

Worked extensively with complex mappings using transformations such as Source Qualifier, Expression, Filter, Joiner, Router, Union, Unconnected/Connected Lookup, Stored Procedure, and Aggregator

Experience developing real-time applications using big data streaming technologies such as Kafka

Hands-on experience in performance tuning of sources, targets, transformations, and sessions

Expertise in Informatica session partitioning and Oracle parallel query processing techniques

Expertise in developing SQL and PL/SQL code, using procedures, functions, and packages to implement business logic in the database server (see the sketch after this list)

Experience in UNIX shell and Perl scripting and in file management across various UNIX environments

Experience in all phases of the software development life cycle, including requirements analysis, design, development, coding, debugging, testing, implementation, warranty, and production support

Good knowledge of data warehousing concepts and quick adaptability to changing trends and technologies

Worked with source systems including Oracle, SQL Server, flat files, Teradata, SAP, and DB2 on the mainframe

Prepared technical design and understanding documents

Provided production support per project requirements; monitored and fixed production issues

Responsible for understanding data sources/systems and business needs in order to prepare high-level and low-level design documents (HLDs/LLDs)

Responsible for design, development, and migration of objects from lower to higher environments

Worked extensively on data warehousing, data migration, data conversion, and data profiling projects for highly reputed clients in the healthcare, telecom, retail, manufacturing, and banking domains
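
As an illustration of the PL/SQL work noted in this list, here is a minimal sketch of a load procedure; it is a hypothetical example, and the table and column names (stg_orders, dim_orders) are not drawn from any project below:

    CREATE OR REPLACE PROCEDURE load_dim_orders AS
    BEGIN
      -- Merge staged rows into the dimension table (hypothetical names)
      MERGE INTO dim_orders d
      USING (SELECT order_id, order_status, order_amt FROM stg_orders) s
      ON (d.order_id = s.order_id)
      WHEN MATCHED THEN
        UPDATE SET d.order_status = s.order_status,
                   d.order_amt    = s.order_amt
      WHEN NOT MATCHED THEN
        INSERT (order_id, order_status, order_amt)
        VALUES (s.order_id, s.order_status, s.order_amt);
      COMMIT;
    END load_dim_orders;
    /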

Technical Expertise:

ETL TOOLS

Informatica PowerCenter 10.x/9.x/8.x, Informatica BDM 10.x/9.x

HADOOP TOOLS

Hive, HDFS, Sqoop, Kafka, Oozie and HBase

DATA QUALITY TOOLS

Trillium Data Quality v15.7/v7

DATABASES

Oracle 11g, DB2 9.7, MS SQL Server, Sybase 16

OTHER TOOLS

CDH 5.x, GitHub, Jenkins, Jira, HP Quality Center, IBM Rational ClearCase, IBM Mainframe

PROGRAMMING

SQL, PL/SQL, Shell, Python

OPERATING SYSTEMS

WINDOWS XP/NT/2007, UNIX

DATA MODELING

Star schema modeling, snowflake schema modeling, fact and dimension modeling

Certifications:

Certification (Date Certified)

Informatica Developer 8.x (Dec 2011)

OBIEE 11g (Jan 2013)

PowerCenter Data Integration 10 Developer, Specialist Certification (Aug 2019)

UiPath Robotic Process Automation (Apr 2019)

AWS Certified Solutions Architect - Associate (Apr 2020)

PROJECT DETAILS:

Client: BMW NA Location: Woodcliff Lake, NJ, USA

Project Name: TEP – 53 Duration: Aug’19 - present

Role: ETL Lead

Description: The client is one of the world's largest automobile manufacturers and a leading global supplier of cars and of technologies for automotive and commercial vehicles. The client delivers real-world innovations that make its products smarter, safer, more powerful, and more efficient: the ultimate driving machine.

Responsibilities:

Extracted data from legacy systems and loaded it into AWS Redshift (see the COPY sketch at the end of this section)

Designed the ETL architecture and load strategy for the big data solution using Hadoop components such as Hive, Sqoop, and HDFS

Worked with AWS cloud technologies, including S3 and Redshift, and with Parquet and JSON file formats

Worked extensively on the Data Integration component of IICS

Migrated data from legacy systems to the AWS cloud using IICS

Hands-on experience migrating Informatica PowerCenter assets to IICS (Informatica Cloud)

Corrected migrated mappings in IICS after migration from PowerCenter

Created mapping tasks and taskflows to run and schedule jobs

Migrated data from on-premises big data systems, relational databases (including PostgreSQL), data lakes, and data warehouses to AWS-native services

Developed Informatica transformations using PowerCenter and BDM

Created Hive DDL and HQL scripts, implemented partitioning, bucketing, and compaction on Hive tables, and worked with different table storage formats (see the Hive sketch after this list)

Created complex BDM mappings using transformations such as Data Processor, Complex File Reader, and Java

Configured dynamic mappings and a custom framework for customized workflows/mappings and parameter generation, and guided project teams in executing the framework

Provided Informatica administration support and monitored the tools
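
A minimal sketch of the partitioned and bucketed Hive DDL described in this list; the table name, columns, and bucket count are hypothetical:

    CREATE TABLE IF NOT EXISTS vehicle_events (
      vin        STRING,
      event_type STRING,
      event_ts   TIMESTAMP
    )
    PARTITIONED BY (event_date STRING)  -- enables partition pruning by load date
    CLUSTERED BY (vin) INTO 32 BUCKETS  -- bucketing co-locates rows per vehicle
    STORED AS ORC
    TBLPROPERTIES ('transactional'='true');  -- ACID tables support compaction

Compaction then applies to the ACID table via ALTER TABLE ... COMPACT ('minor' or 'major').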

Environment: Informatica BDM 10.x, PowerCenter 10.x, Informatica Intelligent Cloud Services (IICS), Sqoop, Hive, StreamSets, HDFS, Oracle 11g/12c, HDP 2.6.x, Kafka, UC4
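
The S3-to-Redshift Parquet load described in this project typically reduces to a COPY statement; a hedged sketch, with hypothetical bucket, IAM role, and table names:

    COPY analytics.vehicle_events
    FROM 's3://example-bucket/landing/vehicle_events/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy-role'
    FORMAT AS PARQUET;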

Client: KBR Inc. Location: Houston, TX, USA

Project Name: ODS Support and Enhancement Duration: Jul’17 – Aug’19

Role: ETL Consultant

Description: The client is a leading global engineering, construction, and services company supporting the energy, petrochemicals, government services, and civil infrastructure sectors. The client was spinning off as an independent company, separating from its parent group; a company-wide separation program had to be created to establish independent business processes, infrastructure, and applications.

This project involved migrating an Operational Data Store (ODS) between the two companies: analyzing the processes and data in the ODS, evaluating data and reporting requirements, separating the data and processes specific to each company, and redesigning the data feeds from various sources.

Responsibilities:

Designed the ETL architecture and load strategy for the big data solution

Worked with Informatica PowerCenter and Big Data Management, creating mappings/applications that consume data from sources such as Hive, Oracle, Teradata, and flat files

Configured dynamic mappings and a custom framework for customized workflows/mappings and parameter generation, and guided project teams in executing the framework

Developed Sqoop scripts/jobs to import/export data between HDFS/Hive and Oracle/Teradata databases (see the sketch after this list)

Coordinated with the team to delegate work, resolve questions, and review code

Participated in Scrum meetings, stand-up calls, and SME meetings, and reported daily status across all teams

Involved in Informatica upgrade activities and automation script creation

Provided production support for data quality projects, including defect analysis and fixes

Worked closely with data stewards on the data quality improvement process and supported them in achieving their goals

Created PowerCenter mappings/workflows to load data from sources such as Oracle and flat files

Worked extensively on the Data Integration component of IICS

Migrated data from legacy systems to the AWS cloud using IICS
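
A minimal sketch of the kind of Sqoop import described in this list; the JDBC URL, credentials handling, and table names are hypothetical:

    #!/bin/sh
    # Import an Oracle table into Hive (hypothetical names throughout)
    sqoop import \
      --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
      --username etl_user \
      --password-file /user/etl/.ora_pass \
      --table ODS.EMPLOYEES \
      --hive-import \
      --hive-database ods \
      --hive-table employees \
      --num-mappers 4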

Environment: Informatica BDE, Informatica Intelligent Cloud Services (IICS), PowerCenter 10.1, Sqoop, Hive, HDFS, Oracle 11g/12c, CDH 5.8/HDP 2.6.x, Kafka, UC4

Client: Delphi Technology Inc Location: Bangalore, KA, India

Project Name: Delphi Technology Duration: Jun’12 - Jun’17

Role: Software Engineer

Description: The client is the largest dental plan system in the United States. The Delta Dental Plans Association is composed of 39 independent Delta Dental member companies operating in all 50 states, the District of Columbia, and Puerto Rico. Delta Dental member companies serve more than one third of the estimated 166 million Americans with dental insurance. Delta Dental of Massachusetts and DentaQuest (a competitor) are subsidiaries of Dental Service of Massachusetts.

Responsibilities:

Studied and analyzed different SAP data domains such as Customer, Platform, Supplier, Organization, COA, Material, and Common

Profiled data for nulls, blanks, zeros, unprintable characters, unexpected default values, all-9s values, etc.

Checked that new tables were created correctly, with data types and constraints as specified in the design and/or the data model

Checked that target data types matched source data types

Developed SQL queries to compare record counts between source and target data


Prepared the analysis and design documents for Informatica mappings.

Monitored and fixed the production issues.

Responsible for reviewing and testing all objects and entities per the standards and guidelines, consistent with the checklist process

Participated in daily status calls and technical discussions with the onshore team.

Good understanding of requirements and test data needs for testing the application.

Checked that data was transformed correctly according to business requirements and rules

Checked the mapping of fields at the staging level

Checked that all projected data was loaded into the data warehouse without data loss or truncation

Checked for duplication of values generated by the Sequence Generator

Checked data type constraints of fields at the staging and core levels

Developed SQL queries to check row counts of source and target tables (see the sketch after this list)

Checked that the ETL application appropriately rejected invalid data, replaced it with default values, and reported it

Checked that no columns in the target table contained null values
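
A minimal sketch of the count-comparison and duplicate checks described in this list; the schema and table names (stg.claims, dw.claims) and the surrogate key column are hypothetical:

    -- Compare row counts between source and target (Oracle)
    SELECT (SELECT COUNT(*) FROM stg.claims) AS source_count,
           (SELECT COUNT(*) FROM dw.claims)  AS target_count
    FROM dual;

    -- Flag duplicate surrogate keys produced by the sequence generator
    SELECT claim_sk, COUNT(*) AS dup_count
    FROM dw.claims
    GROUP BY claim_sk
    HAVING COUNT(*) > 1;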

Environment: Informatica PowerCenter, Trillium Data Quality, SAP, Oracle 10g/11g

Client: Harleysville Group Inc Location: Bangalore, KA, India

Project Name: HG-Claim Center Duration: May’11 - Jun’12

Role: Software Engineer

Description: The client is one of the leading insurance service providers in the US. The client offers a wide variety of insurance products for businesses as well as a number of customized programs for certain customer segments. The client provides small to mid-size nonprofit organizations with valuable coverage to protect individual directors, officers, and the organization itself against a wide range of "professional" liability exposures.

Responsibilities:

Responsible for preparing the analysis and design documents for Informatica mappings

Performed manual data loads and reconciled data

Understood the ClaimCenter, PolicyCenter, and BillingCenter data dictionaries in Guidewire

Created/reviewed technical design documents and functional documents

Prepared/reviewed high-level and low-level solution designs

Involved in the documentation process based on the requirement specifications

Interacted with clients and on-site coordinators to understand requirements and clarify issues

Developed mappings, conducted unit tests and documented results

Environment: Informatica PowerCenter 8.x, Oracle 10g, Perl scripting, Teradata, UC4, Windows XP Professional

Client: Cisco Location: Bangalore, KA, India

Project Name: Cisco EDW2B Project Duration: Jun’10 - May’11

Role: Software Consultant

Description: The EDW2B project was initiated to build the next-generation Enterprise Data Warehouse on the Teradata platform, providing a complete infrastructure and foundation for Cisco's data warehouse. The new initiative provides a single, integrated collection of data offering an integrated view of Cisco's business, customers, products, sales, etc.

Cisco's existing Enterprise Data Warehouse (EDW) stores the historical and operational data of Cisco Systems, Inc. for measures such as bookings, revenue, and expenses.

Responsibilities:

Created mappings and optimized them for better performance

Developed mappings as per ETL Specifications.

Developed technical and promotion documents and delivered them to the client

Created mappings using transformations like Lookup, Joiner, Aggregator, Expression, Filter, Router, Sequence Generator and Update Strategy.

Documented all processes and maintained standards per CMM Level 5

Coordinated with the development and testing teams

Migrated mappings, sessions, and workflows between test and production environments

Performed unit testing and tuning of mappings and sessions

Designed workflows and sessions

Environment: Informatica PowerCenter 8.x, Oracle 10g/11g, SQL, shell scripting, Windows XP Professional

Client: Procter & Gamble Co Location: Bangalore, KA, India

Project Name: P&G Data Mart Duration: Apr’09 - Jun’10

Role: Consultant

Description: Procter & Gamble Co. (P&G) is an American Fortune 500 corporation that manufactures a wide range of consumer goods. As of 2010, P&G ranked 6th on Fortune's Most Admired Companies list and was the 23rd largest US company by revenue and the 14th largest by profit.

The main objective of this project was to migrate the existing warehouse, the Common Data Warehouse (CDW), to a new warehouse, the Atomic Data Warehouse (ADW), to reduce cost, maintenance, and servers. CDW hosted 21 data warehouse applications operating in the USA, UK, Asia, and Europe, and each application could have one or more instances depending on the number of regions in which it operated. The project migrated all applications, running in different instances across all regions, into one global ADW instance.

Responsibilities:

Designed mappings and mapplets using various transformations

Created Informatica ETL mappings using transformations such as Source Qualifier, Filter, Aggregator, Expression, Lookup, Sequence Generator, Router, and Update Strategy

Extensively involved in business and functionality requirement analysis, understanding source systems thoroughly by preparing the design process flow documentation.

Prepared technical specifications for developing the extraction, transformation, and loading of data into various stage tables

Designed ETL processes in Informatica to load data from Oracle source systems and flat files into the Oracle target database

Debugged mappings using the Informatica Debugger

Performed unit testing and tuning of mappings and sessions

Designed workflows and sessions

Environment: Informatica PowerCenter 8.x, Oracle 10g/11g, SQL, shell scripting, Windows XP Professional


