
ETL Informatica Developer

Location: Jersey City, NJ


PRAHARSHINI

Cell: 469-***-**** Email: adlmm2@r.postjobfree.com

Professional Summary:

•8+ years of extensive experience using ETL methodologies to support data extraction, data migration, and data transformation, and to develop master data, using Informatica PowerCenter and Teradata.

•Created mappings in Mapping Designer to load data from various sources using transformations such as Transaction Control, Lookup (connected and unconnected), Router, Filter, Expression, Aggregator, Joiner, Update Strategy, SQL, Stored Procedure, and more.

•3+ years of experience designing, developing, testing, reviewing, and optimizing Informatica MDM (Siperian).

•Excellent knowledge of Slowly Changing Dimensions (SCD Type 1, SCD Type 2, SCD Type 3), Change Data Capture, dimensional data modelling, the Ralph Kimball approach, star/snowflake modelling, data marts, OLAP, fact and dimension tables, and physical and logical data modelling.

•Involved in the Analysis, Design, Development, Testing and Implementation of business application systems for Health care, Pharmaceutical, Financial, Telecom and Manufacturing Sectors.

•Designed, installed, and configured core Informatica/Siperian MDM Hub components such as Informatica MDM Hub Console, Hub Store, Hub Server, Cleanse Match Server, Cleanse Adapter, and Data Modeling.

•Strong understanding of OLAP and OLTP Concepts.

•Excellent at designing ETL procedures and strategies to extract data from heterogeneous source systems such as Oracle 11g/10g, SQL Server 2008/2005, DB2 10, flat files, XML, and SAP R/3.

•Experience in SQL, PL/SQL, AWS S3, and UNIX shell scripting.

•Hands on experience working in LINUX, UNIX and Windows environments.

•Excellent Verbal and Written Communication Skills. Have proven to be highly effective in interfacing across business and technical groups.

•Experience working with QlikView, including data extraction from other tools to load data, and using QlikView to build queries and reports.

•Good knowledge of SAP BI tools for extraction and data modelling with OLAP and analytical data across SD modules, as well as report generation.

•Good knowledge of data quality measurement using Informatica Data Explorer (IDE).

•Extensive ETL experience using Informatica PowerCenter (Designer, Workflow Manager, Workflow Monitor, and Server Manager), Teradata, and Business Objects.

•Strong experience in Dimensional Modelling using Star and Snowflake Schema, Identifying Facts and Dimensions, Physical and logical data modelling using Erwin and ER-Studio.

•Experience in designing and Developing complex Mappings using Informatica PowerCenter with Transformations such as Lookup, Filter, Expression, Router, Joiner, Update Strategy, Aggregator, XML generator, XML parser, Stored Procedure, Sorter and Sequence Generator.

•Working experience using Informatica Workflow Manager to create sessions, batches, worklets, and re-usable tasks, schedule workflows, and monitor sessions.

•Experienced in Performance tuning of Informatica and tuning the SQL queries.

•Hands-on experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data mapping, build, unit testing, systems integration, and user acceptance testing.

•Experience in handling initial/full and incremental loads (see the watermark sketch after this list).

•Expertise in scheduling workflows using the Windows scheduler, UNIX, and scheduling tools like Control-M & Autosys.

•Designed, Installed, Configured core Informatica components such as Informatica Hub Console, Hub Store, Hub Server, Cleanse Match Server, Cleanse Adapter, IDD & Data Modelling.

•Experience in support and knowledge transfer to the production team.

•Worked with Business Managers, Analysts, Development, and end users to correlate Business Logic and Specifications for ETL Development.

•Experienced in Quality Assurance, Manual and Automated Testing Procedures with active involvement in Database/ Session/ Mapping Level Performance Tuning and Debugging.
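
For illustration, a minimal sketch of the full/incremental-load pattern referenced above, driven by a stored watermark. The paths, the folder/workflow/session names, and the $$LAST_EXTRACT_TS parameter are hypothetical, not taken from the resume:

    #!/bin/bash
    # Sketch: drive full vs. incremental extracts from a stored watermark.
    WM_FILE=/etl/ctl/orders.watermark
    PARM_FILE=/infa_shared/ParmFiles/wf_incr_orders.parm

    # First run (no watermark yet) falls back to a full load from epoch.
    LAST_TS=$(cat "$WM_FILE" 2>/dev/null || echo '1900-01-01 00:00:00')

    # Regenerate the Informatica parameter file consumed by the session.
    cat > "$PARM_FILE" <<EOF
    [DW.WF:wf_incr_orders.ST:s_m_incr_orders]
    \$\$LAST_EXTRACT_TS=$LAST_TS
    EOF

    # After a successful workflow run, advance the watermark to "now".
    date '+%Y-%m-%d %H:%M:%S' > "$WM_FILE"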

TECHNICAL SKILLS:

Operating Systems

Unix, Linux, and Windows.

Programming and Scripting

C, C++, Java, .Net, Perl Scripting, Shell Scripting, VBA, PL/SQL, T-SQL.

ETL Tools

Informatica PowerCenter 10.2.0/10.1.1/9.5/9.1/8.6/8.1/7.1, Informatica Power Exchange, Metadata Manager, Informatica Data Quality (IDQ), Informatica Data Explorer (IDE), SSIS, Salesforce, DataStage, etc.

Database Tools

SQL Server Management Studio (2008), Oracle SQL Developer (3.0), Toad 11.6 (Oracle), Snowflake, Teradata, AQT v10 (Advanced Query Tool, Oracle/Netezza), DB Artisan 9.0 (Sybase), AWS, SQL Browser (Oracle/Sybase), Visio, ERWIN

Scheduling Tools

Informatica Scheduler, CA Scheduler (Autosys), ESP, Maestro, Control-M.

MDM Packages

Informatica MDM Multi Domain Edition 10.0/9.7.1/9.5/9.1; Informatica Data Director (IDD) 10.0/9.7/9.5; Informatica Data Quality (IDQ) 10.0/9.6; DDM 9.6

Conversion/Transformation Tools

Informatica Data Transformation, Oxygen XML Editor (ver.9, ver.10)

Software Development Methodology

Agile, Waterfall

Domain Expertise

Insurance/Finance

RDBMS

SQL, SQL*Plus, SQL*Loader, Oracle 11g/10g/9i/8i, Teradata, IBM DB2 UDB 8.1/7.0, Sybase 12.5, Netezza v9, MS SQL Server 2000/2005/2008, MS Access 7.0/2000.

Professional Experience:

Client: Express Scripts Inc, Jersey City, NJ Dec 2019 – Present

Sr. ETL Informatica Developer

The Enterprise Data Strategy team within Global Data & Analytics (GD&A) was built to help CIGNA drive improvement in the accessibility and quality of our data assets and tools and to deliver data solutions faster. The core principles of this application are validating information, minimizing the time spent on user input, auto-populating information from different systems, validating key information, interfacing with the policy systems, report generation, and security and auditing capabilities.

Responsibilities:

Analyzed business requirements, framed the business logic for the ETL process, and maintained the ETL process using Informatica PowerCenter.

Experienced in translating high-level design specs into simple ETL coding and mapping standards.

Experienced in scheduling Sequence and parallel jobs using DataStage, UNIX scripts and scheduling tools.

Repartitioned job flows by determining the best available DataStage PX resource consumption.

Worked in an Agile methodology: participated in daily/weekly team meetings, guided two groups of seven developers in Informatica PowerCenter, peer-reviewed their development work, and provided technical solutions. Proposed ETL strategies based on requirements.

Coded Teradata BTEQ SQL scripts to load and transform data and to fix defects such as SCD Type 2 date chaining and duplicate cleanup (see the BTEQ sketch after this list).

Extensively worked on UNIX shell scripts for server health-check monitoring, such as repository backup, CPU/disk space utilization, Informatica server monitoring, and UNIX file system maintenance/clean-up, and on scripts using Informatica command-line utilities (a health-check sketch follows this list).

Extensively worked on CDC to capture data changes in sources for delta loads. Used the Debugger to validate mappings and gather troubleshooting information about data and error conditions.

Developed workflows with worklets, event waits, assignments, conditional flows, and email and command tasks using Workflow Manager.

Programmed using Java, PHP, Ruby, Python, and R on Linux; also worked with Informatica and Tableau.

Developed and configured various mappings and workflows for reading and writing data to JMS (Java Message Service) queues, using the Application Source Qualifier.

Designed, Installed, Configured core Informatica/Siperian MDM Hub components such as Informatica MDM Hub Console, Hub Store, Hub Server, Cleanse Match Server, Cleanse Adapter, Data Modeling

Proficient in System Study, Data Migration, Data integration, Data profiling, Data Cleansing / Data Scrubbing and Data quality

Extensively worked with Korn shell scripts for parsing and moving files and for re-creating parameter files in post-session command tasks.

Involved in third-party data extraction with SAP BI tools, working with transaction data loads to build data models and extractors.

Involved in QlikView report development for client users, including dashboards and storyboards, and generated mobile-view reports for customers.

Working with database connections, SQL joins, cardinalities, loops, aliases, views, aggregate conditions, parsing of objects and hierarchies.

Experience in creation of ETL Mappings and Transformations using Informatica PowerCenter to move data from multiple sources into target area using complex transformations like Expressions, Routers, Lookups, Source Qualifiers, XML generator, XML Parser, Aggregators, Filters, Joiners

Responsible for preparing logical and physical data models and documenting them.

Performed ETL code reviews and Migration of ETL Objects across repositories.

Worked closely with ETL Architect and QC team for finalizing ETL Specification document and test scenarios.

In-depth understanding of Snowflake cloud technology.

Played a key role in migrating Teradata objects into the Snowflake environment.

Experience with Snowflake Virtual Warehouses

Extracted data from an Oracle database, spreadsheets, and CSV files, staged it in a single place, and applied business logic to load it into the central Oracle database.

Designed and developed Informatica Mappings and Sessions based on user requirements and business rules to load data from source flat files and RDBMS tables to target tables.

Wrote UNIX shell scripts to load data from flat files into a Netezza database (see the nzload sketch after this list).

Handled User Acceptance Testing and System Integration Testing, in addition to unit testing, using Quality Center as the bug-logging tool. Created and documented a Unit Test Plan (UTP) for the code.

Developed ETLs for masking data made available to the offshore development team.

Developed UNIX scripts for dynamic generation of parameter files and for FTP/SFTP transmission (a sketch follows this list).

Monitored day-to-day loads, addressed and resolved production issues promptly, and provided support for ETL jobs running in production to meet SLAs.

Migrated code from Dev to Test to Pre-Prod. Created effective unit and integration tests of data at different layers to capture data discrepancies/inaccuracies and ensure successful, accurate data loading.

Scheduled Informatica workflows alongside OBIEE and Big Data jobs.

Involved in implementing change data capture (CDC) and Type I, II, and III slowly changing dimensions.

Developed functions and stored procedures to aid complex mappings.
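
As referenced in the BTEQ item above, a hedged sketch of a script that repairs SCD Type 2 date chaining by closing each history row at the day before the next row's start. The table and column names are hypothetical, and the logon placeholders must be replaced; MIN() over a one-row-following window stands in for LEAD(), which older Teradata releases lack:

    #!/bin/bash
    # Sketch only: re-chains SCD2 effective dates in a hypothetical dim_customer.
    bteq <<'EOF'
    .LOGON tdprod/etl_user,password;

    UPDATE tgt
    FROM edw.dim_customer tgt,
         ( SELECT customer_id, eff_start_dt,
                  MIN(eff_start_dt) OVER (PARTITION BY customer_id
                                          ORDER BY eff_start_dt
                                          ROWS BETWEEN 1 FOLLOWING
                                                   AND 1 FOLLOWING) AS next_start_dt
           FROM edw.dim_customer ) nxt
    SET eff_end_dt = nxt.next_start_dt - 1
    WHERE tgt.customer_id  = nxt.customer_id
      AND tgt.eff_start_dt = nxt.eff_start_dt
      AND nxt.next_start_dt IS NOT NULL
      AND tgt.eff_end_dt <> nxt.next_start_dt - 1;

    .LOGOFF;
    .QUIT;
    EOF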
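
Also referenced above, a minimal health-check sketch. The mount point, domain name, Integration Service name, and alert address are assumptions; pmcmd pingservice is the standard PowerCenter command-line availability check:

    #!/bin/bash
    # Sketch: disk-space and Integration Service health check.
    THRESHOLD=85
    used=$(df -kP /infa_shared | awk 'NR==2 { gsub(/%/, "", $5); print $5 }')

    if [ "$used" -gt "$THRESHOLD" ]; then
      echo "Disk usage at ${used}% (threshold ${THRESHOLD}%)" \
        | mailx -s "ETL server disk alert" etl-ops@example.com
    fi

    # Ping the PowerCenter Integration Service; non-zero exit means unavailable.
    if ! pmcmd pingservice -sv IS_PROD -d Domain_PROD; then
      echo "Integration Service IS_PROD is not responding" \
        | mailx -s "ETL service alert" etl-ops@example.com
    fi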
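
A sketch of the flat-file-to-Netezza load mentioned above, using the nzload utility; the host, database, table, and file paths are placeholders:

    #!/bin/bash
    # Sketch: bulk-load a pipe-delimited flat file into Netezza with nzload.
    nzload -host nzprod -db EDW -u etl_user -pw "$NZ_PWD" \
           -t STG_CUSTOMER \
           -df /data/in/customer.dat \
           -delim '|' \
           -skipRows 1 \
           -maxErrors 50 \
           -lf /data/log/customer.nzlog \
           -bf /data/log/customer.nzbad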
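
And a sketch of the dynamic parameter-file generation and FTP/SFTP items above. The folder/workflow/session names, remote host, and paths are hypothetical, and date -d assumes GNU date:

    #!/bin/bash
    # Sketch: regenerate a session parameter file for yesterday's business date,
    # then push the matching extract over SFTP.
    RUN_DT=$(date -d 'yesterday' +%Y-%m-%d)    # GNU date assumed
    PARM_FILE=/infa_shared/ParmFiles/wf_daily_load.parm

    cat > "$PARM_FILE" <<EOF
    [DW.WF:wf_daily_load.ST:s_m_stage_orders]
    \$\$LOAD_DT=$RUN_DT
    EOF

    # Batch-mode SFTP reading commands from stdin (-b -); key-based auth assumed.
    sftp -b - etl@filehost.example.com <<EOF
    put /data/out/orders_$RUN_DT.csv /inbound/
    bye
    EOF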

Environment: Informatica PowerCenter 10.x/9.6, Java, Oracle 11g, Snowflake, Teradata, PL/SQL, AWS, SQL Developer, TOAD, PuTTY, Big Data, QlikView, UNIX

Client: CITI Group, Irving, TX Oct 2019 – Dec 2019

Sr. Informatica PowerCenter Developer

Mexico Business Driver captures the business requirements for integrating business-driver-based balance sheet and income forecasting into Financial Resource Management (FRM), a financial forecasting ecosystem that allows the business to consume, analyze, and report the Operational Plan and Outlook.

Responsibilities:

•Assisted Business Analyst with drafting the requirements, implementing design and development of various components of ETL for various applications.

•Migrated code between environments and maintained code backups.

•Integration of various data sources like Oracle, SQL Server, Fixed Width & Delimited Flat Files, DB2.

•Involved in the Unit Testing and Integration testing of the workflows developed.

•Imported Source/Target Tables from the respective databases and created reusable transformations like Joiner, Routers, Lookups, Filter, Expression and Aggregator and created new mappings using Designer module of Informatica.

•Designed table structure in Netezza.

•Involved in Database migrations from legacy systems, SQL server to Oracle and Netezza.

•Used Address Doctor to validate addresses and performed exception handling, reporting, and system monitoring. Created different rules as mapplets, Logical Data Objects (LDOs), and workflows, and deployed the workflows as an application to run them. Tuned the mappings for better performance.

•In-depth knowledge of Data Sharing in Snowflake.

•In-depth knowledge of Snowflake database, schema, and table structures.

•Experience in using Snowflake Clone and Time Travel (see the sketch after this list).

•Profiled data on Hadoop to understand the data and identify data quality issues

•Imported and exported data between relational databases and the Hadoop Distributed File System using Sqoop (a sample command follows this list).

•Developed shell scripts for running batch jobs and scheduling them.

•Involved in Production Support
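
As noted in the Snowflake items above, a hedged sketch of zero-copy cloning combined with Time Travel via the SnowSQL CLI; the connection name and table names are placeholders:

    #!/bin/bash
    # Sketch: restore a table's state from one hour ago using a zero-copy clone.
    snowsql -c my_conn -q "
      CREATE TABLE analytics.public.orders_restored
        CLONE analytics.public.orders
        AT (OFFSET => -3600);   -- state as of 3600 seconds ago
    "

    # Time Travel can also be queried directly, without cloning:
    snowsql -c my_conn -q "
      SELECT COUNT(*) FROM analytics.public.orders AT (OFFSET => -3600);
    "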
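
And a sample of the Sqoop import mentioned above; the JDBC URL, credentials file, table, and check column are assumptions:

    #!/bin/bash
    # Sketch: incremental Sqoop import from Oracle into HDFS.
    sqoop import \
      --connect jdbc:oracle:thin:@//dbhost.example.com:1521/ORCL \
      --username etl_user \
      --password-file /user/etl/.db_pwd \
      --table EDW.ORDERS \
      --target-dir /data/raw/orders \
      --incremental append \
      --check-column ORDER_ID \
      --last-value 1000000 \
      --num-mappers 4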

Environment: Informatica PowerCenter 9.6, Oracle 11g, SQL Server, PL/SQL, UNIX, AWS, WinSCP, Netezza, Snowflake, Informatica Big Data Edition 9.6.1, Hadoop, HDFS, Hive, Sqoop.

Client: Coca-Cola, Hyderabad (Capgemini India) Jun 2017 – Jun 2019

Informatica ETL / MDM Developer

Responsibilities:

•Developed complex mappings by efficiently using various transformations, mapplets, mapping parameters/variables, and mapplet parameters in Designer. The mappings involved extensive use of Aggregator, Filter, Router, Expression, Joiner, Union, Normalizer, Sequence Generator, SQL, and Web Service transformations.

•Demonstrated the ETL process Design with Business Analyst and Data Warehousing Architect.

•Assisted in building the ETL source to Target specification documents

•Effectively communicate with Business Users and Stakeholders.

•Participated in design, development, testing, automation, and support of MDM domains

•Worked on SQL coding to override generated SQL queries in Informatica.

•Involved in unit testing for the validity of the data from different data sources.

•Designed and developed PL/SQL packages, stored procedures, tables, views, indexes, and functions; experienced with partitioned tables and automating partition drop and create in an Oracle database (see the sketch after this list).

•Performed data validation in the target tables using complex SQL to make sure all the modules were integrated correctly.

•Performed data conversion/data migration using Informatica PowerCenter.

•Involved in performance tuning for a better data migration process.

•Analyzed session log files to resolve errors in mappings, identified bottlenecks, and tuned them for optimal performance.

•Involved with Data Steward Team for designing, documenting and configuring Informatica Data Director for supporting management of MDM data.

•Deployed new MDM Hub for portals in conjunction with user interface on IDD application.

•Configured match rule set property by enabling search by rules in MDM according to Business Rules.

•Defined the Trust and Validation rules and setting up the match/merge rule sets to get the right master records.

•Created UNIX shell scripts for Informatica pre/post-session operations.

•Automated the jobs using CA7 Scheduler.

•Documented and presented the production/support documents for the developed components when handing over the application to the production support team.

•Created the data model for the data marts.

•Used materialized views to create history snapshots of main tables for reporting purposes (see the sketch after this list).

•Coordinated with users on migrating code from Informatica 8.6 to Informatica 9.5.

•Worked with the Informatica tech support group on unresolved problems.

•Provided on-call support during weekends.

•Monitored day-to-day loads, addressed and resolved production issues promptly, and provided support for ETL jobs running in production to meet SLAs.

•Used various transformations like Source Qualifier, Expression, Aggregator, Joiner, Filter, Lookup, and Update Strategy while designing and optimizing mappings.

•Prepared SQL Queries to validate the data in both source and target databases.
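
For the partition-maintenance item above, a minimal sketch of the drop/create automation against a hypothetical range-partitioned fact table, run via SQL*Plus; the credentials, table, and partition names are placeholders:

    #!/bin/bash
    # Sketch: roll a monthly range-partitioned fact table forward by dropping
    # the oldest partition and pre-creating the next one. $DB_PWD must be set.
    sqlplus -s etl_user/"$DB_PWD"@ORCL <<'EOF'
    ALTER TABLE sales_fact DROP PARTITION sales_2014_01 UPDATE GLOBAL INDEXES;

    ALTER TABLE sales_fact ADD PARTITION sales_2015_07
      VALUES LESS THAN (TO_DATE('2015-08-01', 'YYYY-MM-DD'));

    EXIT;
    EOF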
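
And a sketch of the materialized-view history snapshot mentioned above; the table and columns are hypothetical:

    #!/bin/bash
    # Sketch: a complete-refresh materialized view that snapshots the main
    # table for reporting; refreshed on demand by the batch.
    sqlplus -s etl_user/"$DB_PWD"@ORCL <<'EOF'
    CREATE MATERIALIZED VIEW mv_policy_snapshot
      BUILD IMMEDIATE
      REFRESH COMPLETE ON DEMAND
    AS
    SELECT policy_id,
           status,
           premium_amt,
           TRUNC(SYSDATE) AS snapshot_dt
    FROM   policy;

    -- The nightly batch would then refresh it with:
    EXEC DBMS_MVIEW.REFRESH('MV_POLICY_SNAPSHOT', 'C');

    EXIT;
    EOF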

Environment: Informatica 9.5/8.6, Oracle 11g, Informatica MDM, SQL Server 2008 R2, SQL, T-SQL, PL/SQL, Toad 10.6, SQL*Loader, OBIEE, UNIX, flat files, Teradata

Client: Visteon, Chennai (Capgemini India) May 2015 – Jun 2017

Informatica ETL Developer

Responsibilities:

•Interacted with business users to understand the business requirements and prepared technical documents for implementing the solutions per business needs.

•Responsible for developing, supporting, and maintaining the ETL (Extract, Transform, and Load) processes.

•Involved in logical and physical data modelling and analysed and designed the ETL processes.

•Identified all the conformed dimensions to be included in the target warehouse design and confirmed the granularity of the facts in the fact tables.

•Extensively used Informatica client tools to extract data from different sources such as flat files and Oracle.

•Developed several complex mappings, mapplets, and reusable transformations to facilitate one-time, daily, monthly, and yearly loading of data.

•Experienced in creating and maintaining the entity objects, hierarchies, entity types, relationship objects and relationship types using Hierarchy tool to enable Hierarchy Manager (HM) in MDM HUB implementation and Informatica Data Director (IDD).

•Worked on data cleansing using the cleanse functions in Informatica MDM.

•Participated in the development and implementation of the MDM decommissioning project using Informatica PowerCenter that reduced the cost and time of implementation and development.

•Publishing Data using Message Queues to notify external applications on data change in MDM Hub Base Objects.

•Involved in SIF integration to develop xml code for external applications to perform search match API calls against MDM Hub Data.

•Good exposure in Informatica MDM where data Cleansing, De-duping and Address correction were performed.

•Performed match/merge and ran match rules to check the effectiveness of MDM process on data.

•Deployed new MDM Hub for portals in conjunction with user interface on IDD application

•Created IDQ mappings with Key Generator, Labeler, Standardizer, Match, Consolidation, Sorter, and Address Validator transformations.

•Used Teradata utilities FastLoad, MultiLoad, and TPump to load data (see the FastLoad sketch after this list).

•Configured the mappings to handle the updates to preserve the existing records using Update Strategy Transformation.

•Used Informatica Workflow Manager for Creating, running the Batches and Sessions, and scheduling them to run at specified time.

•Created users, tables, triggers, stored procedures, joins and hash indexes in Teradata database.

•Performed performance tuning on Teradata Tables.

•Used pushdown optimization techniques with Teradata databases.

•Involved in Debugging, Troubleshooting, Testing, and Documentation of Data Warehouse.

•Creating solution documents for developed mappings.

•Involved in solving day-to-day problems, giving support to the users.

•Communicated issues and progress to project manager.
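
As referenced in the Teradata utilities item above, a hedged FastLoad sketch for loading a pipe-delimited file into an empty staging table; the tdpid, credentials, table, and file path are placeholders:

    #!/bin/bash
    # Sketch: Teradata FastLoad of a pipe-delimited flat file. Note that
    # FastLoad requires the target table to be empty.
    fastload <<'EOF'
    LOGON tdprod/etl_user,password;
    DATABASE edw_stg;

    BEGIN LOADING stg_customer
      ERRORFILES stg_customer_e1, stg_customer_e2;

    SET RECORD VARTEXT "|";

    DEFINE customer_id  (VARCHAR(18)),
           customer_nm  (VARCHAR(100)),
           cust_open_dt (VARCHAR(10))
      FILE=/data/in/customer.dat;

    INSERT INTO stg_customer (customer_id, customer_nm, cust_open_dt)
    VALUES (:customer_id, :customer_nm, :cust_open_dt);

    END LOADING;
    LOGOFF;
    EOF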

Environment: Informatica PowerCenter 9.5.1, Informatica MDM, IDQ 9.5.1, Oracle 11g/10g, Teradata, SQL, PL/SQL, TOAD 9.6, Windows XP

Indi Tech, India Sept 2012 – Apr 2015

Informatica Developer

Responsibilities:

•Logical and Physical data modelling was done using Erwin for data warehouse database in STAR SCHEMA.

•Worked in an Agile methodology: participated in daily/weekly team meetings, guided two groups of seven developers in Informatica PowerCenter, peer-reviewed their development work, and provided technical solutions. Proposed ETL strategies based on requirements.

•Worked on Informatica Source Analyzer, Mapping Designer and Transformations.

•Proficient in SQL queries.

•Have a good grasp of ETL and data warehouse concepts.

•Coordinated with team members working on PowerCenter 9.1.1 and 9.5.1 (ETL tool) to extract data from the TRP source.

•Created Extraction, Transformation and Loading scripts to populate data-warehouse from operational database.

•Designed and developed aggregate, join, look up transformation rules (business rules) to generate consolidated (fact/summary) data identified by dimensions using Informatica ETL (Power Center) tool.

•Created Mappings and validated them to run those mappings in the Workflow Manager.

•Created sessions and validated them using Informatica Workflow Manager.

•Scheduled and monitored ETL jobs using Autosys and workflow manager.

•Involved in fixing the QA issues.

•Involved in supporting SIT environment code migration activity

Environment: Informatica 8.6, Oracle 10g

EDUCATION:

Bachelor of Engineering (B.E.) SRM University, India

Major: Computer Science and Engineering


