
Power Center Data Modeling

Location:
Sioux Falls, SD
Posted:
November 14, 2023


Harika Devi

605-***-****

ad05dz@r.postjobfree.com

PROFESSIONAL SUMMARY

●ETL Developer with 9 years’ experience in IT, working extensively with the ETL tool Informatica PowerCenter (10.4, 10.2, 9.6, and 8.x) and Oracle. Extensive experience with Oracle, SQL, and reporting tools such as Tableau and OBIEE 10g/11g.

●Extensively worked on developing ETL processes supporting data extraction, transformation, and loading using Informatica PowerCenter 10.2/9.6 (Workflow Manager, Workflow Monitor, Source Analyzer, Target Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Repository Manager).

●Experience with Agile methodologies (sprint planning, Scrum) as well as the waterfall model.

●Experience in developing complex mappings, reusable transformations, sessions, and workflows using Informatica to extract data from heterogeneous sources such as flat files and Oracle, and load it into a common target such as a data warehouse.

●Worked with scheduling tools such as Robot Scheduler and DAC.

●Experience in validating ETL processes built with tools such as Informatica PowerCenter.

●Worked on dimensional modeling (star and snowflake schemas, Kimball methodology).

●Worked on data modeling, data warehousing, and data migration; significant expertise in designing conceptual, logical, and physical data models using relational data-modeling tools.

●Proven track record in troubleshooting Informatica jobs and addressing production issues, including performance tuning, enhancements, and resolution of priority tickets.

●Experience in database management, data mining, software development fundamentals, strategic planning, operating systems, requirements analysis, data warehousing, data modeling, and data marts.

●Area of expertise encompasses database design and the ETL phases of data warehousing, including execution of test plans for loading data successfully.

●Strong database skills with Oracle (performance tuning, complex SQL).

●Monitored production jobs and fixed any job that failed in production.

●Worked with reporting tools Tableau and OBIEE; created various reports based on business requirements.

●Experience in designing the repository layers: physical layer, business model layer, and presentation layer.

●Created aliases, hierarchies, and calculations, and built various reports such as funnel, Pareto, waterfall, and bar-and-line charts based on requirements using Tableau and OBIEE.

●Created dashboards and stories using Tableau and OBIEE.

●Ability to meet deadlines and handle pressure while coordinating multiple tasks in a work/project environment. Strong communication, organizational, and interpersonal competencies, along with detail orientation and problem-solving skills in the technology arena.

●Ability to lead and mentor teams across different time zones, assigning tasks and processes to carry out their duties.

●Assigned work to the team and distributed tasks based on bandwidth.

●Self-motivated and committed, with excellent interpersonal communication skills and a willingness to learn new tools, concepts, and environments.

PROFESSIONAL EXPERIENCE

Anthem Corporation-VA July 2022-Present

Sr. Informatica Developer/Data Analyst

The PA Office of Long-Term Living (OLTL) requested an initiative to support Dual Eligible Special Needs Plans (DSNP) analysis, identifying areas where policies or procedures can improve member medical health through better care coordination between Medicaid and Medicare. The solution receives DSNP encounter files from MCOs, loads them into the EDW, and builds a Cognos ad hoc reporting package.

Responsibilities:

●Extensively used Informatica Client Tools – Source Analyzer, Warehouse Designer, Transformation Developer, Mapping Designer, Mapplet Designer, Informatica Repository.

●Extensively worked on SQL, advanced SQL, and unit testing.

●Worked on Oracle partitioning techniques and Oracle performance tuning.

●Used various transformations like Filter, Router, Sequence Generator, Expression, Joiner, and Update Strategy.

●Developed Mappings between source systems and Warehouse components.

●Developed Mapplets, Mappings and configured sessions.

●Extensively used almost all the transformations.

●Created reusable transformations and Mapplets to use in multiple Mappings.

●Used debugger to test the data flow and fix the mappings.

●Analyzed newly converted data to establish a baseline measurement for data quality in Data Warehouse.

●Performed Data Manipulation using basic functions and Informatica Transformation.

●Developed Mappings/Transformation/Mapplets by using Mapping Designer, Transformation Developer and Mapplet Designer in Informatica Power Center.

●Created and monitored Sessions/Batches using Informatica Server Manager / Workflow Monitor to load data from Oracle, SQL Server, flat files into target Oracle database.

●Responsible for production support of the ongoing jobs that run every day.

●Used Parameter files to initialize workflow variables, mapping parameters and variables.

●Used Type 1 and Type 2 SCD mappings to update Slowly Changing Dimension tables.

●Modified existing mappings for enhancements of new business requirements.

●Used Debugger to test the mappings and fixed the bugs.

●Worked on Jira tasks and stories assigned to me, focused on finance operations.

●Deployed to higher environments using CTU and ServiceNow after development was completed in the lower environment.

●Experienced with Bitbucket and Git Bash for uploading scripts (version control).

Environment: Informatica PowerCenter 10.x, Oracle 10g, Flat files, SQL Developer, Facets, Git, Bitbucket, CTU, SVN
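The parameter files referenced above follow PowerCenter's sectioned text format: a header scopes values to a folder, workflow, and session, `$$` prefixes mapping parameters and variables, and a single `$` prefixes service variables such as connection overrides. A minimal illustration (the folder, workflow, session, and parameter names below are hypothetical, not taken from this project):

```text
[Global]
$$ENV=DEV

[FOLDER_FIN.WF:wf_daily_load.ST:s_m_load_claims]
$$RUN_DATE=2023-11-14
$$SRC_DIR=/data/inbound
$DBConnection_Src=ORA_SRC_DEV
```

Values in [Global] apply everywhere; the session-scoped section overrides them for that one session, which is what lets a single workflow be rerun against different dates or source directories without changing the mapping.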

Chubb - Jersey City, NJ Dec 2019 – Dec 2021

Sr. Informatica Developer/Data Analyst

The goal of the engagement is to develop an actionable roadmap for Personal & Business Insurance (P&BI) Business Insights to leverage its data and gain more insight into the Small & Medium Enterprise (SME), Residential, and Auto lines of business (LOB). The project enables executives, line managers, and notably data analysts to evaluate the SME, Residential, and Auto portfolios and to monitor business trends across AOG countries and product lines in a timely fashion, with a strong emphasis on data quality.

Responsibilities:

●Gathered Requirements from the client.

●Involved with architect in designing the project.

●Prepared technical design document.

●Prepared the source to target mapping sheet.

●Provided technical guidance to the team.

●Worked on data profiling for null and duplicate issues.

●Tuned Informatica Mappings.

●Extensively used Informatica Client Tools – Source Analyzer, Warehouse Designer, Transformation Developer, Mapping Designer, Mapplet Designer, Informatica Repository.

●Used various transformations like Filter, Router, Sequence Generator, Expression, Joiner, and Update Strategy.

●Developed Mappings between source systems and Warehouse components.

●Developed Mapplets, Mappings and configured sessions.

●Extensively used almost all the transformations.

●Created reusable transformations and Mapplets to use in multiple Mappings.

●Used debugger to test the data flow and fix the mappings.

●Analyzed newly converted data to establish a baseline measurement for data quality in Data Warehouse.

●Performed Data Manipulation using basic functions and Informatica Transformation.

●Developed Mappings/Transformation/Mapplets by using Mapping Designer, Transformation Developer and Mapplet Designer in Informatica Power Center.

●Created mappings in IICS, loaded data into Azure SQL Database and Netezza, and verified performance.

●Migrated existing on-premise workflows to the cloud using IICS.

●Worked on performance tuning the jobs.

Environment: Informatica PowerCenter 9.1/9.4/10, IICS, CDI, CAI, Oracle 10g, Flat files, SQL*Plus, UNIX, Robot job scheduler, Netezza, Azure SQL DB

Riverbed- Bangalore, INDIA Jun 2018-Sept 2019

Sr. Informatica Developer

This project focuses on service contracts made by clients. A data mart was created focusing on the end-to-end relationships among assets, contracts, customers, and bookings, in support of the company's goal of helping people make life's most important decisions. Oracle and flat files were used as sources to build the data mart.

Responsibilities:

●Involved in building the ETL architecture and Source to Target mapping to load data into Data warehouse.

●Created mapping documents to outline data flow from sources to targets.

●Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin to design the business process, dimensions and measured facts.

●Extracted the data from the flat files and other RDBMS databases into staging area and populated onto Data warehouse.

●Maintained stored definitions, transformation rules and targets definitions using Informatica repository Manager.

●Developed mapping parameters and variables to support SQL override.

●Sourced transaction data being staged by Golden Gate tool CDC (Change Data Capture) through Informatica Power center and loaded into the target.

●Created mapplets to use them in different mappings.

●Created various DOS shell scripts for scheduling data-cleansing scripts and automated execution of workflows using DAC.

●Developed mappings to load into staging tables and then to Dimensions and Facts.

●Used existing ETL standards to develop these mappings.

●Worked on different tasks in workflows, such as sessions, event raise, event wait, decision, e-mail, command, worklets, assignment, timer, and workflow scheduling.

●Created sessions and configured workflows to extract data from various sources, transform it, and load it into the data warehouse.

Environment: Informatica PowerCenter 10.2, Oracle 10g, Flat files, SQL*Plus, DAC

Ally Financial, Detroit, Michigan Nov 2016 – May 2018

Informatica Developer

This project involved the development of a data warehouse for BOM based on four data marts: Accounts, Loans, Credit Cards, and Insurance. Each data mart represents a collection of data pertaining to a single business process. The loan data mart covers disbursal of loan amounts for various purposes such as personal, educational, vehicle, and housing loans. The company required different levels of analysis regarding loan amount, type of loan, type of customer, type of payment schedule, interest rates (variable or fixed), defaulter lists, penal interest calculations, and so on. It needed a data warehouse to maintain historical data in a central location for integration and to analyze the business across locations and profit areas, serving as a decision-support system (DSS) for decision makers.

Responsibilities:

●Designed the Mapping Technical Specifications on the basis of Functional Requirements.

●Mainly involved in ETL development.

●Created mapping documents to map source system data to the data model.

●Designed a mapping to process the incremental changes that exist in the source table.

●Used various transformations like Unconnected/Connected Lookup, Aggregator, Joiner, and Stored Procedure.

●Developed mapping parameters and variables to support SQL override.

●Responsibilities included designing and developing complex mappings using Informatica power center 8.6.1 and 9.1

●Created pre-session and post-session shell scripts and email notifications.

●Worked with UNIX shell scripts extensively for job execution and automation.

●Performed incremental aggregation to load incremental data into Aggregate tables.

●Eliminated unwanted data in worksheets using filters.

●Designed and deployed reports with drill-up and drop-down menu options, as well as parameterized and linked reports, using OBIEE.

●Worked on the technical and functional documents required for the end user.

●Used Informatica PowerCenter for extraction, transformation, and loading (ETL) of data from heterogeneous source systems.

●Tuned the session parameters to increase the efficiency of the sessions in Workflow Manager.

●Fine-tuned Informatica jobs by optimizing all transformations.

Environment: Informatica PowerCenter 9.x, Oracle, Flat files, Oracle Developer, UNIX, DAC
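The pre/post-session shell scripts and email notifications mentioned above typically wrap a status check around the session run. A minimal post-session sketch; the paths, the address, and the simulated status file are hypothetical placeholders, and the status file is seeded here only so the sketch runs standalone:

```shell
#!/bin/sh
# Post-session sketch: inspect a status file, build a notification
# message, and set a return code the scheduler can react to.

STATUS_FILE=/tmp/s_m_load_loans.status
NOTIFY=etl-support@example.com   # hypothetical distribution list

# A real session (or pre-step) would produce this file; simulate it
# here so the sketch is self-contained.
echo "SUCCEEDED rows=12500" > "$STATUS_FILE"

if grep -q '^SUCCEEDED' "$STATUS_FILE"; then
    MSG="Session succeeded: $(cat "$STATUS_FILE")"
    RC=0
else
    MSG="Session FAILED - check the workflow log"
    RC=1
fi

# A real script would send mail, e.g.:
#   echo "$MSG" | mailx -s "ETL post-session status" "$NOTIFY"
# and then propagate the return code with: exit $RC
echo "$MSG"
```

The nonzero return code is what lets the workflow's command task (or the scheduler) mark the run failed and trigger the email notification path.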

IGATE Global Solutions - Chennai, INDIA April 2014 – Aug 2016

Informatica Developer

The Mosaic Company Enterprise Data Warehouse (EDW) Program intends to provide a reliable, cost effective and flexible Data Warehouse and Business Intelligence platform to serve data analytics and reporting needs across multiple business divisions and functions.

Mosaic pre-built asset analytics solution – Asset SMART that addresses key challenges faced by Mosaic in maintenance and repair area and set the stage for being able to scale beyond the scope of the current Maintenance Analytics Reporting program.

Responsibilities:

●Created Informatica mappings to populate data into fact tables and dimension tables.

●Developed complex mappings using various transformations.

●Used various transformations like Unconnected/Connected Lookup, Aggregator, Joiner, and Stored Procedure.

●Developed mapping parameters and variables to support SQL override.

●Responsibilities included designing and developing complex mappings using Informatica power center 8.6.1 & 9.1.

●Created pre-session and post-session shell scripts and email notifications.

●Worked with UNIX shell scripts extensively for job execution and automation.

●Performed incremental aggregation to load incremental data into Aggregate tables.

●Used PMCMD commands of Informatica in UNIX to schedule sessions and jobs.

●Created scripts for performing database-level query joins and functions.

●Used Informatica PowerCenter for extraction, transformation, and loading (ETL) of data from heterogeneous source systems.

●Tuned the session parameters to increase the efficiency of the sessions in Workflow Manager.

●Fine-tuned Informatica jobs by optimizing all transformations.

Environment: Informatica PowerCenter 9.X, Oracle, Flat files, Oracle Developer, UNIX
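Scheduling sessions with pmcmd from UNIX, as described above, is usually done through a small wrapper script: `-wait` makes pmcmd block until the workflow finishes so its exit code (0 on success) can drive the scheduler's failure handling. A sketch in which the service, domain, folder, and workflow names are hypothetical placeholders:

```shell
#!/bin/sh
# Launch a PowerCenter workflow from UNIX via pmcmd and surface its
# exit code to the calling scheduler. All names are illustrative.

SVC=INT_SVC            # integration service name
DOMAIN=PM_DOMAIN       # Informatica domain name
FOLDER=FOLDER_EDW      # repository folder
WF=wf_load_assets      # workflow to run

if command -v pmcmd >/dev/null 2>&1; then
    # -uv/-pv read the user name and password from environment
    # variables instead of putting credentials on the command line;
    # -wait blocks until the workflow completes.
    pmcmd startworkflow -sv "$SVC" -d "$DOMAIN" -uv PM_USER -pv PM_PASS \
        -f "$FOLDER" -wait "$WF"
    echo "pmcmd exit code: $?"
else
    # pmcmd is not installed on this host; show the call that would run.
    echo "would run: pmcmd startworkflow -sv $SVC -d $DOMAIN -f $FOLDER -wait $WF"
fi
```

A scheduler entry (cron, Robot, or similar) then only needs to invoke the wrapper and inspect its exit status.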

EDUCATION

●Bachelor’s Degree in Electronics and Communication Engineering from JNTU, Hyderabad

SKILLS:

ETL: Informatica PowerCenter 10.4/9.6/9.5/9.1/8.0, IICS

Databases: Oracle 11g/10g, Netezza, SQL Server, Azure Synapse DWH

Data modeling: Star schema, Snowflake schema.

Environment: Windows, UNIX

Scheduling Tools: DAC and Robot Job Scheduler

Reporting Tools: Tableau and OBIEE
