
Data Manager

Location:
United States
Posted:
October 10, 2019


Anusha

Email: **********@*****.***

Contact: 214-***-****

Informatica Developer

PROFESSIONAL SUMMARY:

Over 6 years of IT experience as a technical consultant for Data Warehouse ETL, experienced in requirements analysis, data analysis, application design, modeling, development, testing, and support across the life cycle of Data Warehouse applications using Informatica 9.x, 8.x, and 7.x.

Experience in providing Business Intelligence solutions in Data Warehousing using Informatica ETL tool (Extraction, Transformation and Loading).

Good Understanding of Ralph Kimball and Bill Inmon approaches.

Excellent knowledge of OLTP and OLAP System Study, Analysis and E-R modeling.

Designed and developed logical and physical models based on Star and Snowflake schemas, including identification of fact and dimension tables.

Expertise in Master Data Management (MDM) concepts and methodologies, with the ability to apply this knowledge in building MDM solutions.

Played a significant role in various phases of the project life cycle, including requirements analysis, functional and technical design, testing, production support, implementation, and scheduling.

Experienced in using Informatica Data Quality (IDQ) tools for data analysis and data profiling through the IDQ Developer client, applying rules and developing mappings to move data from source to target systems.

Extensively worked on data warehousing and decision support systems with relational databases such as Oracle, including design and database development using SQL, PL/SQL, SQL*Plus, and TOAD.

Highly proficient in writing, testing, and implementing triggers, stored procedures, and functions using PL/SQL.

Experience in installing, configuring and updating Informatica server and Client.

Experience in installation and configuration of core Informatica MDM Hub components such as Hub Console, Hub Store, Hub Server, Cleanse Match Server, and Cleanse Adapter in Windows.

Worked with the Informatica Data Quality (IDQ) 9.6.1 toolkit for analysis, data cleansing, data matching, data conversion, exception handling, and reporting and monitoring.

Prominent experience in data integration and in extracting, transforming, and loading data from multiple heterogeneous source systems such as Oracle, Sybase, SQL Server, DB2, Teradata, XML, and flat files into the data warehouse using Informatica.

Expertise in working with Informatica Power Center client tools (Designer, Repository Manager, Workflow Manager, Workflow Monitor) and server tools (Informatica Server, Repository Server Manager).

Worked extensively on various data transformations such as Source Qualifier, Joiner, Filter, Router, Expression, Lookup, Aggregator, Sorter, Normalizer, Update Strategy, Sequence Generator and Stored Procedure transformations in Informatica Power Center Designer.

Expertise working on Slowly Changing Dimensions Type 1 and Type 2.

Experience in debugging mappings. Identified bugs in existing mappings by analyzing the data flow and evaluating transformations using Informatica debugger.

Well versed with HIPAA, Facets, claim adjustments, claim processing from point of entry to finalization, claim review, identifying claims processing problems and their sources, and providing corresponding solutions.

Implemented various performance tuning techniques at the source, target, mapping, and session levels.

Experience in designing and implementing partitioning to improve performance while loading large data volumes.

Hands on experience in UNIX shell scripting.

Strong exposure to working on scheduling tools like TWS, AutoSys and Control-M.

Ensured that user requirements were effectively and accurately communicated to other members of the development team, and facilitated communication between business users, developers, and testing teams.

Excellent problem-solving and trouble-shooting capabilities. Quick learner, highly motivated, result oriented and an enthusiastic team player.

Good interpersonal skills, experience in handling communication and interactions between different teams.

TECHNICAL SKILLS:

Operating System

Windows NT / XP / Vista, Windows Server 2000 / 2003 / 2008, UNIX, Linux.

Specialist Applications & Software

Informatica Power Exchange, Metadata Manager, Informatica Power Center 10.2/10.1/9.6.1/9.5/9.1/8.6/8.1/7.1, Informatica Data Quality (IDQ), Informatica Data Explorer (IDE), MDM, SSIS, Salesforce, DataStage, etc.

Databases

Oracle 11g/10g/9i/8i, SQL Server 2008/2005, DB2 UDB, Sybase, Teradata, MS Access.

Database Tools

SQL*Plus, SQL*Loader, TOAD, DB2 Import, DB2 Export.

Programming Languages

C, C++, Java, SQL, PL/SQL, Shell scripting.

Database Modeling

Star-Schema Modeling, Snowflakes Modeling, FACT and Dimension Tables, Erwin.

Internet Technologies and Microsoft tools

HTML, XML, MS Office toolset (Word, Excel, Visio, PowerPoint)

Scheduling tools

Autosys, Control-M, TWS

Reporting tools

SSRS, Business Objects, Cognos, OBIEE, Crystal reports

EDUCATIONAL BACKGROUND:

Bachelor of Engineering from JNTU Hyderabad, India.

PROFESSIONAL EXPERIENCE:

Client : AON Hewitt (Apr’18 – Till Date)

Location : Deerfield, IL

Role : Sr. Informatica Developer

Responsibilities:

Worked with business analysts in preparing functional specifications and participated in user meetings.

Prepared technical specifications for the development of Informatica Extraction, Transformation and Loading (ETL) mappings to load data into various data mart tables, and defined the standards.

Performed impact analysis, identifying gaps and code changes needed to meet new and changing business requirements.

Designed, documented, and configured the Informatica MDM Hub to support loading and cleansing of data.

Imported the IDQ address standardization mappings into Informatica Designer as mapplets.

Utilized Informatica IDQ to complete initial data profiling and to match and remove duplicate data.

Used relational SQL wherever possible to minimize the data transfer over the network.

Identified and validated the Critical Data Elements in IDQ.

Built several reusable components in IDQ using Standardizer transformations and reference tables.

Involved in implementing the Land Process of loading the customer/product Data Set into Informatica MDM from various source systems.

Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.

Used tools in the Informatica MDM utilities workbench such as Batch Group, Batch Viewer, Process Server, and Audit Manager.

Created Informatica mappings to extract data from the Trizetto Facets 4.51 and 4.3 claims databases. Analyzed the Trizetto Facets 4.51 data model and assisted the data architecture team in building the claims canonical data model in the ODS.

Worked with various Informatica transformations such as Joiner, Expression, Lookup, Aggregator, Filter, Update Strategy, Stored Procedure, Router, and Normalizer.

Worked with connected and unconnected Stored Procedure transformations for pre- and post-load sessions.

Conducted SQL testing of database sources for insert and update timestamps, counts, data definitions, and error logic for fact and reference tables.

Developed and validated sequential and concurrent sessions, and scheduled them using Workflow Manager.

Extensive use of SCM tool for the maintenance and migration of ETL code across different repositories like Dev, QA and PROD.

Scheduled the developed Informatica workflows using Tivoli, making use of its calendar and built-in features.

Created SSRS reports for the business users' daily reporting of inpatient and outpatient census data.

Created test plans and test strategies, set up the test environment, and coordinated with business analysts and business users to create test data and perform extensive testing.

Involved in production support, resolving production issues promptly by attending to high-priority issues immediately and providing the best customer support.

Environment: Informatica Power Center 9.6.1, Informatica Power Exchange, Oracle 11g/10g, Teradata 15.10, MDM, IDQ, Trizetto Facets 4.51/4.31, MS SQL server 2005/2012, SQL, PL/SQL, WinScp, HummingBird, IBM Tivoli Work Load Scheduler, SSRS.

Client : Kroger (Sep’16 – Apr’18)

Location : Cincinnati, OH

Role : Informatica Developer

Responsibilities:

Assisted in creating Physical models and used Erwin for Dimensional Data Modeling.

Worked with flat files, XML, other RDBMS databases to load data.

Involved in loading the data from Source Tables to ODS (Operational Data Store) Tables using Transformation and Cleansing Logic using Informatica.

Worked with Informatica tools to create mappings, mapplets and Reusable Transformations to load data from various sources to target.

Created Mappings using Source Qualifier, Aggregator, Filter, Joiner, Sorter, Lookup, Update Strategy, Router, Sequence Generator and Stored procedure transformations.

Developed Slowly Changing Dimensions Type-I, Type-II mappings.

Created, Configured and Scheduled the Sessions and Batches for different mappings using workflow Manager.

Profiled the data using Informatica Data Explorer (IDE) and performed a proof of concept for Informatica Data Quality (IDQ).

Developed several complex IDQ mappings using a variety of transformations, mapping parameters, mapping variables, mapplets, and parameter files in the Mapping Designer of Informatica Power Center.

Defined, configured, and optimized various MDM processes including landing, staging, base objects, foreign-key relationships, lookups, query groups, queries/custom queries, cleanse functions, batch groups and packages using Informatica MDM Hub console.

Developed workflow tasks like reusable Email, Event wait, Timer, Command, Decision.

Created UNIX script to run batch jobs according to business requirements.

Actively involved in the exception handling process using the IDQ Exception transformation after loading the data into MDM, and notified the data stewards of all exceptions.

Created complex scripts for chunking and massaging legacy data in the staging area of DTS and SSIS packages.

Built a reusable staging area in Teradata for loading data from multiple source systems, using template tables for profiling and cleansing in IDQ.

Implemented partitioning and bulk loads for loading large volume of data.

Performed performance tuning of sources, targets, mappings, transformations, and sessions by identifying performance bottlenecks and implementing techniques such as parameter files, variables, partitioning, and pushdown optimization.

Reduced load time for daily loading process by performing performance tuning.

Used Power Exchange to extract DB2 source data from the mainframe server.

Involved in scheduling the workflows using Autosys.

Involved in identifying bugs in existing mappings by analyzing the data flow, evaluating transformations and fixing the bugs so that they conform to the business needs.

Performed unit testing to check data quality, and created test cases and detailed documentation for it.

Used pmcmd commands to execute workflows in non-Windows environments.
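
The following is a minimal sketch of starting a workflow with pmcmd from a UNIX host; the service, domain, folder, workflow, and credential names are hypothetical placeholders, not the project's actual values.

#!/bin/sh
# Sketch: start an Informatica workflow via pmcmd and wait for completion.
# All names and credentials below are placeholders (assumed, not actual).
INT_SVC="IS_DEV"                 # integration service name (assumed)
DOMAIN="Domain_DEV"              # Informatica domain name (assumed)
FOLDER="ETL_FOLDER"              # repository folder (assumed)
WORKFLOW="wf_daily_load"         # workflow to run (assumed)

# -wait makes pmcmd block until the workflow finishes; its exit code
# reflects the workflow status, so a scheduler can react to failures.
pmcmd startworkflow -sv "$INT_SVC" -d "$DOMAIN" \
      -u "$INFA_USER" -p "$INFA_PWD" \
      -f "$FOLDER" -wait "$WORKFLOW"
exit $?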

Created reports using stored procedures, and was involved in scheduling and in creating snapshots and subscriptions for the reports.

Coordinated with the QA and the reporting team and provided guidance and knowledge on the ETL process

Involved in UAT and integration testing processes.

Environment: Informatica Power Center 9.5, Power Exchange, MDM, IDQ, Oracle 10g, DB2, Flat Files, XML, Mainframe, PL/SQL, SQL BI Suite (SSIS, SSRS, SSAS), SQL*PLUS, Erwin 7.2, Cognos 8, OBIEE, Unix Scripting, Windows NT, TOAD, Autosys.

Client : Clemmons.io (Wufasta) (Aug’15 – Sep’16)

Location : Dallas, TX

Role : Informatica Developer

Responsibilities:

Involved in the full project life cycle, from analysis to production implementation and support, with emphasis on identifying sources and validating source data, developing the required logic and transformations, and creating mappings to load the data into MDR databases.

Based on the requirements created Functional design documents and Technical design specification documents for ETL Process.

Interacted with business users and source system owners; designed, implemented, and documented ETL processes and projects based entirely on data warehousing best practices and standards.

Designed and developed Informatica mappings on version 9.5, enabling the extraction, transformation, and loading of data into target MDR tables.

Involved in performance tuning and fixed bottlenecks for processes already running in production, reducing load time by 3-4 hours for each process.

Designed and developed processes to handle high-volume data loads within a given load window or load intervals.

Worked on Master Data Management (MDM) hub development: extracting, transforming, cleansing, and loading data into the staging and base object tables.

Created Workflow, Worklets and Tasks to schedule the loads at required frequency using Workflow Manager.

Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.

Created ETL mappings, sessions, workflows by extracting data from MDM system, FLASH DC & REM source systems.

Created complex mappings using Aggregator, Expression, Joiner transformations including complex lookups, Stored Procedures, Update Strategy and others

Designed and developed table structures, stored procedures, and functions to implement business rules.

Extensively developed shell scripts used as Informatica pre-session and post-session commands.
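
As an illustration of this kind of script, the sketch below shows a post-session command that archives the processed source file and logs the run; the paths and file patterns are hypothetical, not the actual project values.

#!/bin/sh
# Sketch of a post-session success command: archive the loaded flat file
# so the next run does not reprocess it. Paths below are placeholders.
SRC_DIR=/data/inbound
ARCH_DIR=/data/archive
LOG=/var/log/etl/post_session.log
STAMP=`date +%Y%m%d_%H%M%S`

for f in "$SRC_DIR"/*.dat; do
    [ -f "$f" ] || continue
    mv "$f" "$ARCH_DIR/`basename "$f"`.$STAMP"
    echo "$STAMP archived `basename "$f"`" >> "$LOG"
done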

Extensively involved in migration of Informatica Objects, Database objects from one environment to another.

Tested data integrity among various sources, targets and various performance related issues.

Involved in monitoring and production support in a 24/7 environment.

Environment: Informatica Power Center 9.5, MDM, IDQ, ETL, Oracle 10g/9i, PL/SQL, Toad, UNIX (HP-UX, Sun Os) and Linux servers.

Client : Talent Sprint (Aug’12 – Nov’13)

Location : Hyderabad, India

Role : Jr. Informatica Developer

Responsibilities:

Translated user requirements into system requirements; responsible for business analysis and requirements gathering, and created a road map for the data warehouse design.

Involved in Data Modeling sessions using Erwin.

Worked with Informatica B2B Data Transformation, which supports transformations and mappings, via XML, of most healthcare industry standards.

Tested the enhanced FACETS system, evaluating claims adjudication needs and creating HIPAA-compliant business rules configuration.

Involved in Data Analysis of the OLTP system to identify the sources for extraction of the data.

Created and modified Oracle database tables per the design requirements, applying constraints to maintain complete referential integrity and creating indexes for performance.

Used Informatica Power Center for ETL: extracting, transforming, and loading data from heterogeneous source systems to the staging area.

Developed mappings for loading data from multiple sources, flat files and Oracle, into the target Teradata tables.

Created and Configured Landing Tables, Staging Tables, Base Objects, Foreign key relationships, Queries, Query Groups etc. in MDM.

Defined the Match and Merge Rules in MDM Hub by creating Match Strategy, Match columns and rules for maintaining Data Quality

Created scripts using Teradata utilities (FastLoad, MultiLoad, FastExport, TPump).
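
A minimal FastLoad sketch of this kind of script is shown below; the logon string, staging table, columns, and input file are placeholders rather than actual project objects.

#!/bin/sh
# Sketch: load a pipe-delimited flat file into an empty Teradata staging
# table with FastLoad. Credentials and object names are placeholders.
fastload <<'EOF'
LOGON tdprod/etl_user,password;
DATABASE stg_db;
SET RECORD VARTEXT "|";
DEFINE cust_id   (VARCHAR(20)),
       cust_name (VARCHAR(100)),
       cust_city (VARCHAR(50))
FILE = /data/inbound/customer.dat;
BEGIN LOADING stg_db.customer_stg
      ERRORFILES stg_db.customer_err1, stg_db.customer_err2;
INSERT INTO stg_db.customer_stg (cust_id, cust_name, cust_city)
VALUES (:cust_id, :cust_name, :cust_city);
END LOADING;
LOGOFF;
EOF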

Developed various mappings using Mapping Designer and worked with Aggregator, Lookup, Filter, Router, Joiner, Source Qualifier, Expression, Stored Procedure, Sorter and Sequence Generator transformations, maintaining workflows.

Created Worklets to run several sessions sequentially.

Determined bottlenecks at various points such as targets, sources, mappings, sessions, the system, and metadata management, and optimized performance, which led to better session performance and faster response times.

Extensively worked on performance tuning and thereby decreased the load time.

Extensively worked with Lookup Caches like Persistent Cache, Static Cache, and Dynamic Cache to improve the performance of the lookup transformations.

Extensively worked on UNIX shell scripts and invoked Oracle procedures and workflows via shell scripts.
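
As a sketch of this pattern, the script below invokes an Oracle stored procedure through SQL*Plus from a shell wrapper; the connect string and procedure name are hypothetical.

#!/bin/sh
# Sketch: call an Oracle procedure from a shell wrapper via SQL*Plus.
# The connect string and procedure name are placeholders (assumed).
sqlplus -s "etl_user/$ORA_PWD@ORCL" <<EOF
WHENEVER SQLERROR EXIT FAILURE
SET SERVEROUTPUT ON
EXEC stage_pkg.refresh_summary('DAILY')
EXIT SUCCESS
EOF

if [ $? -ne 0 ]; then
    echo "Oracle procedure failed" >&2
    exit 1
fi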

Maintained documents for Design reviews, Engineering Reviews, ETL Technical specifications, Unit test plans, Migration checklists and Schedule plans

Generated reports using Crystal Reports.

Extensively worked with the Debugger for handling the data errors in the mapping designer.

Extensively used Control-M tool to Schedule, Execute and Monitor all ETL jobs.

Environment: Informatica Power Center 8.1.1, MDM, HIPAA/ EDI X12, Facets, Flat files, Erwin, Oracle 10g, SQL server 2005, Teradata, PL/SQL, Control-M, Business Objects 6.5, UNIX, and UNIX Shell Scripting.


