
Informatica Developer

Location:
Elkridge, MD
Salary:
80000
Posted:
January 10, 2020


Vandana Suresh MD

Informatica Developer

Email: ada9oe@r.postjobfree.com

Cell: 614-***-****

PROFESSIONAL SUMMARY:

Around 7 years of experience in Information Technology as an Informatica Developer, with a strong background in ETL and data warehousing using Informatica Power Center 10.x/9.x/8.x.

Experience using Informatica Power Center components such as Source Analyzer, Transformation Developer, Mapplet Designer, Mapping Designer, Workflow Manager, Workflow Monitor, and Repository Manager.

Expertise in the design and implementation of Slowly Changing Dimensions (SCD) Type 1, Type 2, and Type 3.

Experience in loading data, troubleshooting, debugging mappings, and performance tuning of Informatica sources, targets, mappings, and sessions, including fine-tuning transformations to improve session performance.

Database experience using Oracle 11g/10g/9i, MS SQL Server 2008/2005/2000 and MS Access.

Experience in UNIX Operating System and Shell scripting.

Working knowledge of data warehouse techniques and practices, including ETL processes, dimensional data modeling (Star Schema, Snowflake Schema, fact and dimension tables), OLTP, and OLAP.

Experience across the data mart life cycle; performed ETL procedures to load data from different sources into data marts and the data warehouse using Informatica Power Center.

Used the Debugger in Informatica Power Center Designer to identify errors in mappings.

Experience in writing complex SQL queries.

Excellent skills in fine-tuning ETL mappings in Informatica.

Extensive experience using database tools such as SQL*Plus, SQL Developer, and TOAD, as well as Salesforce Data Loader, with good knowledge of Netezza.

Experience in automating the Informatica jobs using Tidal.

Good working knowledge of various Informatica designer transformations like Source Qualifier, Expression, Filter, Router, Lookup, Aggregator, Normalizer, Rank, Joiner and Update Strategy.

Built effective working relationships with client teams to understand support requirements and manage client expectations.

Excellent communication, presentation, and project management skills; a good team player and self-starter with the ability to work independently and as part of a team.

EDUCATION:

Master's in Computer Applications from Anna University, Coimbatore, India.

TECHNICAL SKILLS:

ETL Tools

Informatica 10.x/9.x/8.x/7.x (Power Center/Power Mart) (Designer, Workflow Manager, Workflow Monitor).

Databases

Oracle 12c/11g/10g, MS Access, Netezza.

Languages

PL/SQL, SQL, C, C++, Data Structures, UNIX Shell Script, Python (basics)

Tools

TOAD, SQL Developer, Salesforce Data Loader, Tidal scheduling tool.

Operating Systems

Windows NT/XP/7/8/10, Windows Server 2003/2008, UNIX.

CERTIFICATIONS:

AWS Certified Solutions Architect - Associate (valid November 2019 to November 2022).

PROFESSIONAL EXPERIENCE:

Project Phoenix May 2019 - Oct 2019

TDA, MD

Informatica Developer

Project description: The aim of this project is to convert the Salesforce Classic model to Financial Services Cloud by bringing data from Salesforce into the EDW. As part of this project, we had to bring retail and institutional data from MDM into the EDW. Data from Salesforce objects, MDM (Oracle), and Netezza is loaded into staging and then into the EDW. The data is then replicated to the EW, and reports are generated.

Responsibilities:

Extracted data from Netezza, Salesforce, and Oracle sources and developed mappings, sessions, and workflows using Informatica Power Center 10.1.

Designed the ETL processes using Informatica to load data from Oracle, Flat Files (Fixed Width and Delimited) to staging database and from staging to the target Warehouse database.

Created mappings using transformations like source qualifier, expression, filter, router, lookup, and update strategy.

Applied slowly changing dimensions Type I and Type II according to the business requirements (a SQL sketch of the Type 2 pattern follows this project).

Worked extensively on Informatica performance tuning, thereby improving load times.

Applied business rules using complex SQL and validation rules to check data consistency.

Involved in requirement gathering, design, testing, project coordination and migration.

Analyzed session error logs and used the Debugger to test mappings and fix bugs.

Worked with the team to schedule the jobs that run every day using Tidal.

Fine-tuned ETL processes by considering mapping and session performance issues.

Responsible for handling large volume data conversions, data cleansing and following data delivery standards.

Used knowledge of data and systems to proactively contribute to report design process.

Used existing UNIX scripts to execute workflows.

Maintained clear communication with other teams and the client.

Environment: Informatica 10.1, Netezza, Oracle, UNIX, Windows, Salesforce dataloader, Tidal scheduler.
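For illustration, the set-based SQL below is a minimal sketch of the SCD Type 2 pattern referenced above, assuming hypothetical DIM_CUSTOMER and STG_CUSTOMER tables and a DIM_CUSTOMER_SEQ surrogate-key sequence (none of these names come from the project); in Power Center this logic is typically built with Lookup, Expression, and Update Strategy transformations rather than raw SQL.

-- Step 1: expire the current dimension row when a tracked attribute changed
-- (NULL handling is omitted for brevity).
UPDATE dim_customer d
   SET d.eff_end_date = SYSDATE,
       d.current_flag = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (
         SELECT 1
           FROM stg_customer s
          WHERE s.customer_id = d.customer_id
            AND (s.address <> d.address OR s.status <> d.status)
       );

-- Step 2: insert a new "current" version for new and changed customers.
INSERT INTO dim_customer
       (customer_key, customer_id, address, status,
        eff_start_date, eff_end_date, current_flag)
SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address, s.status,
       SYSDATE, DATE '9999-12-31', 'Y'
  FROM stg_customer s
 WHERE NOT EXISTS (
         SELECT 1
           FROM dim_customer d
          WHERE d.customer_id = s.customer_id
            AND d.current_flag = 'Y'
            AND s.address = d.address
            AND s.status = d.status
       );

Two statements are used because a single MERGE cannot both expire an existing matched row and insert its replacement.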

Institutional Datamart July 2018 - Mar 2019

TDA, MD

Informatica Developer

Project description: The institutional data mart contains details such as revenue, trade, account, and advisor information for a trading institution. The aim of this project is to bring in institution-related data and store it in the EDW. The data is then replicated from the EDW to the EW, and reports are generated by the business.

Responsibilities:

Extracted data from the Netezza source and developed mappings, sessions, and workflows using Informatica Power Center 10.1.

Designed the ETL processes using Informatica to load data from Oracle, Flat Files (Fixed Width and Delimited) to staging database and from staging to the target Warehouse database.

Created mappings using transformations like source qualifier, expression, filter, router, lookup, and update strategy.

Applied slowly changing dimensions Type I and Type II according to the business requirements.

Worked extensively on Informatica performance tuning, thereby improving load times.

Applied business rules using complex SQL and validation rules to check data consistency (illustrative validation queries follow this project).

Involved in requirement gathering, design, testing, project coordination and migration.

Analyzed session error logs and used the Debugger to test mappings and fix bugs.

Worked with the team to schedule the jobs that run every day using Tidal.

Fine-tuned ETL processes by considering mapping and session performance issues.

Created session, event, command, control, decision and email tasks in workflow manager.

Responsible for handling large volume data conversions, data cleansing and following data delivery standards.

Used knowledge of data and systems to proactively contribute to report design process.

Used existing UNIX scripts to execute workflows.

Maintained clear communication with other teams and the client.

Environment: Informatica 10.1, Netezza, UNIX, Windows, Tidal scheduler.
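The queries below are a hedged illustration of the kind of consistency checks mentioned above; STG_ACCOUNT, EDW_ACCOUNT, FACT_TRADE, and DIM_ACCOUNT are hypothetical names, not objects from the project.

-- 1. Row-count reconciliation between staging and the warehouse target.
SELECT (SELECT COUNT(*) FROM stg_account) AS stg_rows,
       (SELECT COUNT(*) FROM edw_account) AS edw_rows
  FROM dual;

-- 2. Orphan check: fact rows whose account key has no matching dimension row.
SELECT f.account_key, COUNT(*) AS orphan_rows
  FROM fact_trade f
  LEFT JOIN dim_account d
         ON d.account_key = f.account_key
 WHERE d.account_key IS NULL
 GROUP BY f.account_key;

-- 3. Duplicate natural keys that would break a Type 1 / Type 2 load.
SELECT account_id, COUNT(*) AS dup_rows
  FROM stg_account
 GROUP BY account_id
HAVING COUNT(*) > 1;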

Annuities Information Factory (AIF) 2.0 April 2017 - July 2018.

Prudential, NJ

Sr. Informatica Developer

Project description: The Annuities Information Factory (AIF) contains annuities information and is built to facilitate the analysis and reporting requirements of the business through the data warehouse and data marts. As part of AIF 2.0, the warehouse architecture was redesigned to introduce an Operational Data Store (ODS) that receives data from the source system in real time and feeds the data warehouse. This improves data availability for downstream consumers and business users.

Responsibilities:

Extracted data from the Oracle source and developed mappings, sessions, and workflows using Informatica Power Center 10.1.

Designed the ETL processes using Informatica to load data from Oracle, Flat Files (Fixed Width and Delimited) to staging database and from staging to the target Warehouse database.

Prepared design specification documents as per the inputs received from the architect and the business analyst.

Used Informatica Workflow Manager to create and run batches and sessions and to schedule them to run at specified times.

Extracted data from the source database, VPAS, which contains annuities information.

Developed Informatica ETL mappings, sessions and workflows based on the technical specification document.

Created mappings using transformations like source qualifier, joiner, aggregator, expression, filter, router, lookup, and update strategy.

Applied slowly changing dimensions Type I and Type II according to the business requirements.

Worked extensively on Informatica performance tuning, thereby improving load times.

Applied business rules using complex SQL and validation rules to check data consistency.

Involved in requirement gathering, design, testing, project coordination and migration.

Analyzed session error logs and used the Debugger to test mappings and fix bugs.

Worked with the team to schedule the jobs that run every day using Autosys scheduler.

Fine-tuned ETL processes by considering mapping and session performance issues.

Created workflow and other tasks to schedule the loads at required frequency using Informatica scheduling tool.

Created session, event, command, control, decision and email tasks in workflow manager.

Responsible for handling large volume data conversions, data cleansing and following data delivery standards.

Wrote unit test cases for the mappings.

Used knowledge of data and systems to proactively contribute to report design process.

Used UNIX scripts to move files from one location to another.

Maintained clear communication with other teams and the client.

Environment: Informatica 10.1 hotfix2, Oracle 12c, UNIX, Windows, Autosys scheduler.

Centralized Licensing Warehouse Jun 2016 – April 2017.

Nationwide Insurance, OH

Sr. Informatica Developer

Project description: Nationwide is one of the largest insurance providers in North America. It deals with various types of licensing for its agencies and agents at the individual and firm level. As part of the Centralized Licensing initiative, it is building a warehouse that stores all licensing information in one place so that licensing can be tracked centrally. As part of the Centralized Licensing Warehouse initiative, we bring licensing information from various sources and validate it against the government’s RegED systems. Apart from validation, the data is stored in a warehouse where users can track licensing at various levels. Reports are built in Java and Cognos for various purposes.

Responsibilities:

Involved in the analysis of source systems, business requirements, and identification of business rules; responsible for developing, supporting, and maintaining the ETL process using Informatica.

Created / updated ETL design documents for all the Informatica components changed.

Developed Informatica ETL mappings, sessions and workflows based on the technical specification document.

Analyzed trades/licenses from various source systems like Denado/AMS/AMF/PDS and integrated them with existing Nationwide data to help with licensing analysis.

Created the ETL design documentation, Mapping document, Migration document, Test cases and set the standards for development and testing.

Actively involved in analysis, development, testing and performance tuning.

Implemented Slowly Changing Dimensions.

Involved in Design review and code review.

Worked extensively on Informatica performance tuning, thereby improving load times.

Created indexes and partitions and analyzed tables to improve performance (an illustrative SQL sketch follows this project).

Created Informatica Mappings using Mapping Designer to load the data from various sources using different transformations like Source Qualifier, Normalizer, Aggregator, Expression, Stored Procedure, Filter, Joiner, Lookup, Router, Sequence Generator, and Update Strategy transformations.

Used Informatica job scheduler to schedule UNIX shell scripts and Informatica jobs.

Actively supported in Unit Testing, Integration Testing and Performance Testing of ETL.

Responsible for handling large volume data conversions, data cleansing and following data delivery standards.

Applied business rules using complex SQL, PL/SQL, and validation rules to check data consistency.

Performed Data validation and massaging to ensure accuracy and quality of data.

Created UNIX scripts to move files from one location to another and to create parameter files.

Used knowledge of data and systems to proactively contribute to report design process.

Environment: Informatica 9.5, Oracle 11g, Toad, SQL, PL/SQL, UNIX, Windows, Informatica scheduler.
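The DDL below is only an illustrative sketch of the indexing, partitioning, and statistics work mentioned above; LIC_FACT and its columns are hypothetical names, not the project's actual schema.

-- Range-partition a large licensing fact table by effective date.
CREATE TABLE lic_fact (
    license_id NUMBER,
    agent_id   NUMBER,
    state_code VARCHAR2(2),
    eff_date   DATE
)
PARTITION BY RANGE (eff_date) (
    PARTITION p2016 VALUES LESS THAN (DATE '2017-01-01'),
    PARTITION p2017 VALUES LESS THAN (DATE '2018-01-01'),
    PARTITION pmax  VALUES LESS THAN (MAXVALUE)
);

-- Local index to support lookups by agent within each partition.
CREATE INDEX lic_fact_agent_ix ON lic_fact (agent_id) LOCAL;

-- Refresh optimizer statistics after a large load.
BEGIN
    DBMS_STATS.GATHER_TABLE_STATS(ownname => USER, tabname => 'LIC_FACT');
END;
/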

Exclusive Agent Commissions Sep 2015 - Jun 2016.

Nationwide Insurance, OH

Informatica Developer

Project description:

The Exclusive Agent Commission project consolidates EA commission processing within the Allied Policy Systems (Service Advantage & Integrated Commercial Applications) by taking the rules that are spread across various sub-systems and centralizing them into a “central” set of components. This reduces the implementation time and cost of commission changes and simplifies EA Commission support. It also develops a common understanding of the rules to be leveraged for a single implementation of commission processing.

The Independent Agent Commission project consolidates IA commission processing within the Allied Policy Systems (Service Advantage & Integrated Commercial Applications) by taking the rules that are spread across various sub-systems and centralizing them into a “central” set of components. This reduces the implementation time and cost of commission changes and simplifies IA Commission support.

The Direct Agent Commission project consolidates Direct commission processing within the Allied Policy Systems (Service Advantage & Integrated Commercial Applications) by taking the rules that are spread across various sub-systems and centralizing them into a “central” set of components. This reduces the implementation time and cost of commission changes and simplifies Direct Commission support. It also develops a common understanding of the rules to be leveraged for a single implementation of commission processing.

Responsibilities:

Based on the requirements, created Technical design specification documents for ETL Process.

Developed mappings and mapplets using Informatica Designer to load data into ODS from various source systems.

Used Informatica Designer to import the sources, targets, create various transformations and mappings for extracting, transforming and loading operational data into the centralized warehouse from ODS.

Used various transformations such as Expression, Filter, Rank, Source Qualifier, Joiner, Aggregator, and Normalizer in the mappings and applied surrogate keys to the target tables.

Used the Informatica Server Manager to register and monitor the server and to create and run the sessions/batches that load data using the previously created mappings.

Created mapplets and reusable transformations.

Created Workflow and Tasks to schedule the loads at required frequency using Workflow Manager.

Created connection pools, physical tables, defined joins and implemented authorizations in the physical layer of the repository.

Migrated mappings from Development to Testing and performed Unit Testing and Integration Testing.

Environment: Informatica 9.1x/8.6x, Oracle 11g, Toad, Windows, Informatica scheduler.

Enterprise Data Warehouse Nov 2012 – Mar 2014.

Tata Consultancy Services, India

Informatica Developer

Project description:

The Enterprise Data Warehouse (EDW) is the system in which all customer insurance data is maintained. Cigna offers different types of insurance to its customers, such as Health, Auto, Property, and Rental. To maintain the history of all these transactions, a centralized application called the EDW was built.

Responsibilities:

Gathered user Requirements and designed Source to Target data load specifications based on business rules.

Used Informatica Power Center 9.0.1 for extraction, transformation, and loading (ETL) of data into the data mart.

Designed and developed ETL Mappings to extract data from Flat files and Oracle to load the data into the target database.

Developed several complex mappings in Informatica using a variety of Power Center transformations, mapping parameters, mapping variables, mapplets, and parameter files in Mapping Designer.

Built complex reports using SQL scripts (a representative report query follows this project).

Created complex calculations, various prompts, conditional formatting and conditional blocking etc., accordingly.

Created complex mappings to load the data mart and monitored them. The mappings involved extensive use of Aggregator, Filter, Router, Expression, Joiner, Union, Normalizer, and Sequence Generator transformations.

Ran the workflows on a daily and weekly basis using workflow monitor.

Environment: Informatica 9.0.1, Framework Manager, Transformer, Oracle 11g, TOAD, Windows Server 2008, UNIX.
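The query below is a representative, entirely hypothetical example of the kind of report-style SQL script mentioned above; PLCY_TXN and DIM_PRODUCT are illustrative names only, not objects from the project.

-- Monthly premium totals and policy counts by product line for the last 12 months.
SELECT p.product_line,
       TO_CHAR(t.txn_date, 'YYYY-MM') AS txn_month,
       SUM(t.premium_amt)             AS total_premium,
       COUNT(DISTINCT t.policy_id)    AS policy_count
  FROM plcy_txn t
  JOIN dim_product p
    ON p.product_key = t.product_key
 WHERE t.txn_date >= ADD_MONTHS(TRUNC(SYSDATE, 'MM'), -12)
 GROUP BY p.product_line, TO_CHAR(t.txn_date, 'YYYY-MM')
 ORDER BY p.product_line, txn_month;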

Aetna May 2011 – Nov 2012.

Tata Consultancy Services, India

Informatica Developer

Project description:

The project involves testing of the Aetna data warehouse (DWH), which mainly supports reporting. The business objective of the data warehouse is to use previous as well as current data to make strategic decisions that help and support the business. Changes are decided by the NCQA (National Committee for Quality Assurance), so changes in NCQA criteria drive the project changes needed to meet the NCQA guidelines. The work covers testing of the different applications available under the data warehouse and involves understanding requirements, preparing plans and test scripts, execution, defect reporting, and tracking. It also includes system, regression, and production support checkout testing.

Responsibilities:

Participated in all phases of system development life cycle from requirements gathering to deployment of the finished system into production followed by maintenance and knowledge transfer tasks.

Gathered requirements by analyzing source systems and identification of business rules through regular requirements gathering sessions with business users and other support teams for various OLTP and OLAP systems.

Used Power Center Designer to design the business process, grain of the data representation, dimensions and fact tables with measured facts.

Extensively used STAR and SNOWFLAKE schema models in design.

Extensively used transformations such as Lookup, Router, Filter, Joiner, Source Qualifier, Aggregator, and Update Strategy.

Involved in performance tuning of sessions that work with large sets of data by tweaking block size, data cache size, sequence buffer length and target based commit intervals.

Developed sessions and batches to move data at specific intervals and on demand using workflow manager.

Participated in deployment planning and in deployment of the system to production.

Facilitated business user smoke testing of the production system by setting up test data.

Involved in production support duties including monitoring of nightly batches.

Responsible for updating business stakeholders and OLTP/OLAP application support teams about the status of various ETL sessions and the impact of failed sessions on data availability.

Environment: Informatica Power Center 9.1, Oracle 9i, SQL*Plus, TOAD, DB2, UNIX, Windows.


