Manager Data

Location:
Mumbai, MH, India
Posted:
June 28, 2016


Resume:

Saranya

Senior Informatica Developer

Professional Summary:

* ***** ** ** ********** in the Analysis, Design, Development, Testing, Implementation, and Documentation of application software in Data Warehousing and the ETL process.

Expertise in implementing complex business rules by creating robust Mappings, Mapplets, Sessions and Workflows using Informatica Power Center.

Extensive experience in the ETL process with client-side technologies such as Erwin, Informatica PowerCenter, UNIX, Oracle SQL and PL/SQL, Toad, Teradata, FTP, and Quality Center.

Extensive experience in Extraction, Transformation, and Loading (ETL) data from various data sources into Data Warehouse and Data Marts using Informatica Power Center tools (Repository Manager, Designer, Workflow Manager, Workflow Monitor, and Informatica Administration Console).

Experience in designing/developing complex mapping using transformations like Source Qualifier, Router, Filter, Expression, Sorter, Aggregator, Normalizer, Joiner, Sequence Generator, Connected and Unconnected Lookup and Update Strategy.

Excellent experience in creating reusable and non-reusable tasks such as Session, Command, Email, Event Wait/Raise, Timer, and Assignment, and configuring them in workflows using Workflow Manager.

Extensively used SQL and PL/SQL to write Stored Procedures, Functions, Packages, Cursors, Triggers, Views, and Indexes in distributed environment.

Extensive experience in performance tuning to optimize cycle run time by identifying bottlenecks at the database, Informatica, or network level; once bottlenecks were identified, the corresponding tuning measures were implemented to improve production cycle performance.

Experience in debugging mappings; identified bugs in existing mappings by analyzing the data flow and evaluating transformations.

Highly motivated and goal-oriented individual with a strong background in SDLC Project Management and Resource Planning using AGILE methodologies.

Robust Knowledge of ETL Process, Dimensional Data Modeling, Slowly Changing Dimensions and data warehouse concepts, Software quality procedures, standards and project documentation.

Worked extensively in various stages of SDLC like design, development, testing (QA, UAT, and Regression) and production support.

Experience in the Insurance, Banking, and Telecom domains.

Strong experience in writing UNIX Shell scripts, SQL Scripts for development, automation of ETL process, error handling, and auditing purposes.

Excellent experience in UNIX shell scripting, writing parameterized shell scripts to run sessions and batches.
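As a sketch of the kind of parameterized workflow launcher described above (the Integration Service, domain, folder, and workflow names here are illustrative placeholders, not values from any actual project; the pmcmd call is echoed as a dry run rather than executed, since the pmcmd binary only exists on an Informatica server host):

```shell
# Minimal sketch of a parameterized workflow launcher script.
launch_workflow() {
    svc="IS_DEV"            # hypothetical Integration Service name
    dom="Domain_DEV"        # hypothetical Informatica domain
    folder="$1"             # repository folder, passed as an argument
    workflow="$2"           # workflow name, passed as an argument
    # Build and echo the pmcmd command (dry run).
    echo "pmcmd startworkflow -sv $svc -d $dom -f $folder -wait $workflow"
}

launch_workflow FINANCE wf_load_claims
```

Passing the folder and workflow as arguments lets one script serve every session and batch, which is what makes the script "parameterized".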

Worked with cross-functional teams such as QA, DBA and Environment teams to deploy code from development to QA and from QA to Production server.

Technical Skills:

Operating System : Windows 98/2000/NT/XP, IBM AIX 5.3, HP-UX, Linux 7.1/7.2/8.

Language : SQL, PL/SQL, C, C++, Java, Shell (bash, ksh), HTML, COBOL, Control-M, Perl scripts.

RDBMS : Oracle 8i/9i/10g/11g, DB2/AIX64 8.2.3/9.5.3, Sybase, MS SQL Server, Teradata.

ETL Tools : Informatica Power Center 9.1.0/8.6.1/8.1.1/7.1.2 (Designer, Repository Manager, Workflow Manager, and Workflow Monitor), Power Exchange 9.5.1.

Data Modeling Tools : Erwin 4.1, Microsoft Visio.

OLAP tools : Business Objects, Cognos

Database Tools : Toad 7.6/9/10, PL/SQL Developer, DB2 Load Utility, SQL*Plus, Teradata.

GUI : D2K, Visual Basic 6.0, Power Builder 5/6.5/7.

Projects Profile:

Client: Kaiser Permanente, Lake Oswego-OR

Role: Senior Informatica Developer Jun 2015 – till date

Responsibilities:

Worked with solution architects and the data modeling team to come up with the logical data model; verified the physical data model and the database objects created.

Extracting source data from plan factors and plan characteristics of the CSR eligible individual plans for all regions.

Extracting the membership data from MDW for the CSR eligible individual plans

Extracting claim details from CDW, DSS, CA Diamond for the members enrolled in the CSR eligible individual plans. This includes both the medical claims and pharmacy claims for all the regions.

Processing Claims, memberships and reference datasets to provide summarizations and grouping that will be used to calculate CSR subsidies, Plan level and policy level summaries.

Responsible for developing ETL jobs using Informatica PowerCenter.

Transformed business requirements into technical specification and then source-to-target mapping documents.

Created Informatica mappings; extensively used Lookups (connected and unconnected), Aggregator, Ranking, and Mapplets, applied source filters in Source Qualifiers, and managed data flow into multiple targets using Routers.

Experienced with Informatica PowerExchange for retrieving and loading data from mainframe systems; worked with PowerExchange Navigator to import source tables; performed data masking and updated the respective tables; created data registration maps for PWX real-time data reads.

Worked on Audit frame work and error handling mechanisms.

Worked on workflow concurrency for multiple regions.

Loaded data from multiple flat-file formats, such as fixed-width and pipe-delimited files, into the staging environment.
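A pre-load check along these lines is a common way to catch malformed records before they reach staging; this is only a sketch, and the sample file and its three-column layout are hypothetical:

```shell
# Sketch: validate a pipe-delimited flat file before loading it to staging.
# Prints the number of rows whose field count differs from the expected layout.
validate_pipe_file() {
    file="$1"
    expected_cols="$2"
    awk -F'|' -v n="$expected_cols" 'NF != n { bad++ } END { print bad + 0 }' "$file"
}

# Illustrative three-column sample with one malformed row.
printf 'a|b|c\nd|e\nf|g|h\n' > /tmp/sample.txt
validate_pipe_file /tmp/sample.txt 3
```

A non-zero count from such a check would typically abort the load and route the file to an error directory for review.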

Used complex SQL queries to extract data from upstream systems.

Validation of upstream data before loading into staging and harmonized layer.

Prepared the Test case and Test Scripts for Unit testing.

Provided support to the QA/UAT team and designed and reviewed test plans with users.

Environment: Informatica Power Center 9.5, Informatica PowerExchange 9.5.1, Oracle 11g, SQL, PL/SQL, Toad 9.6.1, UNIX Shell Scripting, TWS, SQL*Plus, DB2, Mainframe, Flat Files, Informatica MDM.

Client: GE OIL & GAS, Houston, TX

Role: Informatica Developer Dec 2013 – May 2015

Responsibilities:

Analyzed the specifications and identified the source data that needed to be moved to the data warehouse.

Transformed business requirements into technical specification and then source-to-target mapping documents.

Involved in Database design, entity relationship modeling and dimensional modeling using Star schema.

Worked with Source Analyzer, Data Warehouse Designer, Repository Manager, Workflow monitor, Mapping Designer, Mapplet, and Transformation Developer in Informatica designer.

ETL mappings were developed to perform tasks like validating file formats, business rules, database rules and statistical operations.

Used Informatica Designer to create Reusable transformations to be used in Informatica mappings and Mapplets.

Created test cases for the above projects in providing error free solution. Monitored Workflows and sessions using Workflow Monitor.

Debugged through session logs and fixed issues, utilizing the database for efficient transformation of data.

Prepared Mapping documents, Data Migration documents, and other project related documents like mapping templates and VISIO diagrams.

Used PL/SQL programming to create Oracle stored procedures, calling the procedures from Informatica sessions for data manipulation and error handling.

Scheduled workflows using the Autosys job scheduler.

Used shell scripts for automating the execution of maps.

Used Metadata Manager to collect and link metadata from diverse relational databases, integration processes, and mainframe systems, into a central catalog.

Experience in Working with Oracle PL/SQL, UNIX Shell Scripting in UNIX environment.

Developed technical documents to ease improvements to existing code.

Involved in data conversion, migration, integration, and quality profiling tasks.

Environment: Informatica Power Center 9.1.0/8.6, Oracle 10g, Teradata, SQL, PL/SQL, UNIX, Informatica MDM.

Client: BlueCross Blue Shield, MI June 2012 – Nov 2013

Role: Sr.ETL Informatica Developer, Data quality project

Blue Cross Blue Shield of Michigan is a non-profit organization and has been the biggest health care company in Michigan for more than 70 years. BCBSM offers access to health care coverage for everyone, regardless of circumstances. The Data Quality project fixes data quality issues in the EDW (Enterprise Data Warehouse) and its downstream systems, which include about 20 data marts.

Responsibilities:

Analyzed data quality issues, found the root cause of each problem, and designed solutions with the business's approval. Presented the analyzed cases as tickets for developers to implement as new requests, and worked as one of those developers.

Analysis of the downstream flow on any changes to the Enterprise Data warehouse.

Worked with the Business Analysts to analyze the issues in each ticket and find the proper fix. Involved in making data model changes and transformation-logic changes in the existing mappings according to the business requirements for incremental fixes.

Worked extensively with Informatica tools like Source Analyzer, Warehouse Designer, Transformation Developer and Mapping Designer.

Created various Transformations like Expression, Joiner, Lookups, Filters, Routers, and Update Strategy etc.

Involved in the Development of different mappings and tuned for better performance.

Extensively used ETL to load data from multiple flat-file sources into the target database.

Code walkthrough and Review of documents which are prepared by other team members.

Prepared the UTP (Unit Test Plan), tested with different unit test conditions, and fixed any defects found during system testing.

Extensively involved in migrating Informatica mappings from Dev to SIT, and from UAT to the Production environment.

Worked on preparing a full-fledged document called the BIDR.

Worked on developing downstream data marts such as CDM, IDM, and BDR, which are used for reporting. Data is extracted from the EDW and loaded into these data marts according to the business requirements.

Created and ran Pre-existing and debug sessions in the Debugger to monitor and test the sessions prior to their normal run in the Workflow Manager.

Worked on PowerExchange bulk data movement processes by using PowerExchange Change Data Capture (CDC) method.

Worked extensively on different types of transformations such as Source Qualifier, SQL transformation (both Query and Script modes), Normalizer, Expression, Filter, Aggregator, Rank, Update Strategy, Lookup (connected and unconnected), Stored Procedure, Sequence Generator, and Joiner. Used Telnet and WinSCP to create and view parameter files and other source files on UNIX in the DEV, Testing, and Production environments.

Used UNIX Shell Scripting to invoke Sessions in the workflow manager.

Used SQL queries and database programming in PL/SQL (writing packages, stored procedures/functions, and database triggers). Used SQL tools like TOAD to run SQL queries and validate the data in the warehouse.

Environment: Informatica Power Center 8.1/8.5/8.6, DB2, PowerExchange 8.1, TOAD, PL/SQL, Oracle 11g/10g/9i, Flat Files, Business Objects.

Client: ABBOTT, Abbott Park, IL Jun 2010 to April 2012

Role: Sr. Informatica Developer

Responsibilities:

Interacted with the business units to understand the requirements and gather feedback during the design, development, and testing of solutions.

Worked on various data sources such as Oracle, DB2 UDB and flat files.

Extracted data from various source systems into the Landing Zone area by creating Informatica mappings using Teradata FastLoad connections.

Wrote SQL queries as part of data validation in the target tables.

Created scripts to process the business logic from the Landing Zone to the Common Staging Area (CSA).

Created Email task notifications to report error messages, and used the Command task to run BTEQ scripts within the workflow.

Created Dimension Tables and Fact Tables based on the warehouse design.

Wrote Triggers and Stored Procedures using PL/SQL for Incremental updates.

Integrated various sources into the staging area of the data warehouse for data integration and cleansing.

Practical working knowledge of PowerExchange Navigator, PowerExchange bulk data movement, and PowerExchange Change Data Capture.

Worked on Teradata utilities such as FastLoad, MultiLoad, and FastExport, and created batch jobs using BTEQ.
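BTEQ batch jobs of this sort are typically generated and driven from a shell wrapper. A minimal sketch, in which the Teradata server, credentials, and table name are placeholders rather than real project values:

```shell
# Sketch: generate a BTEQ batch script from a shell wrapper.
# On a Teradata client host it would then be run as:  bteq < "$out"
write_bteq_script() {
    out="$1"
    cat > "$out" <<'EOF'
.LOGON tdserver/etl_user,etl_pass;
-- illustrative row count against a hypothetical staging table
SELECT COUNT(*) FROM stg.claims_daily;
.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF
}

write_bteq_script /tmp/load_check.bteq
```

The `.IF ERRORCODE ... .QUIT 8` line is what lets the calling batch job detect a failed step through the wrapper's exit status.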

Created mapplets and reusable transformations.

Created sessions and batches to run with the logic embedded in the mappings, using Informatica Power Center Workflow Manager.

Used Informatica features to implement Type I, II changes in slowly changing dimension tables.

Conducted debugging sessions and fixed invalid mappings.

Designed and developed UNIX Scripts to automate the tasks.

Used the different types of transformations in Informatica such as Source Qualifier, Look up, Router, Joiner, Union, Aggregator etc.

Created necessary Repositories to handle the metadata in the ETL process.

Created Data Breakpoints and Error Breakpoints for debugging the mappings using Debugger Wizard.

Strong in UNIX shell and Perl scripting. Developed UNIX scripts using the pmcmd utility and scheduled ETL loads using crontab, Maestro, and Control-M.
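A cron-driven wrapper of the kind described would record each run's start time and exit status. In this sketch pmcmd is stubbed with a no-op function so the example runs on any host, and the crontab path, service, folder, and workflow names are illustrative only:

```shell
# Example crontab entry (illustrative path):
#   0 2 * * * /app/etl/run_nightly.sh >> /app/etl/logs/nightly.log 2>&1

pmcmd() { return 0; }   # stub standing in for Informatica's real pmcmd binary

run_with_audit() {
    wf="$1"
    start=$(date '+%Y-%m-%d %H:%M:%S')
    pmcmd startworkflow -sv IS_PROD -f NIGHTLY -wait "$wf"
    rc=$?
    # one audit line per run, suitable for appending to a log file
    echo "workflow=$wf start=\"$start\" rc=$rc"
    return $rc
}

run_with_audit wf_nightly_load
```

Propagating pmcmd's exit code through the wrapper is what lets a scheduler such as Control-M or Maestro distinguish a failed load from a successful one.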

Performed unit testing after successfully loading the data into the Landing Zone (LZ) area.

Environment: Informatica PowerCenter 8.6, Oracle 10g, Flat Files, Toad, SQL, PL/SQL, UNIX Scripting, Teradata 13.1, Teradata SQL Assistant, MLOAD, FASTLOAD, TPUMP.

Client: Northwestern Mutual, Milwaukee, WI Mar 2008 to May 2010

Role: Informatica Developer

Responsibilities:

Analyzed all functional specifications and map documents and performed troubleshooting on all development processes.

Exported mappings and workflows from the Repository Manager and then imported them back to the destination folder; sources, targets, and transformations were exported in XML format.

Modified, designed and developed the existing mappings using transformation logics like lookup, Router, Filter, Expression, Aggregator, Joiner and Update Strategy according to the business requirement.

Extracted data from source systems like DB2 and flat files and loaded into the data warehouse and flat files.

Created tables and flat files as per the requirement and imported them to the repository.

Implemented Type I & Type II slowly changing dimensions in Informatica mappings.

Integrated new mapplets into the existing mappings according to the client specifications.

Created sessions, workflows and worklets for proper execution of mappings using workflow manager.

Performed unit testing, configuring and troubleshooting all Informatica processes as required and validated the results with end users.

Implemented performance tuning logic on Targets, Sources, mappings, sessions for maximum efficiency and performance.

Extensively used Autosys as a job scheduler to schedule UNIX scripts and Informatica jobs.

Environment: Informatica Power Center 9.1/8.6, DB2, Flat Files, UNIX Shell Scripting, Autosys


