Post Job Free



Location:
San Francisco, CA
Posted:
January 05, 2016


Shrenik

Senior Informatica Developer

615-***-****

acs0av@r.postjobfree.com

Summary

Around 7 years of strong experience in performing ETL operations such as Data Extraction, Data Transformation and Data Loading with Informatica PowerCenter/PowerExchange/PowerMart 9.x/8.x/7.1.

Experience in IT Analysis, Design, Testing, Development and Implementation of client/server applications.

Experience in Development and Design of ETL methodologies using Informatica Power Center in various domains like Health Care, Insurance and Financial sectors.

Worked intensively with Informatica PowerCenter Designer. Strong knowledge of Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer and Mapping Designer.

Extensively used Informatica Workflow manager and Workflow monitor for creating and monitoring workflows, worklets and sessions.

Extensively worked on Informatica PowerCenter Transformations such as Source Qualifier, Lookup, Filter, Expression, Router, Normalizer, Joiner, Update Strategy, Rank, Aggregator, Stored Procedure, Sorter, Sequence Generator and XML Source Qualifier.

Created Rules, Profiles and Graphs as part of Data Profiling using Informatica Data Quality Tool (IDQ).

Extensively worked on Informatica Developer and Informatica Analyst tools of IDQ.

Identified the Facts and Dimensions using Erwin Data modeling tool to represent the Star Schema Data Marts.

Strong in database concepts with PL/SQL, Oracle, DB2, NETEZZA and Teradata.

Skills in creating Test Plan from Functional Specification, and Detailed Design Documents.

Experience in writing Test plans, Test cases, Unit testing, System testing, Integration testing and Functional testing.

Experience in UNIX Shell Scripting, Control-M, Autosys and Dollar Universe for scheduling the jobs.

Worked on report analysis and development in OBIEE.

Hands on experience in performance tuning in Informatica mappings, sessions and workflows.

Expertise in Developing Datastage jobs in Parallel Extender using different stages like Transformer, Aggregator, Lookup, Source Dataset, External Filter, Row Generator, and Column Generator.

Experience in Onsite-Offshore model.

Exposure to full project life cycle development for implementation and integration.

Hands-on experience in Production Support and Maintenance, resolving issues to meet all Data Warehouse Service Level Agreements (SLAs).

Education

Bachelor of Technology from JNTU

Technical Skills

Data Warehousing

Informatica PowerCenter 9.0/8.6.1/8.5/8.1.1/7.1.2/7.1.1/6.1/5.1.2, PowerConnect for Mainframe/SAP/PeopleSoft/MQSeries, PowerExchange, Informatica PowerMart 6.2/5.1.2/5.1.1/5.0/4.7.2, IBM DataStage 8.0/7.5.2/7.5.1/7.0/6.0/5.2 (Designer, Director, Manager, Administrator), SQL*Loader, Flat Files (Fixed, CSV, Tilde-Delimited, XML, COBOL, AS/400)

Dimensional Data Modeling

Dimensional Data Modeling, Star Schema Modeling, Snow-Flake Modeling, FACT and Dimensions Tables, Physical and Logical Data Modeling.

Database and related tools

Oracle 10g/9i/8i/8/7.x, MS SQL Server 2000/7.0/6.5, Teradata, Sybase ASE, PL/SQL, TOAD 8.5.1/7.5/6.2, DB2 UDB

BI & Reporting Tools

Business Objects and OBIEE

Languages

C, SQL, PL/SQL, SQL*Plus, Unix Shell Scripting, Batch Scripting

Operating Systems

Microsoft Windows XP/NT/2000/98/95, UNIX, Sun Solaris 5

PROFESSIONAL EXPERIENCE

Optum Insight - Eden Prairie, MN Aug 2014 – Present

Role: Senior Informatica Developer

Responsibilities:

Analyzed Entity-Relationship concepts, fact and dimension tables, slowly changing dimensions and Dimensional Modeling (Star Schema and Snowflake Schema).

Worked in designing/developing mapping using Source Qualifier, Lookup, Filter, Expression, Router, Update Strategy, Rank, Sequence Generator etc.

Analyzed the source data coming from Oracle, Teradata RDBMS and Flat files to create the Source to Target Data Mapping.

Wrote Teradata utility scripts using FastLoad, MultiLoad, BTEQ and TPump to load the data.
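A load step of this kind can be sketched as a shell wrapper that generates and submits a BTEQ script. The logon string, table and file names below are hypothetical, and the `bteq` call is guarded so the sketch still runs where the Teradata client is not installed:

```shell
#!/bin/sh
# Sketch: generate a BTEQ script that loads a staging table from a
# pipe-delimited flat file, then submit it if bteq is available.
# All identifiers (logon, table, file) are illustrative placeholders.

BTQ=load_stage_customer.btq

cat > "$BTQ" <<'EOF'
.LOGON tdprod/etl_user,etl_pass;
.IMPORT VARTEXT '|' FILE = customer_extract.dat;
.QUIET ON
.REPEAT *
USING (cust_id VARCHAR(10), cust_name VARCHAR(50))
INSERT INTO stg.customer (cust_id, cust_name)
VALUES (:cust_id, :cust_name);
.LOGOFF;
.QUIT;
EOF

# Submit only where the Teradata client exists.
if command -v bteq >/dev/null 2>&1; then
    bteq < "$BTQ" > load_stage_customer.log 2>&1
else
    echo "bteq not found; generated $BTQ only"
fi
```

In practice FastLoad or MultiLoad would be preferred for large volumes; BTEQ suits small, scripted loads like this one.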

Worked with DBAs on optimizing major SQL queries as part of performance tuning.

Developed Workflows, Worklets, Sessions and Tasks to effectively manage and control the load.

Extensively used Informatica Data Quality (IDQ) tools for Data Analysis, Data Profiling and Data Governance.

Identified and eliminated duplicates in datasets through IDQ.

Developed mappings and workflows which supported the mappings in Power Center 9.6.1, IDQ.

Strong Teradata SQL experience developing ETL with complex tuned queries, including analytical functions and BTEQ scripts.

Worked on Query Optimization and ETL performance tuning for improving the performance of the data warehouse.

Extensively worked on Informatica PowerCenter 9.6.1 and developed various mappings using different transformations.

Used Informatica Data Quality for data quality measurement.

Performed column and rule profiling using IDQ tool (Informatica Analyst).

Designed and developed Technical and Business Data Quality rules in IDQ (Informatica Developer) and created the Score Card to present it to the Business users for a trending analysis (Informatica Analyst).

Exported the IDQ Mappings and Mapplets to power center and automated the scheduling process.

Supported the BI team by extracting operational data from multiple sources, merging and transforming the data to facilitate enterprise-wide reporting and analysis and delivering the transformed data to coordinated data marts.

Applied business rules using Informatica Data Quality (IDQ) tool to cleanse data.

Provided data loading, monitoring, system support and general trouble shooting necessary for all the workflows involved in the application during its production support phase.

Extensively developed Oracle PL/SQL stored procedures using TOAD to implement different business logics and perform backend testing (Oracle 9i) to verify the transactions and business logic.

Created and tested UNIX shell scripts to automate the execution of SQL scripts and stored procedures.
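An automation wrapper of this kind might look like the sketch below. The connect string, script name and the availability of `sqlplus` are assumptions, so the wrapper supports a dry-run mode that only prints the command it would issue:

```shell
#!/bin/sh
# Sketch: shell wrapper that submits a SQL script via sqlplus and scans
# the spool log for Oracle errors. Connection details, script and log
# names are hypothetical placeholders.

SQL_SCRIPT=${SQL_SCRIPT:-daily_load.sql}
LOGFILE=${LOGFILE:-daily_load.log}
CONNECT=${CONNECT:-etl_user@DWHPROD}    # hypothetical connect string

run_sql() {
    if [ "${DRY_RUN:-0}" = "1" ]; then
        # Dry run: print the command instead of executing it.
        echo "sqlplus -S $CONNECT @$SQL_SCRIPT > $LOGFILE"
        return 0
    fi
    sqlplus -S "$CONNECT" @"$SQL_SCRIPT" > "$LOGFILE" 2>&1
    # Treat any ORA- error in the spool log as a failed load.
    if grep -q 'ORA-' "$LOGFILE"; then
        echo "Load failed; see $LOGFILE" >&2
        return 1
    fi
}

DRY_RUN=1 run_sql    # demonstrate the dry-run path
```

A scheduler such as Control-M or Autosys would then call this wrapper and act on its exit code.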

Extracted data from DB2 databases for ETL processing.

Environment: Informatica Power Center 9.6.1/9.5.1/8.6.1, IDQ 8.6.1, Teradata 12/13, DB2, Oracle 10g, SQL Assistant, XML, UNIX Shell Scripting, Control M 7/8.

MBFS - Farmington Hills, MI Oct 2013 – Aug 2014

Role: Senior ETL Developer

Responsibilities:

Responsible for gathering user requirements and discussing them with Business Analysts to acquire the functional and technical specifications.

Analyzed business requirements and worked closely with various application teams and business teams to develop ETL procedures that are consistent across applications and systems.

Used Meta Data Manager to gather data from different applications and flat files for users to do impact analysis of the changes in the source systems.

Worked with the Informatica Data Quality toolkit and performed analysis, data cleansing, data matching, data conversion, exception handling, and reporting and monitoring using the capabilities of IDQ 8.6.0.

Expertise in writing scripts for Data Extraction, Transformation and Loading of data from legacy systems to the target data warehouse using BTEQ, FastLoad, MultiLoad and TPump.

Designed, developed, implemented and maintained Informatica Power Center and IDQ 8.6.0 application for matching and merging process.

Unit testing and System Testing of Informatica mappings and Workflows.

Used IDQ tools to Analyze, Cleanse the data and created scorecards.

Used Informatica Data Explorer (IDE) for data profiling over metadata and Informatica Data Quality (IDQ 8.6.0) for data quality measurement.

Developed ETL code, control files, metadata, and lineage diagrams for ETL programs.

Created and configured Workflows, Worklets, and Sessions to transport the data to target using Informatica Workflow Manager.

Extensively involved in performance tuning at source, target, mapping, session and system levels by analyzing the reject data.

Maintained Development, Test and Production mapping migration using Repository Manager.

Responsible for identifying the bottlenecks and tuning the performance of the Informatica mappings/sessions.

Used parallel processing capabilities, Session-Partitioning and Target Table partitioning utilities.

Extensively performed unit testing and system or integration testing.

Generated PL/SQL scripts and UNIX Shell scripts for automated daily load processes.

Environment: Informatica Power Center 8.6.1/8.1.1, Teradata 12, DB2 UDB, UNIX KSH Scripting, Windows 2000, Power Exchange 8.6.1, IDE 8.6.0, IDQ 8.6.0.

AIG - Basking Ridge, NJ Oct 2012 – Sep 2013

Role: Informatica Developer

Responsibilities:

Worked closely with data modelers on Sybase Power designer for dimensional modeling (Star schema).

Used Informatica 8.6.0 to extract, transform and load data from multiple input sources like flat files, SQL Server into Sybase ASE and IQ database.

Created Mappings, Mapplets and Transformations using the Designer and developed Informatica sessions as per the business requirement.

Worked on Source Analyzer, Warehouse Designer, Mapping and Mapplet Designer and Transformations, Informatica Repository Manager, Workflow Manager and Monitor.

Performed application-level DBA activities such as creating tables and indexes, and monitored and tuned Teradata BTEQ scripts using the Teradata Visual Explain utility.

Worked with different types of partition techniques like Key Range, Pass Through, Round Robin and Hash Partitioning.

Performed column and rule profiling using IDQ tools and was involved in data cleaning.

Generated Reusable Transformations, Mapplets and used them extensively in many mappings.

Developed scripts to load the data from source to staging and from the staging area to target tables using the FastLoad, MultiLoad and BTEQ utilities of Teradata.

Administered Workflow Manager to run Workflows and Worklets.

Involved in performance tuning on Informatica mappings.

Used Workflow Manager for creating, validating, testing and running sequential and concurrent batches and sessions, and for scheduling them to run at specified times with the required frequency.

Created and used different tasks like Decision, Event Wait, Event Raise, Timer and E-mail etc.

Extensively utilized the Informatica data cleansing tool IDQ and Data explorer IDE.

Created Database Triggers, Stored Procedures, Exceptions and used Cursors to perform calculations when retrieving data from the database.

Unit Testing and System Testing of Informatica Mappings and Workflows.

Implemented Error Handling Logic, which involves testing incorrect input values for the mappings and the means of handling those errors.

Responsible for Folder Migration, Folder Creation and creating user accounts in the Repository Manager, and for management of passwords and permissions.

Used shell scripts that compare the incoming file with the existing file, create a log file of the differences, and load the changed values into the target.
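A compare-and-load step like the one described in this bullet can be sketched in shell; the file paths and sample records below are hypothetical placeholders:

```shell
#!/bin/sh
# Sketch: diff today's incoming extract against the previously
# processed copy, log the differences, and keep only the changed or
# new records for loading. Paths and sample data are illustrative.

mkdir -p incoming archive logs stage

INCOMING=incoming/customers_today.dat
EXISTING=archive/customers_prev.dat
DIFFLOG=logs/customers_diff.log
DELTA=stage/customers_delta.dat

# Sample records standing in for real extract files.
printf '1|ALICE|NY\n2|BOB|CA\n3|CAROL|TX\n' > "$EXISTING"
printf '1|ALICE|NY\n2|BOB|WA\n4|DAVE|FL\n'  > "$INCOMING"

# Record line-level differences for audit purposes.
diff "$EXISTING" "$INCOMING" > "$DIFFLOG" || true

# Lines present only in the incoming file are the changed/new records.
sort "$EXISTING" > "$EXISTING.sorted"
sort "$INCOMING" > "$INCOMING.sorted"
comm -13 "$EXISTING.sorted" "$INCOMING.sorted" > "$DELTA"

echo "records to load: $(grep -c '' "$DELTA")"   # prints "records to load: 2"
```

The delta file would then feed the Informatica session, so only changed rows reach the target.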

Environment: Informatica Power Center 8.6.1/8.1.1, Teradata, Sybase ASE, UNIX KSH Scripting, Windows 2000, Power Exchange 8.6.1, IDE 8.6.0, IDQ 8.6.0, Autosys.

Mindtree - Hyderabad, India Feb 2010 – Sept 2012

Role: Informatica Developer

Responsibilities:

Analyzed business requirements, analytics and strategies to improve the business.

Identified the Facts and Dimensions using Erwin Data modeling tool to represent the Star Schema Data Marts.

Coordinated the offshore team; planned and reviewed deliverables.

Responsible for documenting user requirements and translating requirements into system solutions.

Used shell scripting to check whether the flat files arrived on the correct path and to trigger the Informatica jobs from the UNIX server.

Evaluated the consistency and integrity of the model and repository.

Designed and built Data Marts for home and business divisions.

Installed and configured Informatica Server and Power Center 7.2.

Worked on Metadata changes to the Informatica repository.

Responsible for Data Import/Export, Data Conversions and Data Cleansing.

Created Informatica mappings with SQL procedures to build business rules to load data.

Used Transformations for data joins, complex aggregations and external procedure calls.

Used various transformations to implement simple and complex business logic, including Remote Procedure, Connected & Unconnected Lookup, Router, Expression, Source Qualifier, Aggregator, Filter and Sequence Generator transformations.

Integrated sources from different databases and flat files.

Designed mappings with multiple sources using Informatica Designer tool.

Implemented various performance tuning concepts to enhance job performance.

Performed logical and physical warehouse design using the Erwin tool.

Extensively used Star schema for designing the data warehouse.

Designed and Developed the ETL mapping and load strategy documents.

Designed the ETL Mapping Documents for development team members for their implementations.

Worked on Informatica - Source Analyzer, Data Warehousing Designer, Mapping Designer & Mapplets, and Transformations.

Designed and developed Informatica mappings for data loads and data cleansing using IDQ.

Designed and developed the workflows and worklets for all load processes.

Worked on OBIEE report enhancements.

Environment: Informatica Power Center 7.2/8.1, Power Exchange 8.6.1, IDE 8.6.0, IDQ 8.6.0, Teradata, TOAD, XML, Windows NT, Erwin, OBIEE.

Infotech - Hyderabad, India Nov 2008 – Jan 2010

Role: ETL Developer

Responsibilities:

Analyzed business requirements and worked closely with various application teams and business teams to develop ETL procedures that are consistent across all application and systems.

Documented technical specifications, business requirements and functional specifications for the development of Informatica mappings to load data into various tables, and defined ETL standards.

Worked on Informatica Power Center 7.1.3 tool - Source Analyzer, warehouse designer, Mapping Designer & Mapplets, and Transformations.

Created and configured Workflows, Worklets, and Sessions to transport the data to target using Informatica Workflow Manager.

Extensively involved in performance tuning at source, target, mapping, session and system levels by analyzing the reject data.

Maintained Development, Test and Production mapping migration using Repository Manager.

Unit testing and System Testing of Informatica mappings and Workflows.

Generated PL/SQL scripts and UNIX Shell scripts for automated daily load processes.

Extensively worked in Oracle SQL Query performance tuning and created DDLs, database objects like Tables, Indexes and Sequences etc., by working closely with DBAs.

Developed several forms and reports in the process. Also converted several standalone PL/SQL procedures/functions into packaged procedures for code reusability, modularity and control.

Designed tables, indexes and constraints using TOAD and loaded data into the database using SQL*Loader.
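A SQL*Loader load of this kind is driven by a control file; the sketch below generates one and submits it only where the Oracle client exists. The table, column and file names are hypothetical:

```shell
#!/bin/sh
# Sketch: build a SQL*Loader control file for a pipe-delimited extract
# and invoke sqlldr if available. Identifiers are illustrative.

CTL=load_employee.ctl

cat > "$CTL" <<'EOF'
LOAD DATA
INFILE 'employee_extract.dat'
APPEND INTO TABLE stg_employee
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
(emp_id, emp_name, dept_id, hire_date DATE 'YYYY-MM-DD')
EOF

# Submit only where the Oracle client tools are installed.
if command -v sqlldr >/dev/null 2>&1; then
    sqlldr userid=etl_user@DWHPROD control="$CTL" log=load_employee.log
else
    echo "sqlldr not found; generated $CTL only"
fi
```

The generated log and bad files would then be checked before downstream transformations run.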

Worked on OBIEE reports to do enhancements.

Tuned Informatica jobs with Oracle as the backend database.

Environment: Informatica 7.1.3, Business Objects, Oracle 8.1.7.4, SQL*Plus, PL/SQL, TOAD 7.1, UNIX, Windows XP, OBIEE


