
Ravi Reddy

Sr. Informatica Developer

Email: acumed@r.postjobfree.com | Mobile: 469-***-****

CORE SKILLS:

●Informatica PowerCenter, Informatica transformations, OLAP/OLTP concepts

●ETL/data integration, Star and Snowflake schemas, performance tuning

●SQL, Oracle, Control-M and Informatica Scheduler, Slowly Changing Dimensions

●SSIS, SSRS, SQL & PL/SQL, data modeling

●Informatica Data Quality (IDQ), Informatica performance tuning, UNIX commands

PROFESSIONAL SUMMARY:

●6 years of IT experience in requirement analysis, design, development, and implementation of data warehousing and data mart projects using the ETL tool Informatica.

●Work experience in Data Quality (data cleansing and conversion), system integration, and data migration through ETL flows, with good exposure to ETL and Data Quality processes using Informatica Data Quality (IDQ) and Informatica PowerCenter.

●ETL/data integration experience using Informatica PowerCenter 9.x/8.x/7.x.

●Experience in extracting data from source systems such as Oracle, SQL Server, Teradata, and MS Access, and from non-relational sources such as flat files.

●Experience working with business analysts to identify and understand requirements and translate them into ETL requirement documents during the requirement analysis phase.

●Used Change Data Capture (CDC) techniques such as slowly changing targets (Type 1, Type 2, and Type 3), slowly growing targets, and simple pass-through mappings using PowerCenter.

●Well versed in SQL*Loader, packages, triggers, PL/SQL development, and stored procedure tuning.

●Hands-on experience interacting with clients and gathering requirements for modules and enhancements.

●Experience in creating high-level design and detailed design documentation during the design phase.

●Good knowledge of the full ETL life cycle in Informatica PowerCenter (Repository Manager, Mapping Designer, Workflow Manager, and Workflow Monitor).

●Expertise in using Informatica transformations such as Source Qualifier, Expression, Filter, Router, Rank, Sorter, Aggregator, Joiner, Lookup, Update Strategy, and Sequence Generator.

●Extensively worked with the PowerCenter client tools (Repository Manager, Mapplet Designer, Mapping Designer, Workflow Manager, and Workflow Monitor) to extract, transform, and load data.

●Efficient in troubleshooting, performance tuning, and debugging: identifying and resolving performance bottlenecks at the source, target, mapping, and session levels.

●Used Informatica Designer to build mappings and Workflow Manager to create processing tasks that move data from multiple sources into targets.

●Worked on Slowly Changing Dimensions (SCDs) and implemented Type 1 and Type 2 (flag and timestamp); see the SQL sketch at the end of this summary.

●Used Maestro and Informatica Scheduler for scheduling batch cycles.

●Extensively worked on debugging Informatica mappings, mapplets, sessions, and workflows.

●Good understanding of Third Normal Form, Star and Snowflake schemas, dimensional modeling, and data marts.

●Knowledge of Teradata utilities such as FastLoad, MultiLoad, and BTEQ for loading data from various source systems into Teradata.

●Successfully implemented a Single Sign-On solution for Oracle RMS and other enterprise applications using the Oracle IDM Suite (OAM, OIM, PIM, and API Gateway).

●Successfully implemented medium- to large-scale Oracle applications, including databases (Oracle RMS, Oracle ERP, Oracle SOA, BPM, and IDM Suite), in HA and clustered infrastructures.

●Hands-on expertise in Master Data Management (MDM) and legacy data migrations in the ERP domain. Have applied methodologies such as Agile (sprints), RUP, and Waterfall in project planning, functional analysis and documentation, customer master data migrations (MDM), legacy-to-active conversions, and gap analysis.

●Experienced in technology-oriented modules such as BW/BI, LSMW, Workflow, ILM-Archiving, IXOS (OpenText), ALE, Security, SOLMAN, ChaRM, and HANA.

●Experience in core banking applications such as Flexcube 12.0.1, ACBS 6.0, SWIFT, PEGA payments and customers, EastNet, Fedlink, Actimize, SSB, Business Objects reports, and BI reports.

●Implemented Canada branch operations for the Money Market, Foreign Exchange, Funds Transfer (single-currency and cross-currency), and SWIFT payments modules.

●Supported the Business Objects reporting team by creating a BO universe according to end-user requirements.

●Wrote SQL queries to create end-user reports and developed SQL queries and stored procedures in support of ongoing work and application support.

●Experience in post-production support, job monitoring using scheduling tools, and incident management tools (ServiceNow).

●Expertise in documenting the ETL process, status reports, and meeting minutes.

●Experience in creating reusable transformations to facilitate rapid development.

●Experienced in UNIX work environments, file transfers (SFTP), job scheduling, and error handling.

●Extensive functional and technical exposure; experience working on high-visibility projects.

●Excellent analytical and communication skills; a good team player.

●An Oracle/data warehouse architect, data architect, data modeler, enterprise data architect, and Oracle DBA with rich experience maintaining databases of different sizes.

●Recently provided data architecture, database re-engineering, and scalability/performance contract/consulting services on IBM z/OS, z/VM/VSE, and pSeries/AIX platforms.

●Skilled in logical and physical database design; experienced in logical data modeling, including business rules.

●Worked on EDW/MDM, building data marts with Erwin and Erwin Model Mart in a multi-user environment for normalized and dimensional data structures.

●Implemented and maintained web and distributed enterprise applications using JavaScript, HTML, CSS, JSP, REST, JSON, jQuery, WCAG, and AJAX, following W3C web standards.
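The following is a minimal, illustrative Oracle SQL sketch of the SCD Type 2 (flag and timestamp) pattern referenced above. The CUSTOMER_STG/CUSTOMER_DIM tables, their columns, and the sequence are hypothetical stand-ins, not objects from any of the projects below.

-- Hypothetical staging table CUSTOMER_STG and Type 2 dimension CUSTOMER_DIM.
-- Step 1: expire the current dimension row when a tracked attribute changes.
UPDATE customer_dim d
   SET d.current_flag = 'N',
       d.effective_end_dt = SYSDATE
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM customer_stg s
                WHERE s.customer_id = d.customer_id
                  AND (s.address <> d.address OR s.phone <> d.phone));

-- Step 2: insert a fresh current row for new customers and for the ones
-- whose current row was just expired in Step 1.
INSERT INTO customer_dim
       (customer_key, customer_id, address, phone,
        effective_start_dt, effective_end_dt, current_flag)
SELECT customer_dim_seq.NEXTVAL, s.customer_id, s.address, s.phone,
       SYSDATE, TO_DATE('9999-12-31', 'YYYY-MM-DD'), 'Y'
  FROM customer_stg s
 WHERE NOT EXISTS (SELECT 1
                     FROM customer_dim d
                    WHERE d.customer_id = s.customer_id
                      AND d.current_flag = 'Y');

In a PowerCenter mapping the same logic is typically built with a Lookup on the dimension, an Expression to compare attributes, and Update Strategy transformations flagging DD_UPDATE for the expiring row and DD_INSERT for the new one.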

TECHNICAL EXPERIENCE:

ETL Tools: Informatica PowerCenter 9.x/8.x

Databases: Oracle, SQL Server, Teradata

Database Tools & Utilities: SQL*Plus, SQL Developer, SSMS, Teradata SQL Assistant, MLOAD, FLOAD, BTEQ

Other Tools: MS Office, PuTTY, Archiving/ILM

Operating Systems: UNIX, Linux, Windows

Programming Skills: UNIX shell scripting, SQL, SQL*Plus, PL/SQL

Scheduling & ERP Tools: Maestro, Informatica Scheduler, Citrix client

PROFESSIONAL EXPERIENCE:

PROJECT - I

Sr. Informatica Developer

Walt Disney World, Orlando, FL Jan’14-Till date

Walt Disney World is one of the largest entertainment companies, with a wide variety of theme parks and resorts. It operates 27 resort hotels in the USA, spanning from Orlando on the east coast all the way to Hawaii. The company also runs Disney Cruise Line, one of the leading cruise lines in the world.

Project Description:

D3 (Disney Data Difference) is Disney's enterprise data warehouse, built to serve analytical and reporting applications across the company, such as resort reservations, merchandising, theme park attendance, and DCL. I was involved in converting the resort reservation (LILO) application from a Python-based ETL to the Informatica ETL tool, as well as integrating DCL data into D3.

Tools used: Oracle 11g, Teradata, SQL, Informatica 9.6, UC4, UNIX

Responsibilities:

●Analyzed requirements and designed the documentation.

●Analyzed the current Python-based ETL application, written using BTEQs, to understand the ETL transformations (a representative BTEQ pattern is sketched after this list).

●Designed the ETL transformations for the Informatica rewrite, based on the source-to-target mapping documents provided by business analysts and analysis of the current application.

●Integrated data from various sources such as Oracle, DB2, and flat files.

●Effectively worked with Mapping Designer, Workflow Manager, and Workflow Monitor.

●Extensively used Sequence Generator transformations in mappings; fixed bugs and tickets raised in production against existing mappings in the common folder using versioning (check-in and check-out), and urgently supported QA in component unit testing and validation.

●Used shortcuts for sources, targets, transformations, mapplets, and sessions to reuse objects without creating multiple copies in the repository and to inherit changes made to the source automatically.

●Applied Slowly Changing Dimensions Type I and Type II based on business requirements.

●Effectively interpreted session error logs and used the Debugger to test mappings and fix bugs in DEV, following change procedures and validation.

●Fine-tuned ETL processes by addressing mapping and session performance issues.

●Responsible for creating workflows and worklets; created Session, Event, Command, Control, Decision, and Email tasks in Workflow Manager.

●Extensively worked on performance tuning, including isolating headers and footers in a single file.

●Worked with large amounts of data, independently executing data analysis using appropriate tools and techniques, interpreting results, and presenting them to both internal and external clients.

●Used Informatica PowerExchange to replicate data from the source database to staging.

●Wrote SQL queries to create end-user reports and developed SQL queries and stored procedures in support of ongoing work and application support.

●Designed and executed test scripts and test scenarios, reconciling data between multiple data sources and systems.

●Involved in requirement gathering, design, testing, project coordination, and migration.

●Handled project planning and scoping and facilitated meetings for project phases, deliverables, escalations, and approvals; ensured adherence to the SDLC and the project plan.

●Raised change requests, analyzed and coordinated resolution of program flaws, and fixed them in the DEV and pre-production environments, during subsequent runs, and in PROD.

●Performed profiling analysis on existing data, identified root causes of data inaccuracies, performed impact analysis, and made data quality recommendations.

●Precisely documented mappings in the ETL technical specification document at all stages for future reference.

●Scheduled jobs for daily, weekly, and monthly loads through Control-M, with each workflow in a sequence with Command and Event tasks.

●Created requirement specification documents, user interface guides, functional specification documents, ETL technical specification documents, and test cases.

●Used most of the transformations, such as Aggregator, Filter, Router, Sequence Generator, Update Strategy, Rank, Expression, and Lookup (connected and unconnected), as well as mapping parameters, session parameters, mapping variables, and session variables.

●Deployed code to the different environments: ST, SIT, UAT, and PROD.

●Maintained proper communication between other teams and the client.
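For illustration, a minimal Teradata BTEQ sketch of the kind of BTEQ-based load that was analyzed during the Python-to-Informatica conversion mentioned above. The TDPID, credentials, and the stg/dw table names are hypothetical placeholders, not the actual D3 objects.

/* Hypothetical TDPID, credentials, and stg/dw tables. */
.LOGON tdprod/etl_user,password

/* Refresh target rows from a delta staging table inside one transaction. */
BT;

DELETE FROM dw.reservation
 WHERE reservation_id IN (SELECT reservation_id
                            FROM stg.reservation_delta);

INSERT INTO dw.reservation
SELECT reservation_id, guest_id, resort_cd, checkin_dt, checkout_dt
  FROM stg.reservation_delta;

ET;

/* Abort with a non-zero return code so the scheduler flags the failure.
   A production script would typically check ERRORCODE after each step. */
.IF ERRORCODE <> 0 THEN .QUIT 8

.LOGOFF
.QUIT 0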

Environment: SQL, PL/SQL, UNIX, shell scripting, HP Quality Center 10, Informatica PowerCenter 9.x, Control-M.

PROJECT - II

Sr. Informatica ETL Developer

Coloplast, Minneapolis, MN (Health Care) Mar’13-Dec’14

Project Description:

Coloplast Informatica & Siebel Analytics Upgrade:

Coloplast develops products and services that make life easier for people with very personal and private medical conditions. Working closely with the people who use its products, Coloplast creates solutions that are sensitive to their special needs; the company calls this intimate healthcare. Coloplast's business includes ostomy care, urology and continence care, and wound and skin care. This was an upgrade and data conversion project in which the data warehouse was migrated from OBIA 7.8.4 to OBIA 7.9.6.3 and from Informatica 7.x to Informatica 9.0.1.

Responsibilities:

●Developed complex mappings in Informatica to load data from source tables using transformations such as Source Qualifier, Lookup (connected and unconnected), Expression, Aggregator, Update Strategy, Stored Procedure, Joiner, XML, Filter, Sorter, and Router.

●Installed Informatica 9.0.1 Hotfix 2 environments and backed up 7.8.4 contents.

●Created a sandbox environment for regression/load testing.

●Backed up and restored the Informatica 7.8.4 repository.

●Created unit test cases along with test data for testing the code.

●Migrated folders from the old repository to the upgraded repository.

●Created OBIEE reports and user prompts.

●Monitored the DAC scheduler.

●Performed the end-to-end warehouse upgrade from Siebel Analytics 7.8.4 to OBIA 7.9.6.3.

●Assigned work to the offshore team and monitored status regularly.

●Migrated data from obsolete tables to the upgraded tables using the vanilla UPG Informatica repository.

●Configured the Infa_sequence_generator.bat file during data migration.

●Prepared unit test cases and validated data.

●Retrofitted Informatica mappings to replace obsolete tables.

●Ran full data warehouse loads using the DAC scheduler, monitoring loads and fixing issues as needed.

●Created SCD-type mappings using different transformations.

●Worked on data masking, hiding the original data by encrypting it and using alias names at the object level (see the SQL sketch after this list).

●Performed data analysis from the Siebel source through the warehouse to the BI front end.

●Performed unit testing, surface testing, integration testing, and sandbox testing.

●Carried out dry-run activities before go-live, plus support activities.

●Developed mapplets to implement business rules involving complex logic.

●Tuned mappings and sessions for better performance by eliminating various performance bottlenecks.
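A minimal SQL sketch of the object-level masking approach mentioned in the data-masking bullet above: non-privileged users are granted a view that exposes alias names and obfuscated values rather than the base table. The patient table, its columns, and the role name are hypothetical.

-- Hypothetical base table: patient(patient_id, full_name, ssn, diagnosis_cd).
CREATE OR REPLACE VIEW patient_masked AS
SELECT patient_id,
       SUBSTR(full_name, 1, 1) || '*****'  AS patient_alias,  -- alias, not the real name
       'XXX-XX-' || SUBSTR(ssn, -4)        AS ssn_masked,     -- expose only the last 4 digits
       diagnosis_cd
  FROM patient;

-- Reporting users see only the masked view, never the base table.
GRANT SELECT ON patient_masked TO reporting_role;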

Environment: Informatica PowerCenter 9.0.1, SQL, PL/SQL, Toad, MS SQL Server 2008, DAC 10.1.3.4.1, OBIA 7.9.6.3

PROJECT - III

Informatica Developer/SME

CAPITALONE, Richmond, VA Mar’12-Jan’13

Capital One is a diversified financial services company offering a broad array of credit, savings, and loan products to customers in the United States, UK, and Canada. Capital One is a financial holding company whose principal subsidiaries are Capital One Bank; Capital One, F.S.B.; Capital One Auto Finance Inc.; and Hibernia National Bank. Capital One offers a variety of consumer lending and deposit products, including credit cards, auto loans, small business loans, home equity loans, installment loans, and savings products. Capital One has emerged as one of America's largest consumer franchises, with almost 50 million customer accounts and one of the nation's most recognized brands, ranked #206 among the Fortune 500 companies with a global customer base of 48.9 million.

Project Description:

Capital One - IPS DDE (Integrated Production Support Data Distribution Environment) Platform. DDE is the Enterprise Integration Layer (EIL) Platform for the integration of services to support bulk data movement from sources to targets. The DDE enables a comprehensive, standard, and consistent approach to Extract, Transform, and Load (ETL) processing while fulfilling the additional ETL requirements defined for the Capital One analytical and operational environments.

Responsibilities:

●Created complex mappings using transformations such as SQL transformation, Update Strategy, Joiner, Lookup, and Sequence Generator, along with reusable sessions and mapplets.

●Created mapping, session, and workflow parameters and variables; designed an intraday incremental load process using a mapping variable.

●Implemented various transformations: Joiner, Sorter, Aggregator, Expression, Lookup, Filter, Update Strategy, and Router.

●Developed and scheduled jobs for daily loads that invoked the Informatica workflows and sessions associated with the mappings.

●Modified existing mappings for updated logic and better performance.

●Avoided duplicate issues through SQL overrides used with Lookup and Joiner transformations (see the SQL sketch after this list).

●Performed unit testing on the mappings based on the HLD, LLD, and developed code, and developed plans based on pending-transaction strategies for credit cards.

●Tested Informatica mappings, workflows, and PL/SQL procedures, and worked extensively with Informatica for data sourcing, data transformation, and data loading.

●Provided SME (Subject Matter Expert) support for the Capital One enterprise data warehouse application.

●Responded to all business queries; standardized application production support across all Capital One business applications.

●Handled enhancement and maintenance of the applications used by banking clients.

●Institutionalized processes that are continuously improved and optimized by process action teams.

●Prepared response strategies for quick resolution of breakdowns.

●Created test case scenarios, executed test cases, and tracked defects in internal bug tracking systems.

●Used error handling to capture error data in tables for null handling, analysis, and error remediation.

●Involved in massive data cleansing and data profiling of the production data load.

●Performed defect management, bug tracking, and bug reporting using Quality Center.

●Performed system maintenance and other application-related activities.
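A minimal sketch of the duplicate-avoidance technique in the SQL-override bullet above: a Source Qualifier override that keeps only the most recent row per business key before the data reaches Joiner or Lookup transformations. The stg_account table and its columns are hypothetical.

-- Keep exactly one row per account_id (the latest by last_update_ts),
-- so downstream joins and lookups never see duplicates.
SELECT account_id, account_status, balance, last_update_ts
  FROM (SELECT account_id, account_status, balance, last_update_ts,
               ROW_NUMBER() OVER (PARTITION BY account_id
                                  ORDER BY last_update_ts DESC) AS rn
          FROM stg_account)
 WHERE rn = 1

In a PowerCenter override, the selected columns must match the Source Qualifier ports in number, order, and type, and the trailing semicolon is usually omitted.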

Environment: SQL, PL/SQL, Teradata, UNIX, shell scripting, HP Quality Center 10, Informatica PowerCenter 9.x

PROJECT - IV

ETL Informatica Developer

CAPITAL ONE, Chennai, INDIA Jan’10-Feb’12

Project Description:

Capital One - IPS DDE stands for Integrated Production Support Data Distribution Environment. It is one of the services provided by Capital One to deliver continuous and predictable business application performance that consistently meets customer expectations. DDE is the Enterprise Integration Layer (EIL) platform for the integration of services supporting bulk data movement from sources to targets. The DDE enables a comprehensive, standard, and consistent approach to Extract, Transform, and Load (ETL) processing while fulfilling the additional ETL requirements defined for the Capital One analytical and operational environments.

Responsibilities:

●Involved in requirement gathering, design, testing, project coordination, and migration.

●Used most of the transformations, such as Aggregator, Filter, Router, Sequence Generator, Update Strategy, Rank, Expression, and Lookup (connected and unconnected), as well as mapping parameters, session parameters, mapping variables, and session variables.

●Created mappings using transformations such as Source Qualifier, Aggregator, Expression, Lookup, Router, Filter, and Update Strategy.

●Implemented Slowly Changing Dimensions Type I and II in different mappings as per requirements.

●Scheduled and ran extraction and load processes and monitored workflows using Workflow Monitor.

●Responsible for creating workflows and worklets; created Session, Event, Command, Control, Decision, and Email tasks in Workflow Manager.

●Maintained unit test cases, performed system testing on the mappings and sessions, and observed workflow execution from Workflow Monitor.

●Created requirement specification documents, user interface guides, functional specification documents, ETL technical specification documents, and test cases.

●Understood the business requirements, source systems, and the different logical models in order to build knowledge of the project's extract, transformation, and loading specifications.

●Used ETL to standardize data from various sources, load it into the staging area (in Oracle), and move it from staging to the different data marts.

●Handled error log design, data load strategy, unit and system testing, system migration, and job schedules (see the PL/SQL error-log sketch after this list).

●Fine-tuned ETL processes by addressing mapping and session performance issues.
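A minimal PL/SQL sketch of the error-log design mentioned above: a small log table plus an autonomous-transaction procedure that load code can call, so the error row survives even if the load itself rolls back. The table, sequence, and procedure names are hypothetical.

-- Hypothetical error-log table and sequence (Oracle 11g style).
CREATE SEQUENCE etl_error_log_seq;

CREATE TABLE etl_error_log (
    log_id    NUMBER        PRIMARY KEY,
    job_name  VARCHAR2(100),
    err_code  NUMBER,
    err_msg   VARCHAR2(4000),
    logged_at TIMESTAMP     DEFAULT SYSTIMESTAMP
);

CREATE OR REPLACE PROCEDURE log_etl_error (
    p_job_name IN VARCHAR2,
    p_err_code IN NUMBER,
    p_err_msg  IN VARCHAR2
) IS
    PRAGMA AUTONOMOUS_TRANSACTION;  -- commit the log row independently of the caller
BEGIN
    INSERT INTO etl_error_log (log_id, job_name, err_code, err_msg)
    VALUES (etl_error_log_seq.NEXTVAL, p_job_name, p_err_code, p_err_msg);
    COMMIT;
END log_etl_error;
/

A load procedure would then capture SQLCODE and SQLERRM into local variables in its WHEN OTHERS handler, call log_etl_error with them, and re-raise the exception.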

Environment: SQL, PL/SQL, Teradata, UNIX, shell scripting, HP Quality Center 10, Informatica PowerCenter 8.x


