
Manager Data

Location:
Windsor Locks, CT
Posted:
September 18, 2016



Yasoda Ayyankala

Sr. Informatica PowerCenter/IDQ/MDM Lead

Email: ********@*****.***, Skype Id: ******.***@*****.***, Phone: (925) 999-6191

Experience Summary:

Over 10 years of professional IT experience in Analysis, Design, Development, Testing and Implementation of various Data Analytics, Data Integration/Migration, Data Warehouse, Data Architecture/Modelling, Near Real Time Data Integration and Data Quality projects.

Experience in Banking, Insurance, Manufacturing, Pensions and Automotive domain projects.

Worked end to end on Data Integration, Data Warehouse and Informatica Upgrade (version 8.6 to 9.5.1) projects, from gathering Business requirements and Data Analysis through preparing the Data Model, creating Functional and Technical Design documents, estimation, development and review.

7+ years of experience in Informatica Power Center (9.x) creating complex mappings, sessions, workflows and reusable components in Designer and Workflow Manager, using various transformations such as SQL Transform, Source Qualifier, Unconnected and Connected Lookups, Router, Filter, Expression, Aggregator, Joiner, Update Strategy and Stored Procedure.

5+ years of experience as an Informatica MDM developer in MDM Hub Configuration and creation of landing tables, staging tables and base objects.

Configuring Trust and Validation Rules, Match and Merge setup, Data Manager, Merge Manager, hierarchies, foreign-key relationships, lookups, queries and packages, IDD Configuration and SOAP configuration.

5+ years of experience in Informatica Data Quality (IDQ 9.6/9.1): Data Profiling, plus standards and best practices for Process Sequence, Data Quality Lifecycles, Naming Conventions and Version Control.

Experienced in creating complex data quality rules; designed, developed and implemented cleanse, parse, standardization and validation patterns with ETL, and created reference tables.

Experienced in creating Cleanse functions, Mappings, Trust and Validation rules, Match Path, Match Column, Match rules, Merge properties and Batch Group creation.

Experienced in coding SQL queries (Analytic Functions, SQL joins, Nested Queries, Unions, Multi-table joins) and PL/SQL Procedures and Functions.
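
For illustration, a minimal sketch of the analytic-function style of query referred to above; the table and column names (orders, customers) are hypothetical, not from any actual project:

    -- Pick the most recent order per active customer using ROW_NUMBER()
    SELECT customer_id, order_id, order_date
    FROM (
        SELECT o.customer_id,
               o.order_id,
               o.order_date,
               ROW_NUMBER() OVER (PARTITION BY o.customer_id
                                  ORDER BY o.order_date DESC) AS rn
        FROM   orders o
        JOIN   customers c ON c.customer_id = o.customer_id
        WHERE  c.status = 'ACTIVE'
    )
    WHERE  rn = 1;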

Experienced in writing UNIX shell scripts/commands and modifying shell scripts as per business requirements.

Exposure to Informatica Power Exchange for CDC, SAP NetWeaver and other connectors.

Worked on Dimensional Modelling, creating Star and Snowflake schemas, and built Slowly Changing Dimension (SCD) Type I/II/III mappings.
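
As a sketch only: the core of the SCD Type II pattern above, expressed in plain SQL (in Power Center this is typically built with Lookup and Update Strategy transformations; dim_customer and its columns are assumed names):

    -- Step 1: close out the current version of a changed customer row
    UPDATE dim_customer
    SET    eff_end_date = SYSDATE,
           current_flag = 'N'
    WHERE  customer_id  = :src_customer_id
    AND    current_flag = 'Y';

    -- Step 2: insert the new version as the current row
    INSERT INTO dim_customer
           (customer_sk, customer_id, address,
            eff_start_date, eff_end_date, current_flag)
    VALUES (dim_customer_seq.NEXTVAL, :src_customer_id, :src_address,
            SYSDATE, NULL, 'Y');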

Strong knowledge in Extraction, Transformation and Loading of data from heterogeneous source systems like Flat files, COBOL files, VSAM files, XML Files, CSV Files, Oracle, Siebel, DB2 and SQL SERVER.

Extensive experience in the Production support process: loading data, monitoring Batch jobs, tracking and resolving incidents, and providing solutions on time to application users.

Experienced in finding the root cause for Batch Job failures, reviewing Logs, Health Checks, raising the incidents, resolving the issues by coordinating with the respective teams.

Experience in scheduling batch jobs using Workflow Monitor, Control-M and Appworx third-party tools.

Experienced in working with Agile methodology.

Good experience in performing/supporting Unit testing, System Integration Testing (SIT) and User Acceptance Testing (UAT) for all Power Center and Developer objects.

Hands-on experience in identifying bottlenecks at various levels (sources, mappings and sessions) and tuning them. Hands-on experience in optimizing SQL scripts and improving database performance.
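
A typical tuning step of this kind, shown as a hypothetical Oracle example (claims, policy_id and the index name are illustrative): index the frequent filter column, then confirm the optimizer uses it:

    -- Support a frequent filter/join on policy_id
    CREATE INDEX idx_claims_policy_id ON claims (policy_id);

    -- Check the execution plan
    EXPLAIN PLAN FOR
    SELECT claim_id, claim_amount
    FROM   claims
    WHERE  policy_id = :policy_id;

    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);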

3+ years of experience in designing and coding mainframe COBOL and JCL programs that read and write data between VSAM files and IBM DB2 tables.

Experience in preparing various project documentation (high-level and low-level Functional/Technical design documents, Unit Test plans, Migration documents, Rollback Plan documents, System Performance strategy plans, etc.).

Collaborated with onsite teams and managed various offshore teams.

Well experienced in coordinating different functional groups (DBAs, System Admins, Business teams, QA and Support teams) to provide timely solutions for raised issues/tickets.

Excellent communication and interpersonal skills, ability to learn quickly, good analytical reasoning and high adaptability to new technologies and tools.

Technical Skill Summary:

Informatica Tools : Informatica Power Center 9.x/8.x, Informatica MDM (9.x/10.x), Informatica Data Quality (9.x), Informatica Power Exchange 8.6

Databases : Oracle 9i/10g/11g, IBM DB2, MS SQL Server, Siebel

Operating System : Windows NT/2003/XP/7, UNIX/LINUX

Languages : COBOL, JCL, VSAM

Products/Tools : TOAD, Test Director, Appworx, SQL Server, SQL Plus, Serena Change Manager (PVCS), Control-M, SQL Developer

Achievements:

Received the STAR TEAM AWARD in Aug 2015 for best performance in making deliveries efficiently at Wipro Technologies.

Received the Spot Award twice for playing an efficient role in the project, in May ’09 and December ’12, at Mphasis.

Received the Key Employee Reward Plan for best performance in the team in September ’09 at Mphasis.

Education:

Bachelor’s in Electronics and Communications Engineering from JNTU, India, 2003.

Professional Experience:

Mass Mutual Financial Group, MA Jun 2016 to Date

Role: Sr. Informatica MDM Lead

Project: Data Aggregation

Environment: Informatica Power Center (9.1), Informatica MDM, Teradata, UNIX Shell/Perl Scripts, Scheduling Tool – Maestro

Working with Mass Mutual Life Insurance; involved in integrating data into the Data Warehouse and adding new Admin systems along with different Line of Business products.

Responsibilities:

Doing Impact Analysis, preparing High Level and Low Level design documents based on the Business Requirement documents, and providing estimates for the whole life cycle of the Data Warehousing project.

Integrating different source data into the Teradata Data Warehouse, and from there loading the data to Broadridge for reporting purposes and for use in a front end that provides Customer Support.

Involved in Designing, Developing, Unit testing and System Integration testing of Mappings, Workflows, Sessions to achieve the desired functionality and involved in monitoring and administration of all the components for a Data warehousing project.

Working with different types of source files such as COBOL files, XML files, CSV files and Teradata table views.

Experienced in Hub Configuration and creating Mappings for Stage Jobs in MDM Hub Console.

Configuring Landing Tables, Staging Tables, Base Objects, Lookup.

Setting up Trust and Validation Rules, Match and Merge Setup, IDD configuration, Data Manager, Merge Manager, setting hierarchies, relationships.

Running the Stage Jobs, Load Jobs, Match and Merge Jobs using the Batch Viewer and Automation Processes.

Identified and eliminated duplicates in Source Feeds using MDM base objects.
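
For illustration only: the de-duplication here used MDM match rules on base objects, but an equivalent duplicate check can be sketched in plain SQL (stg_customer and ssn are assumed names):

    -- Find natural keys that appear more than once in the source feed
    SELECT ssn, COUNT(*) AS dup_count
    FROM   stg_customer
    GROUP BY ssn
    HAVING COUNT(*) > 1;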

Reviewing the code and resolving any technical issues of the project.

Experience in writing complex SQL queries (using Joins, Analytic Functions, Sub Queries, etc.) and PL/SQL Stored Procedures, Functions and Packages.
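
A small PL/SQL sketch of the kind of stored procedure referred to above; the procedure, table and column names are illustrative assumptions, not taken from an actual project:

    -- Record the row count loaded from a source on a given day
    CREATE OR REPLACE PROCEDURE log_load_count (
        p_source_name IN VARCHAR2,
        p_load_date   IN DATE
    ) AS
        v_row_count NUMBER;
    BEGIN
        SELECT COUNT(*)
        INTO   v_row_count
        FROM   stg_policy   -- hypothetical staging table
        WHERE  source_name = p_source_name
        AND    TRUNC(load_date) = TRUNC(p_load_date);

        INSERT INTO etl_audit_log (source_name, load_date, row_count)
        VALUES (p_source_name, p_load_date, v_row_count);

        COMMIT;
    END log_load_count;
    /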

Ensuring all the Performance Optimization techniques were incorporated while developing the Mappings.

Expertise in performing optimization analysis. Improved the performance of long running jobs by tuning the SQL queries and creating necessary indexes.

Ensuring all deliverables are completed on time and with high quality by following the processes set by the customer, and providing weekly status updates to the customer.

CREDIT SUISSE, Wipro Technologies, India Jan 2016 to Jun 2016

Role: Sr. Informatica Lead

Project: Informatica Power Center Upgrade (from version 8.6 to 9.5.1)

Environment: Informatica Power Center (9.5.1), MS SQL Server, UNIX Shell/Perl Scripts, Scheduling Tool - Control-M

Credit Suisse is a banking project that transforms files to make them compatible as feeds to Intellimatch, a reconciliation tool. It deals with Core (Cash and Stock) and Non-Core (CMS, BMS) processing for all three regions: Singapore (APAC), EMEA (London) and the Americas (New York). Currently handling the Informatica Upgrade project from version 8.6 to 9.5.1 for all three regions.

Responsibilities:

Doing Impact Analysis, identifying the changed objects and providing proper estimates for the Informatica Power Center Upgrade from version 8.6 to version 9.5.1.

Installing the Informatica 9.5.1 tool and setting up servers for the DEV, UAT and PROD environments to upgrade Informatica Power Center from version 8.6 to 9.5.1.

Ensuring that all Repositories, Folders, Relational connections and odbc.ini files are properly migrated from version 8.6 to 9.5.1.

Performing Unit testing, SIT and UAT of all the Power Center objects, Oracle tables and UNIX scripts in the DEV and UAT environments.

Created deployment groups for production migration (mappings, workflows, parameter files, SQL scripts and UNIX scripts) and provided post-production support.

Doing end-to-end testing in all three environments (DEV, UAT, PROD) to ensure the upgrade was done properly.

Creating Migration documents, Rollback Plans, Release Plans and Production Support documents.

Providing weekly updates to the customer on project deliverables and documenting everything.

AVIVA Life Insurance, Wipro Technologies, India Mar 2014 to Dec 2015

Role: Sr. Informatica MDM Lead/Architect

Project: TARDIS Integration Application

Environment: Informatica Power Center (9.6.1), Informatica MDM, Informatica Data Quality (IDQ 9.6), Oracle SQL Developer, UNIX Shell/Perl Scripts, Scheduling Tool - GOMD, JIRA Ticketing Tool

AVIVA is an Insurance project in which Wipro handles the Life Insurance part. TARDIS (Transparent Actuarial Database Insight System) holds the AVIVA Life Insurance Actuarial Data. AVIVA has different source feeds such as P1L70, Administrator BPA, Admin DC, Paymaster and Unisure. These feeds are first loaded into Staging and PAD (Persistent Actuarial Data store – Data Warehouse), then to MPF files and Prophet Results, and finally into the FSA and ASPPFM data marts.

Responsibilities:

Doing Impact Analysis, preparing High Level and Low Level design documents based on the Business Requirement documents, and providing estimates for the whole life cycle of the Data Warehousing project.

Experience in creation and maintenance of Customer and Product Domain solutions using Informatica MDM.

Involved in Designing, Developing, Unit testing and System Integration testing of Mappings, Workflows, Sessions to achieve the desired functionality and involved in monitoring and administration of all the components for a Data warehousing project.

Understanding the Source Data, Profiling the Data Sources and creating Score cards using IDQ Developer and Analyst Tool.

Involved in Data analysis, data cleansing, data matching, data conversion, address standardization, exception handling, reporting and monitoring capabilities of IDQ.

Designing, developing and unit testing Power Center workflows, sessions, mappings and IDQ mappings, and running the sessions to achieve near-real-time Data Integration.

Created tasks in the Workflow Manager and exported the IDQ mappings to Power Center.

Experienced in Hub Configuration and creating Mappings for Stage Jobs in MDM Hub Console.

Configuring Landing Tables, Staging Tables, Base Objects, Lookup.

Setting up Trust and Validation Rules, Match and Merge Setup, IDD configuration, Data Manager, Merge Manager, setting hierarchies, relationships.

Running the Stage Jobs, Load Jobs, Match and Merge Jobs using the Batch Viewer and Automation Processes.

Identified and eliminated duplicates in Source Feeds using MDM base objects.

Reviewing the code and resolving any technical issues of the project.

Experience in writing complex SQL queries (using Joins, Analytic Functions, Sub Queries, etc.) and PL/SQL Stored Procedures, Functions and Packages.

Ensuring all the Performance Optimization techniques were incorporated while developing the Mappings.

Expertise in performing optimization analysis. Improved the performance of long running jobs by tuning the SQL queries and creating necessary indexes.

Ensuring all deliverables are completed on time and with high quality by following the processes set by the customer, and providing weekly status updates to the customer.

Mentored/Trained team members in creating different Design documents and other required documentation.

MAZDA USA Siebel, Wipro Technologies, India Aug 2013 to Mar 2014

Role: Sr. Informatica ETL/IDQ Developer

Project: Data Migration/ Data Integration for Mazda

Environment: Informatica Power Center (9.6.1), Informatica Data Quality (IDQ), Oracle SQL Developer, UNIX Shell Scripts, Scheduling Tool – Tidel

MAZDA is an automobile manufacturing company. MAZDA USA currently uses the Epiphany system to provide Customer Support for all its customers. MAZDA now wants to use the Siebel database for customer support, so the data currently held in the Epiphany system needs to be migrated to the Siebel database, while also ensuring that future data is loaded to the Siebel database.

Responsibilities:

Analysing the requirements, preparing the Physical Solution Design document based on the Logical Solution Design document, and preparing technical specs based on the functional specs.

Responsible for ensuring that the business goals for Data Migration and Data Integration are met.

Profiling the Data Sources and creating Score cards using Developer and Analyst Tool.

Involved in Data analysis, data cleansing, data matching, data conversion, address standardization, exception handling, reporting and monitoring capabilities of IDQ.

Designing, developing and unit testing Power Center workflows, sessions, mappings and IDQ mappings, and running the sessions to achieve near-real-time Data Integration.

Created tasks in the Workflow Manager, exported the IDQ mappings and executed them using the scheduling tool Control-M.

Analysis, development, QA implementation, monitoring and administration of the ETL process.

Preparing test cases and test data for Unit testing based on the technical specs.

Migration of code to various environments of SIT, UAT and Production.

Expertise in performing optimization analysis. Improved the performance of long running jobs by tuning the SQL queries and creating necessary indexes.

Symantec, Mphasis, India Jul 2009 to Aug 2013

Role: Informatica ETL/MDM Developer.

Project: EPayment Application

Environment: Informatica 8.6/9.x, Informatica IDQ 9.6, Oracle 7.x/8.x/10g, UNIX, Informatica MDM, Appworx, Serena Change Management

Symantec helps consumers and organizations secure and manage their information-driven world, providing security, storage and systems management solutions that protect customers against more risks at more points, more efficiently than any other company.

When a customer purchases a product from Symantec, the order is stored in the ERP12 system, and the data is transferred from ERP to CODS through Share flex. CODS is the source, and the entire data set is transferred to the EDW (Enterprise Data Warehouse) through Informatica. We maintain complete historical data, which the customer uses for weekly and monthly reports feeding their DSS.

Responsibilities:

Analysing the requirements, preparing Physical solution design document based on Logical solution design document and technical specs based on the functional specs.

Designing, developing and unit testing of Informatica mappings, mapplets, sessions, workflows and worklets to achieve the desired functionality using Power Center.

Involved in Data Profiling, Data analysis, data cleansing, data matching, data conversion, address standardization, exception handling, reporting and monitoring capabilities of IDQ.

Designing, developing and unit testing of Power Center Workflows, Sessions, Mappings, IDQ mappings and running the Sessions to achieve the desired functionality.

Created tasks in the Workflow Manager, exported the IDQ mappings to Power Center and executed them using workflows.

Extracted CDC data using Informatica Power Center.

Experienced in Hub Configuration and creating Mappings for Stage Jobs in MDM Hub Console.

Configuring Landing Tables, Staging Tables, Base Objects, Lookup.

Setting up Trust and Validation Rules, Match and Merge Setup, IDD configuration, Data Manager, Merge Manager, setting hierarchies, relationships.

Running the Stage Jobs, Load Jobs, Match and Merge Jobs using the Batch Viewer and Automation Processes.

Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the Informatica mappings.

Improved the performance of long running jobs by tuning the SQL queries and creating necessary indexes.

Department for Work and Pensions (DWP), Mphasis, India/UK Jul 2007 to Jul 2009

Role: Sr. Software Engineer.

Project: Social Fund and Computer Systems (SFCS)

Environment: COBOL, JCL, DB2, VSAM, FILE-AID, CICS and IDMS.

Department for Work and Pensions (DWP) is a UK Government Project. It deals with various Fund systems like SFCS, JSAPS - ESA, CA, DLA, ISCS, etc...

The Social Fund Computer System (SFCS) supports the processing and payment of Social Fund awards. SFCS exists to facilitate the payment of the discretionary awards (BL, CL, CCG) and the regulatory awards (MP, FP), and to facilitate the recovery of BLs, CLs and FPs.

The scope of SFCS is to give the reader awareness of all functions within SFCS, allowing high-level impact analysis for any amendments or new functionality.

Responsibilities:

Understanding SFCS application and mapping requirements to the Functional and Technical changes for the Application.

Designing Functional and Technical documents and involved in all phases of the SDLC.

Estimating, distributing work among the team, doing reviews, and coordinating with the onsite team and the customer to identify requirements and resolve issues.

Setting up the Weekly Status calls, updating the status of the work and completing the work on time.

Overseeing the quality procedures related to the project and performing quality audits to ensure standards, procedures and methodologies are being followed.

Coordinating with the System Testing team to identify testing requirements, identifying the required environment setup, and executing the test cases.

Coding and performing Link testing in EIT services.

Review of Best Shore design and code.

Allstate Insurance, FSA – IMAX, Syntel, India Dec 2006 to May 2007

Role: Analyst Programmer

Project: Insurance Files Processing.

Environment: COBOL, JCL, EZtrieve, File-Aid, DB2, Oracle, VSAM, Test Director

Allstate Insurance is the largest publicly held personal lines insurance company in the United States. It has three products: Auto, Home and Business Insurance. We deal with Auto and Home Insurance.

Auto Insurance - Its primary use is to provide protection against losses incurred as a result of traffic accidents. An insurance company may declare a vehicle totally destroyed ('totaled' or 'a write-off') if it appears replacement would be cheaper than repair.

Home Insurance - Home insurance, or home owner’s insurance, is an insurance policy that combines various personal insurance protections which can include losses occurring to one's home, its contents, loss of its use (additional living expenses), loss of other personal possessions of the homeowner, as well as liability insurance for accidents that may happen at the home.

Responsibilities:

Understanding the requirements and mapping requirements to Functional and Technical changes in the Application. Coding as per the design documents.

Preparing test cases, executing them and logging defects using the Quality Centre tool Test Director.

Overseeing the quality procedures related to the project and performing quality audits to ensure standards, procedures and methodologies are being followed.

Performed System and Integration testing and review of the Test cases and its execution.

DCX ITM Rite Src ARC’s, Syntel, India Mar 2006 – Dec 2006

Role: Analyst Programmer

Project: MOPAR–Finance (RA-II) –Canadian Integration

Environment: COBOL, JCL, DB2, IMSDB, VSAM, File-Aid, Expeditor, Endeavor

DaimlerChrysler is a major automobile and truck manufacturer and financial services provider (through DaimlerChrysler Financial Services).

Canadian Integration is one of the applications where Canadian Finance system is integrated to MOPAR Finance/ Accounting system.

Canadian Integration: Analysis of Sales Revenue and Receivables portion of Canadian Finance Systems to eliminate redundancy and integrate into the MOPAR Finance / Accounting System.

Responsibilities:

Understanding the requirements and mapping requirements to Functional and Technical changes in the Application.

Doing Impact analysis on the requirements and creating Design documents.

Coding as per the Design documents.

Preparing test cases, executing them and logging defects using the Quality Centre tool Test Director.

Reviewing code and Test cases execution.

American Express (AMEX), Syntel, India Sep 2005 - Feb 2006

Role: Analyst Programmer

Project: Amex Pay Store Rev Engineering S3P2

Environment: AS400, Cool:Plex, MS Visio

American Express is one of the world's largest travel agencies and is equally well known for its famous charge cards and revolving credit cards.

This is a reverse engineering process: the application was originally developed in AS400 and Cool:Plex, and is now being converted to Java/Mainframe technology.

Responsibilities:

Understanding the requirements, preparing flow charts from the AS400 code, and preparing and reviewing the Design documents.

Created flow charts in MS Visio.


