Ramesh Chinnannavar
Cell: 614-***-**** / 201-***-****
***********@*****.***
*** ******** ***, ****** **** NJ 07306
Summary
A PMI Certified Project Management Professional (PMP) with 25 years' experience in Information Technology, experienced with Informatica Master Data Management, Informatica Metadata Manager, Informatica Data Quality, Power Center, Ab Initio, DataStage, Oracle, Collibra, Teradata and AWS in industry verticals such as Finance, Insurance and Healthcare.
Strong professional experience designing, implementing, and managing short- and long-term projects in the Information Technology industry. A dynamic, energetic, and effective communicator experienced in managing and motivating multi-disciplined teams, comfortable communicating with senior management and systems professionals at all technical levels.
Extensive Global experience in leading large, complex IT onsite and offshore programs for Fortune companies. Consistent track record of on-time and on-quality delivery.
Responsible for all activities related to the development, implementation, and support of ETL processes for large-scale data warehouses.
Strong experience with data quality, data integration, and Informatica MDM Hub design, configuration, and implementation, including C360.
Strong experience developing User Exits, trust rules, match and merge rules, IDD applications and hierarchies in Informatica MDM.
Expertise in MDM, IDD, ActiveVOS, Power Center, IDQ and AddressDoctor software installations.
Experience with Informatica Metadata Manager and Business Glossary.
Strong experience in Data Governance and in developing rules with Informatica Data Quality (IDQ) tools.
Extensive experience with ERWIN, Power Designer, data modeling & data analysis using Dimensional Data Modeling and Relational Data Modeling, Star Schema and Snowflake schema modeling, FACT and Dimensions tables, Physical and Logical Data Modeling.
Database experience using Oracle, Teradata, BTEQ, MultiLoad, FastLoad, SQL, PL/SQL, SQL Loader, Stored Procedures and Functions.
Project Manager with proven track record for providing on-time project delivery and meeting or exceeding all target goals and objectives.
Experience in tuning mappings, identifying and resolving performance bottlenecks in various levels like sources, targets, mappings and sessions.
Experience with Teradata’s Financial Services Logical Data Model.
Experience in providing the architectural solutions and creating the metadata model.
Experienced in all phases of the Systems Life Cycle, including project definition, analysis, design, coding, testing, implementation and support.
Expertise in designing Universes from scratch, migrating to different environments, designing Web Intelligence documents and maintaining them on Business Objects XI R2.
Experience with Agile development - Organized and facilitated Agile and Scrum meetings, which included Sprint Planning, Daily Scrums, Sprint Check-In, Sprint Review and Retrospective.
Experience with Big Data Technologies.
Expertise in UNIX and UNIX shell scripting.
Designed and deployed well-tuned Ab Initio graphs (generic and custom) for ODS and DSS instances in both Windows and UNIX environments.
Demonstrated experience in developing and implementing formal project management methodologies and establishing standards.
Experience in coordinating with offshore development teams.
Excellent problem solving, communication, team and client management skills.
Certifications:
PMI Certified Project Management Professional (PMP)
AWS Architect Associate training from AWS
Teradata Certified Professional TD12
Teradata Certified Professional V2R5
Teradata Certified Implementation Specialist
Teradata Certified SQL Specialist V2R5
Advanced Teradata Certified Professional V2R5
Teradata Certified Administrator V2R5
Teradata Certified Design Architect V2R5
Certified Scrum Master I Professional
ITIL V3 foundation Certificate
Sun Certified Programmer for the Java 2 Platform
Big Data - Hadoop, Hive, Pig, HBase, Yarn, NoSQL, SPARK, Scala, R, Python, Kafka, Data Science and Text analytics badge certifications from Big Data University.
Leading SAFe 5.0 Scaled Agile Certification
Immigration Status:
U.S. Citizen
Skill sets:
Tools: Siperian/Informatica MDM 9.1/9.5/9.6/10/10.1/10.3, Informatica Power Center (8.x, 9.x, 10.2), Informatica Designer, Workflow Manager, Workflow Monitor, Informatica Metadata Manager 9.6.1, IDD, C360, ActiveVOS, SIF, IDQ, Informatica Developer 9.6.1/10.2, Informatica Analyst 9.6.1/10.2, Informatica Power Exchange, Data Explorer, Ab Initio, Greenplum, AddressDoctor, Trillium, Collibra, Oracle MDM, DataStage 7.5/8.0, AWS and Bitbucket.
Database Utilities: PL/SQL, Toad 8.0/12.1, Export/Import, SQL*Plus and SQL*Loader
Scheduling Tools: Autosys, Maestro, Control-M and UC4
Databases: Oracle 8/9/10g/11i, MS SQL Server 2000/05, MS Access, Teradata V2R5/12/13, DB2, IMS DB and HBase
Data Modeling: Dimensional Data Modeling, Star Join Schema Modeling, Snow Flake Modeling, FACT and Dimensions Tables, Physical and Logical Data Modeling, Erwin 3.5/4.0/7.1/9.0/9.5, Power Designer v16.5/16.6.
Big Data Technologies: Hadoop (HDFS & Map Reduce), PIG, HIVE, HBase, Sqoop, Impala, HiveQL, Kafka, YARN, SCALA, R, SPARK and PYTHON.
Application Server: IBM Websphere 4.2/5.1/6, JBoss 6.4, 7.x
Web Tools: JDBC, JSP, JavaScript, HTML, XML, SOAP, Web Services and VBScript.
Reporting Tools: Business Objects XIR2, 4.0, Crystal Reports, Tableau, Qlikview and Cognos
Defect Tracking Tools: Quality Center/ HP ALM
Operating Systems: Windows XP, MVS ESA, IBM-AIX 4.3.2 and Sun Solaris 2.6/7
Languages: PERL, SAS, MQ Series, Stored Procedures, UNIX shell scripting, C#, Java
Project Process: SDLC, CMMI, RUP and AGILE
Academic Qualifications:
MBA - Information Technology from Manipal University, India, 2008.
Master of Engineering with specialization in Applied Electronics from PSG College of Technology, Coimbatore, India, 1993.
Bachelor of Engineering in Electrical and Electronics from Mysore University, Davanagere, India, 1989.
Professional Experience:
MUFG Bank, Jersey City, NJ 08/2017 – Present
VP - Master Data Management Lead
The project is part of the Core Banking transformation, Enterprise Data Transformation Pillar 2 (Master Data Management): planning, analysis, assessment, design and build of an MDM platform to replace the legacy RCIF system and establish a modern enterprise MDM solution that acts as the source of truth for Party data and Reference data. The RCIF system contains customer names, addresses, and customer-to-customer and customer-to-account cross-references used to aggregate loans and deposits at the customer level.
Responsibilities:
Conducted several meetings for consensus building of current state system, future state, process flows, proposed architecture and roadmap with key technology and business stakeholders.
Provided technical direction and managed the project team members across multiple groups and departments for strategic assessment, current state analysis, target state IT strategy, requirement definitions and program roadmaps.
Involved in scoring and selection of the Informatica MDM, IDQ and PC products and of the system integrator.
Conducted interviews with SMEs for all interfacing systems to understand as-is processes, data flows, system pain points and existing data anomalies.
Worked with 200+ Business IT staff, system owners and SMEs in workshops and various stakeholder meetings.
Interacted with the business users to gather the party data elements from various Party data systems and LOBs.
Reviewed data requirements, business requirements, conceptual, logical and physical data models, functional design documents and detailed design documents.
Reviewed code and ensured best practices are followed.
Reviewed ETL code to load from source systems to pre-landing, MDM landing tables and outbound interface systems.
Assisted Enterprise Architects for developing the architecture solution diagram.
Involved in planning, designing and building MDM, RDM, IDD and services code and deployments.
Interacted with QA team during SIT phase deliverables.
Reviewed match and merge rules, trust, validation rules, queries, packages, hierarchies and IDD.
Configured the ActiveVOS and Provisioning tool for business entities.
Configured JMS queues and real-time services (SIF APIs) for outbound interfaces.
Conducted a POC of Syncsort and Informatica Power Exchange to capture real-time changes.
Implemented reference data using Informatica RDM.
Performed data encryption using Voltage API.
Mapped KDEs to BIM (Business Information Model) Domains and Sub-Domains.
Developed a Data Quality framework and created DQ rules using Informatica Data Quality (IDQ).
Performed data profiling using Informatica Developer and Analyst, analyzed DQ results and anomalies, and created dashboards. Developed address standardization rules.
Involved in MDM, IDD, ActiveVOS, PC, IDQ and AddressDoctor software installation and configuration in AWS and on-prem.
Developed a reconciliation approach document to reconcile data between the multiple hops (see the SQL sketch after this list).
Created a DevOps process for deploying the code.
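For illustration, a minimal sketch of the kind of hop-to-hop reconciliation query such an approach document would describe; the table names (LND_PARTY, STG_PARTY), columns, and the Oracle ORA_HASH checksum are assumptions, not the actual schema:

```sql
-- Hypothetical reconciliation between two hops: compare row counts and a
-- column-level checksum of the same feed in the landing and staging layers.
SELECT 'LND_PARTY' AS hop,
       COUNT(*)    AS row_cnt,
       SUM(ORA_HASH(party_id || '|' || party_name)) AS checksum
FROM   lnd_party
UNION ALL
SELECT 'STG_PARTY' AS hop,
       COUNT(*)    AS row_cnt,
       SUM(ORA_HASH(party_id || '|' || party_name)) AS checksum
FROM   stg_party;
```

A mismatch in either column flags the hop where records were dropped or altered.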
Software/Tools Used: Informatica MDM 10.3, IDD, SIF services, Informatica Data Quality (IDQ) 10.3, Informatica Power Center v10.2, Informatica Power Exchange, Informatica RDM, AddressDoctor, JBoss, Erwin 9.8, Syncsort, Toad, Oracle, SQL, Collibra, Greenplum, Jira, Bitbucket and AWS Cloud
TD Bank, Mount Laurel, NJ 09/2015 – 07/2017
Senior Solution Architect
The scope of the project is to create a centralized data hub for the Risk and Regulatory data warehouse (RRDW). The primary objective of the data warehouse is to provide a single standardized source of truth for data consumption by the business or regulators via reporting and analytics. Source data is pushed from the data lake (EDPP) in the Hadoop ecosystem to the pre-staging layer. The work involves a one-time initial load followed by incremental loads from multiple source systems. The Informatica Data Quality tool performs data cleansing and standardization before the data is loaded into the MDM staging tables and the RRDW.
Responsibilities:
Interacted with the business users to understand the requirements.
Developed the timeline and tracked the progress.
Developed the project plan, monitored risks and executed risk response plans.
Negotiated with customers to prioritize the resolution of issues.
Involved in customer and delivery management.
Performed problem analysis and issue resolution.
Created the system design specification (SDS) document for the entire program and reviewed it with management.
Involved in developing the design documents and reviewing the designs.
Used the mapping editor in Power Designer to map source columns to target columns.
Created logical, physical data Models and created Business Glossary using Informatica Metadata manager.
Developed data quality rules and reviewed the data profiling output and identified the data anomalies.
Standardized the data using reference tables.
Developed stored procedures and prototype SQL and HiveQL (see the HiveQL sketch after this list).
Developed ETL to load the data into MDM Hub and to downstream systems.
Performed data governance using Collibra.
Developed Reference tables using IDQ Analyst tool.
Created trust, match and merge rules in the MDM Hub.
Configured the provisioning tool for business entity services.
Developed landing, staging and base object models in MDM hub.
Created source-to-target mappings for each hop: source systems to the data lake (EDPP) in the Hadoop ecosystem, EDPP to pre-stage using Sqoop, pre-stage to the staging layer, and staging to the RRDW layer.
Configured the IDD application for different subject areas.
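As an illustration of the prototyping noted above, a minimal HiveQL sketch for the EDPP-to-pre-stage hop; the database, table, and column names are assumptions:

```sql
-- Hypothetical external table over the files Sqoop lands in HDFS;
-- all names and the HDFS path are placeholders.
CREATE EXTERNAL TABLE IF NOT EXISTS edpp.party_raw (
  party_id   STRING,
  party_name STRING,
  src_system STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
STORED AS TEXTFILE
LOCATION '/data/edpp/party_raw';

-- Push the extracted records into the pre-stage layer.
INSERT OVERWRITE TABLE prestage.party
SELECT party_id, party_name, src_system
FROM   edpp.party_raw;
```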
Software/Tools Used: Informatica Power Center v10, Informatica MDM 10.1, IDQ, IDD, Power Designer 16.6, Collibra, Oracle, HDFS, HiveQL, Sqoop, TOAD, Tableau, SVN (Subversion), Visio and MS Project.
Kaiser Permanente, Pleasanton, CA 03/2015 – 08/2015
Senior Data Architect
Master Data Management and Hierarchy Configuration
The scope of the project is to build Kaiser Permanente's Customer Master Data and configure hierarchies in the Informatica MDM Hub. At Kaiser, hierarchies are currently maintained in an Excel source. As part of this project we configured the Department, Account, Location and Job Family hierarchies, and we configured Informatica Data Director (IDD) applications so that Data Stewards can manage and govern these hierarchies.
Responsibilities:
Involved in MDM v10.0 installation activities using the JBoss application server and the Informatica platform.
Interacted with business users to understand the business need and prepared the requirement document.
Created the design document to master the data and build hierarchies.
Created Physical Data Objects and Logical Data Objects in Informatica developer.
Developed ETL code using Informatica Developer to bring data from different source systems and load into MDM staging tables.
Created the batch group jobs to load the data from MDM staging tables to Base object tables.
Configured match and merge rules to create the BVT (see the SQL sketch after this list).
Developed cleanse functions to standardize the data before loading into staging tables.
Created the Entity and Relationship data model to build the hierarchies.
Developed the ETL code using Informatica Developer to load the Entity and Relationship tables from the excel source.
Built the MDM hierarchy packages and profiles to determine how the data should be displayed in MDM Hierarchy Manager and IDD.
Built IDD application and configured the subject areas to govern the hierarchical data.
Built basic and advanced queries to view and modify the hierarchical data through IDD.
Built IDD user exits to extend the MDM functionality.
Configured ActiveVOS to manage tasks, requests and reports.
Performed the data profiling using IDQ tool to identify the data anomalies.
Analyzed the profile results and scorecards and created the rules.
Performed data validation and implemented the DQ rules using mapplets in IDQ.
Created the dashboard with rules assigned to the data quality dimensions.
Extensively used Informatica Data Quality transformations: Labeler, Parser, Standardizer, Match, Association, Consolidation, Merge, Address Validator, Case Converter and Classifier.
Developed custom metadata objects and added business glossary in Informatica Metadata Manager.
Created conceptual, logical and physical data models.
Copied data to and from the cluster and the native OS; ingested data into HDFS using Flume and Sqoop.
Created tables and column families in HBase.
Prepared the project plan and delivered on time, within scope and budget by monitoring and tracking progress.
Performed risk management to minimize project risks.
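For illustration, a minimal SQL sketch of how consolidation results can be reviewed once the match and merge rules run; it assumes the Hub's standard ROWID_OBJECT/ROWID_XREF columns, and C_PARTY / C_PARTY_XREF are hypothetical base-object and cross-reference table names:

```sql
-- Hypothetical check: base-object records that absorbed more than one
-- source cross-reference, i.e., records consolidated by match/merge.
SELECT bo.rowid_object,
       COUNT(x.rowid_xref) AS xref_cnt
FROM   c_party bo
JOIN   c_party_xref x
  ON   x.rowid_object = bo.rowid_object
GROUP  BY bo.rowid_object
HAVING COUNT(x.rowid_xref) > 1;
```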
Software/Tools Used: Informatica MDM v10.0, IDD, Informatica Analyst 9.6.1, Informatica Developer 9.6.1, Power Center 9.6.1, Hive, HDFS, Red hat Linux, Oracle, Power Designer, JIRA and Visio.
Kaiser Permanente, Pleasanton, CA 09/2014 – 02/2015
Senior Data Architect
ASG Rochade to Informatica Metadata migration
The project involves creating a custom X-Connect to load Teradata metadata for different regions into Informatica Metadata Manager: create the custom model, populate the custom CSV file using Power Center, load the resources in Metadata Manager and run a lineage analysis.
Responsibilities:
Gathered business requirements to migrate Rochade to Informatica Metadata Manager 9.6.1 for 8 regions.
Architected and designed the Informatica MM solution based on the business requirements.
Prepared the technical design document.
Recommended the architectural solutions and obtained approval from the Management.
Developed Custom Model for 8 different region files to bring in custom metadata into MM warehouse.
Connected to Teradata and DB2 using Packaged Model to bring in Clarity metadata into the MM warehouse.
Developed a Power Center ETL process to load the CSV files required for the custom X-Connect.
Scheduled Custom and Packaged Resources to load the Rochade Clarity metadata into the Informatica MM warehouse.
Created linking rule files and enumerated link files to link metadata across source systems.
Created links between business terms and technical metadata across systems.
Responsible for managing the technical project from initiation through implementation, including planning, analysis, design, development and implementation.
Provided overall direction for establishing project requirements, priorities and deadlines; integrated project plans into program plans.
Software/Tools Used: Informatica Metadata Manager (MM) 9.6.1, Informatica Power Center 9.6.1, UNIX, Teradata 13, Oracle, Hive, and SQL developer.
Pharmacyclics, Sunnyvale, CA 02/2014 – 08/2014
Senior MDM and Data Integration Architect
The Customer Master Data Management project uniquely identifies Parties (Healthcare Providers and Healthcare Organizations). The sales team creates customer data in the Veeva CRM Salesforce application; this data is extracted using Informatica PC and loaded into MDM. The consolidated data in Informatica MDM is published and sent to Veeva and other downstream systems for analytics reporting.
Responsibilities:
Interacted with the Sales Operations team to gather business requirements and translate them into technical requirements.
Created the project plan and schedule and prioritized the tasks.
Developed the logical and physical data model for Party data model.
Document and maintain technical requirements, including data flows, data structures, and data definitions.
Managed technical team on solution implementation.
Managed technical operations of Master Data Management and Aggregate Spend systems.
Configured Siperian Informatica MDM, setup Trust rules, match and merge rules.
Loaded data from various business applications into landing, staging and base objects in the MDM Hub console.
Created the duplicate data report for data stewards to review (see the SQL sketch after this list).
Used Merge Manager and Data Manager to perform merge and un-merge HCP and HCO data.
Developed customized user exits and performed address cleaning using IDQ.
Involved in MDM operations and Informatica mapping enhancements to the existing functionalities.
Laid out a roadmap for Data Governance and MDM to mature the governance program.
Performed code migration using MDM metadata manager.
Supported data issues with the Veeva Salesforce application.
Performed data quality assessments using the IDQ tool and provided recommendations.
Defined and designed a data quality framework that can be leveraged at the enterprise level by any application to perform data quality.
Created profiles, scorecards and data quality rules.
Used AddressDoctor to standardize the addresses.
Defined operational procedures and metrics to ensure effective and efficient operations.
Developed best practices and guidelines for MDM operations.
Performed data analysis, data fixes and master the data as per the ad-hoc requests.
Defined areas for improvement and managed the implementation of improvements.
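For illustration, a minimal sketch of the kind of duplicate-candidate query behind such a steward report; the table and column names (HCP_PARTY, POSTAL_CODE) are assumptions:

```sql
-- Hypothetical duplicate-candidate report for data stewards:
-- HCP records sharing a normalized name and ZIP code.
SELECT UPPER(TRIM(hcp_name)) AS norm_name,
       postal_code,
       COUNT(*)              AS dup_cnt
FROM   hcp_party
GROUP  BY UPPER(TRIM(hcp_name)), postal_code
HAVING COUNT(*) > 1
ORDER  BY dup_cnt DESC;
```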
Software/Tools Used: Informatica MDM 9.6.1, SIF API, Informatica Power Center 9.5, IDQ, Salesforce Veeva, Oracle, Toad, SQL, UNIX, Power Designer and MS Project.
EMC Corporation, Boston, MA 02/2013 – 01/2014
Data Integration Architect
The Enterprise Hub project encompasses the process to operationalize the existing Informatica MDM implementation to enable Customer creation and to manage and govern the Enterprise master data attributes of a customer.
The project will enable the business and end users through:
A new Web Application, designed to leverage the new set of services for Customer creation.
Real Time Data Quality through the use of information quality services, including:
 - Real-time address validation through Address Doctor, with postal directories for 244 countries.
 - Real-time duplicate checks with the Hub.
 - Real-time IM matching and improved D&B matches.
A high availability infrastructure to improve response time and to enable real time capabilities.
Managing customer states through Prospect/Lead/Customer/Partner and between Active and Inactive based on the amount of recent activity.
Extending the data model to include all Enterprise Master Data attributes.
Responsibilities:
Built Informatica mappings to use Address Doctor for address cleansing.
Developed Informatica mappings to get data from the IDQ database for RSA and SAP data.
Enhanced the existing Informatica mappings to receive data from 11i for the defined set of attributes into the Informatica MDM Hub.
Performed data conversion from the Informatica Hub to iMAP.
Created the Data Conversion Design document and obtained sign-off from the client.
Performed code migration from Dev to Test and Performance environment. Check-in, Check-out and apply label.
Adhered to the Informatica MDM architecture to extract data from 11i into the LOAD inbound staging environment for the landing load to the Customer MDM Hub, and extracted data from the Customer MDM Hub into the outbound staging environment. Integrated both batch and real-time data.
Reviewed the enterprise conceptual, logical and physical data model for MDM.
Followed guidelines and best practices for building the data model.
Developed cleanse functions and set up trust rules and match and merge rules in MDM.
Developed data quality rules using IDQ.
Supported functional and system testing and ensured defects were fixed on time.
Provided status update and issue resolution to the management.
Coordinated with business, onsite and offshore team.
Software/Tools Used: Informatica MDM 9.5, Informatica IDD, IDQ, Address doctor, Trillium, Informatica PC 9.1, SIF, Oracle, ERwin, SQL Developer, Control-M, UNIX and HP QC.
Nike Inc. Beaverton, OR 11/2011 – 01/2013
Senior Data Architect
Integration Competency Center (ICC).
Responsibilities:
Developed high level design and technical specifications for Informatica mappings as per the business requirements.
Developed high level design for Source and Target mappings as per the business requirements.
Created the Informatica data quality architecture framework and obtained sign-off.
Developed conceptual, logical and physical data model for MDM.
Developed Informatica ETL mappings to load data from external sources into landing, staging and base object tables in the MDM Hub.
Developed MDM cleanse functions; set up trust rules, validation rules, and match and merge rules.
Performed merge and un-merge of data using Merge Manager and Data Manager.
Configured Business Data Director and executed jobs using batch groups and the batch viewer.
Analyzed the root cause of data issues in the Data Mart reports and fixed the ETL mappings.
Developed conceptual, logical and physical data models.
Developed best practices and guidelines.
Defined the approach for error handling and reconciliation reports.
Supported UAT testing, deployment and production stabilization.
Software/Tools Used: Informatica MDM 9.0/9.1, Informatica 8.6/9.1, Informatica Metadata Manager, Web Services, BDD, Teradata 13, BO 4.0, Oracle, SQL Developer, Erwin, Autosys and UNIX.
Nationwide Financial, Columbus, OH 10/2008 – 08/2011
Senior Data warehouse Consultant/Architect
Responsibilities:
Performed requirement gathering, coding, code review, testing and implementation for Annuity, Life Insurance and Retirement plans.
Used FSLDM for data modeling and developed conceptual, logical and physical data models.
Designed ETL jobs in DataStage for the EDW, which provides data for Business Objects reports.
Customized the Teradata’s FSLDM based on the requirements.
Extensively worked with Source Analyzer, Warehouse Designer, Transformation Designer, Mapping Designer and Mapplet Designer.
Optimized various Mappings, Sessions, Sources and Target Databases.
Implemented Slowly Changing Dimensions (Type 2: versions) to update the dimensional schema (see the SQL sketch after this list).
Extensively involved in the extraction of data from SQL Server, Oracle and flat files and the design of ETL processes using Informatica and DataStage.
Implemented a new project by changing the current DW and prepared a detailed design document.
Created standards and guidelines for data modeling and ETL processes.
Configured Customer and Products in Oracle MDM.
Interacted with end users in a support role to fulfill requests and resolve issues.
Created and reviewed architectural decisions and architecture solutions, and re-engineered architectures to create solution blueprints that meet project requirements.
Evaluated and reviewed design frameworks and methodologies and approved designs to achieve functional and non-functional requirements and conformance to the architecture.
Designed and recommended changes to data warehouse schemas such as Star and Snowflake designs.
Performed data conversion from Oracle to Teradata.
Performed capacity planning, database recovery and archival.
Participated in POCs and architected and validated complex technical solutions.
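For illustration, a minimal two-step sketch of the versioned Type 2 pattern noted above; DIM_POLICY, STG_POLICY, the tracked POLICY_STATUS column and the DIM_POLICY_SEQ sequence are all assumptions:

```sql
-- Step 1 (hypothetical names): expire the current version when a tracked
-- attribute changed in the staging feed.
UPDATE dim_policy d
SET    d.eff_end_dt = TRUNC(SYSDATE) - 1,
       d.curr_flag  = 'N'
WHERE  d.curr_flag = 'Y'
AND    EXISTS (SELECT 1
               FROM   stg_policy s
               WHERE  s.policy_id = d.policy_id
               AND    s.policy_status <> d.policy_status);

-- Step 2: insert a new version for changed and brand-new policies.
INSERT INTO dim_policy
       (policy_key, policy_id, policy_status, version_nbr,
        eff_start_dt, eff_end_dt, curr_flag)
SELECT dim_policy_seq.NEXTVAL,
       s.policy_id,
       s.policy_status,
       NVL((SELECT MAX(d2.version_nbr)
            FROM   dim_policy d2
            WHERE  d2.policy_id = s.policy_id), 0) + 1,
       TRUNC(SYSDATE),
       DATE '9999-12-31',
       'Y'
FROM   stg_policy s
WHERE  NOT EXISTS (SELECT 1
                   FROM   dim_policy d
                   WHERE  d.policy_id = s.policy_id
                   AND    d.curr_flag = 'Y');
```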
Software/Tools Used: Informatica 9.1, Oracle MDM, Oracle 10g, Teradata, Business Objects, Flat files, Erwin 7x, SQL, PL/SQL, UNIX, Unix Shell Scripting, QC and Windows 2000.
Unisys Global Services, Charlotte, NC 02/2006 – 09/2008
Development Manager
Responsibilities:
Worked with Business Analyst and business users to gather requirements and translate the requirement into technical specifications.
Planned, analyzed, designed, constructed, tested and implemented new data feeds.
Prepared metrics reports for each member of the team to generate the effort and schedule variance.
Planned, executed and performed estimation of projects that use Informatica.
Motivated the resources to work as a team and encouraged them to ensure high-quality deliverables.
Responsible for managing the technical project from initiation through implementation, including planning, analysis, design, development and implementation.
Provided overall direction for establishing project requirements, priorities and deadlines; integrated project plans into program plans.
Produced the high-level design and documentation of the Teradata ETL architecture.
Designed and managed physical data model using Erwin for Data Warehouse and Data mart based on user requirements.
Worked on complete SDLC from extraction, Transformation and Loading of data using Informatica and Teradata.
Involved in ETL design for the new requirements and suggested feasible solution, to integrate the core and the new systems.
Developed MLOAD and FLOAD scripts for loading data into Teradata (see the MultiLoad sketch after this list).
Performed performance tuning of the ETL jobs and database.
Performed data quality checks and interacted with the business analysts.
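For illustration, a minimal Teradata MultiLoad (MLOAD) control script of the kind developed here, covering an insert-only load; the database, table, file, and logon values are assumptions:

```sql
-- Hypothetical MLOAD script: bulk-insert a pipe-delimited file into a
-- staging table. All names and credentials are placeholders.
.LOGTABLE edw.policy_ml_log;
.LOGON tdpid/etl_user,password;
.BEGIN IMPORT MLOAD TABLES edw.policy_stg;
.LAYOUT policy_layout;
  .FIELD policy_id  * VARCHAR(18);
  .FIELD policy_amt * VARCHAR(18);
.DML LABEL ins_policy;
  INSERT INTO edw.policy_stg (policy_id, policy_amt)
  VALUES (:policy_id, :policy_amt);
.IMPORT INFILE policy.dat
  FORMAT VARTEXT '|'
  LAYOUT policy_layout
  APPLY ins_policy;
.END MLOAD;
.LOGOFF;
```

FastLoad (FLOAD) follows the same shape but targets empty tables only.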
Software/Tools Used: Informatica Power Center 8.6/8.1, Data Stage, Oracle 10g, SQL Developer, Teradata, Flat files, Erwin 7.1, PL/SQL, UNIX, Unix Shell Scripting, QC and Windows 2000.
TSYS Inc, Columbus, GA 03/2005 – 01/2006
Senior Data Modeler/Tech Lead
Responsibilities:
Tasks included conducting requirement analysis, designing and developing a data model (star schema), creating an Oracle9i database, coding ETL procedures (PL/SQL and Informatica), designing Cognos Impromptu reports and automating production jobs.
Created conceptual, logical and physical data models in third normal form and star schema using Erwin for the staging and OLAP databases (see the DDL sketch after this list).
Created complex mappings to move data from different sources into the Oracle database, using Informatica as the ETL tool to continuously move data from sources into the staging area.
Implemented Slowly Changing Dimensions (Type 2: flag) to update the dimensional schema.
Monitored workflows and collected performance data to maximize the session performance.
Responsible for creating test cases to ensure that data originating from the source reaches the target in the right format.
Responsible for stress testing ETL routines to make sure they don't break under heavy loads.
Involved in business analysis and technical design sessions with business and technical staff to develop requirements documents and ETL specifications.
Reviewed existing data availability and quality and prepared detailed documentation.
Developed various mappings using expression, aggregator, joiner, Java, lookup, filter, router and update strategy transformations, slowly changing dimensions, reusable components, mapplets, sessions and workflows as per the design.
Performed data quality checks and interacted with the business analysts.
Designed processes that transform and load data sources into fact and dimension tables according to best practices.
Implemented database and ETL programs according to design and coding standards and conducted peer design and code reviews.
Maintained the repositories for different environments (DEV, QA and PROD).
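For illustration, a minimal star-schema fragment of the kind modeled here; the table and column names are assumptions:

```sql
-- Hypothetical Oracle DDL: one dimension plus a fact keyed to it.
CREATE TABLE dim_merchant (
  merchant_key  NUMBER        PRIMARY KEY,
  merchant_id   VARCHAR2(18)  NOT NULL,
  merchant_name VARCHAR2(100)
);

CREATE TABLE fact_transaction (
  txn_id       NUMBER       PRIMARY KEY,
  merchant_key NUMBER       NOT NULL REFERENCES dim_merchant,
  date_key     NUMBER       NOT NULL,
  txn_amt      NUMBER(12,2)
);
```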
Software/Tools Used: Informatica Power Center 7.x/8.1, Power Mart, Teradata, Crystal Reports, Cognos, ERWIN, UNIX, Oracle 9i, PL/SQL, VSS and Windows NT.
American Express, Phoenix, AZ 01/2002 – 02/2005
Lead ETL Specialist
Responsibilities:
Responsible for analyzing, designing and developing ETL strategies and processes, writing ETL specifications for developers, ETL and Informatica development, administration and mentoring.
Designed Informatica ETL processes and, using Source Analyzer and Designer, created object definitions and designed and developed complex mappings using expressions, aggregators, update strategy, filters and functions to ensure data movement between applications.
Helped the team in analyzing the data to be able to identify the data quality issues.
Involved in ETL design for the new requirements and created users/groups and folders using Repository Manager.
Used Mapplets, Parameters and Variables to implement Object Orientation techniques and facilitate the reusability of code.
Developed Extract, Transform, and Load (ETL) design specifications.
Developed and executed scripts as pre/post-session commands to schedule loads through the SQL*Loader utility (see the control-file sketch below).
Configured and ran the debugger from within the Mapping Designer.
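For illustration, a minimal SQL*Loader control file of the kind invoked from pre/post-session commands; the file, table, and column names are assumptions:

```sql
-- Hypothetical SQL*Loader control file: append a pipe-delimited
-- extract into a staging table. All names are placeholders.
LOAD DATA
INFILE 'card_txn.dat'
APPEND
INTO TABLE stg_card_txn
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
(
  txn_id,
  card_nbr,
  txn_amt,
  txn_dt DATE "YYYY-MM-DD"
)
```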