
Data Informatica

Location:
Brampton, ON, Canada
Posted:
September 28, 2020

Resume:

VIJAYKUMAR PALANISWAMY

**********.***********@*****.***

306-***-****

Profile

**+ years of experience in Data Warehousing, Data Migration, and Data Integration, including:

Administration and the full Software Development Life Cycle (SDLC) under Waterfall, Agile, and CMM processes.

Extensively worked on new installations and upgrades of Informatica 7.x, 8.x, 9.x, 10.x, BDM/EDC 10.1.1, 10.2.1, 10.2.2, and Axon on Sun Solaris, IBM AIX, Red Hat Linux, and Windows Server platforms, including applying patches and hotfixes.

Experienced in installing the Informatica Cloud Secure Agent and performing administration tasks in an IICS (Informatica Intelligent Cloud Services) org to execute jobs on demand and migrate data between cloud and on-premises environments.

Implemented Informatica GRID installations on a shared file system using VERITAS Cluster File System against Oracle RAC and DB2 with High Availability (HA).

Implemented Informatica Big Data Management (BDM) in the cloud (AWS) and on-premises in clustered environments using NFS and GPFS, with the Hive, Blaze, and Spark engines for pushdown execution.

Installed and configured PowerExchange 9.x (LogMiner, Oracle Express) for Oracle CDC and DB2 CDC, and data maps for mainframe Adabas CDC.

Implemented Informatica Data Replication (IDR) to synchronize data in real time.

Upgraded Informatica repositories hosted on Oracle 9i, 10g, and 11g and on DB2 9.1.

Installed and configured the Informatica ILM data masking tool and trained developers on it.

Implemented and maintained Informatica MDM Server for master data management.

Experienced in B2B Data Transformation and performance tuning.

Installed and configured Informatica Web Services Hub, Metadata Manager / Business Glossary, Reporting Services, Data Quality, Data Analyzer, and dashboard services.

Automated scheduled maintenance activities such as shutting down and starting up services.
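
For illustration, a minimal maintenance wrapper of this kind might look like the sketch below; INFA_HOME, the domain name, and the node name are placeholders, and command locations differ slightly between Informatica versions.

#!/bin/sh
# Hypothetical maintenance wrapper to stop and start Informatica services
# around a scheduled window. All paths and names below are placeholders.
INFA_HOME=/opt/informatica
DOMAIN=Domain_ETL
NODE=node01

case "$1" in
  stop)
    "$INFA_HOME/tomcat/bin/infaservice.sh" shutdown
    ;;
  start)
    "$INFA_HOME/tomcat/bin/infaservice.sh" startup
    sleep 120   # give the domain time to come up before checking
    "$INFA_HOME/isp/bin/infacmd.sh" ping -dn "$DOMAIN" -nn "$NODE"
    ;;
  *)
    echo "Usage: $0 {start|stop}" >&2
    exit 1
    ;;
esac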

Created multiplexing environment to support parallel development.

Automated code deployment using Deployment groups and UNIX shell scripts.
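
A hedged sketch of such a deployment wrapper follows; the deployment group name, repository and domain names, and credentials are placeholders, and pmrep option spellings should be verified against the installed PowerCenter version.

#!/bin/sh
# Hypothetical deployment wrapper: connect to the source repository and deploy
# a deployment group to the target repository using an XML control file.
PMREP="$INFA_HOME/server/bin/pmrep"

# Connect to the repository (password supplied via environment variable)
"$PMREP" connect -r REP_DEV -d Domain_ETL -n deploy_user -x "$DEPLOY_PWD" || exit 1

# Deploy the group DG_RELEASE to the production repository
"$PMREP" deploydeploymentgroup -p DG_RELEASE \
    -c dg_release_control.xml \
    -r REP_PROD
rc=$?

"$PMREP" cleanup
exit $rc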

Designed a framework in Informatica to capture job statistics from Informatica repository tables using MX views.
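
As an illustration, a statistics pull against the MX views might look like the following; the connection details are placeholders, and the REP_SESS_LOG columns shown should be verified against the repository version in use.

#!/bin/sh
# Illustrative job-statistics pull from the PowerCenter repository MX views.
# Connection details are placeholders; column names are from the standard
# MX views but may vary by version.
sqlplus -s repo_user/"$REPO_PWD"@REPDB <<'EOF'
SET PAGESIZE 200 LINESIZE 200
SELECT subject_area,
       workflow_name,
       session_name,
       successful_rows,
       failed_rows,
       actual_start
  FROM rep_sess_log
 WHERE actual_start >= TRUNC(SYSDATE) - 1
 ORDER BY actual_start;
EOF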

Data modeling in both Kimball and Inmon styles.

Analysis, Design, Development and Implementation of Data warehouse and Data Integration.

ETL design covering data extraction, transformation, and loading with RDBMS sources, complex/flat files, SAS, and XML.

As an architect, built data models for operational data stores and data warehouses and designed ETL processes for data integration.

Worked in SAS for ETL and reporting purposes.

SQL experience using Netezza, Oracle 11g/10g/9i/8i, SQL, PL/SQL, SQL*Loader, stored procedures, functions, constraints, triggers, indexes (B-tree and bitmap), views, inline views, materialized views, IBM DB2 UDB 8.0/7.0, and MS Access.

Extensively used the Teradata utilities MLoad, BTEQ, FastExport, and FastLoad to design and develop data flows for loading, transforming, and maintaining the data warehouse.

Trained by Informatica on PowerExchange installation, optimization, and development.

Extensive experience in Mainframe using COBOL/DB2 & JCL to integrate data from various source systems into ODS and data hubs.

Strong domain knowledge in Retail, Insurance, Banking and Mining.

Strong experience in training and mentoring teams of up to 12 members, including knowledge transfer.

Attended Informatica World 2013, 2017, 2019.

Education

Bachelor’s Degree in Computer Science and Engineering.

Professional Certification

AICPCU:

Certified in General Insurance

INS 21 - Property and Liability Insurance Principles

INS 22 - Personal Insurance

INS 23 - Commercial Insurance

IBM:

IBM Web Sphere IIS DataStage Enterprise Edition v7.5

SAS:

Statistical Analysis Software

A00-211 - Base SAS Certification

A00-212 - Advanced SAS Certification

Professional Training

Informatica

Professional training with Informatica on PowerExchange CDC for Oracle, DB2, and mainframe Adabas.

Professional Training on B2B to process unstructured data.

Attended a workshop on performance tuning measures at the Informatica Conference 2013.

Attended workshop on Informatica ILM & 9.6.1 features.

Big Data - Hadoop

Pursuing a Big Data / AI / Machine Learning course at the University of Toronto.

Technical Profile

Tools:

PowerExchange 8.1/9.0.1, ILM, IDQ, IDR, ERwin, Microsoft Office, Visio, ES Spark, Aqua Data Studio, TOAD, SQL Developer, SPUFI, ACAPS V6.0, Easytrieve, ChangeMan, Endevor, Control-M, Autosys, Zena, Tidal, Cognos.

Operating:

UNIX (Sun Solaris, AIX), Red Hat Linux, Windows NT/XP, MVS z/OS.

ETL:

Informatica PowerCenter 9.6.1/9.5.1/9.1/8.6/8.1/8.0/7.1.2/7.1.1/7.0/6.2/5.1/1.6, PowerMart 6.2, PowerExchange CDC, DataStage 7.5, SAP BW.

Languages:

SQL, PL/SQL, UNIX Shell scripts, Java, Perl, COBOL, SAS.

Databases:

Hive, Impala, HDFS, Oracle 8i/9i/10g/11g, PostgreSQL, Teradata, DB2 UDB 8.0/7.0, Microsoft SQL Server / Access, flat files, VSAM, SAS files, XML, SAP HANA, IBM Netezza 3-fin and 6-fin appliances.

Industries:

Retail, Insurance, Health Care, Banking.

Employment History

Symcor, Canada (Nov 2018 – Present)

As a financial institution, Symcor has a new EDW built on PostgreSQL and a data lake built in Hadoop, sourced from various systems using Informatica 10.2 and BDM 10.4 as the ETL servers.

The old data warehouse runs on Oracle using Informatica 9.1 for various data integration and transformation needs.

Unstructured data (cheque images) is processed through Data Transformation in PowerCenter and loaded to Hadoop using BDM.

EDC 10.4 and Axon 6.3 are used to publish all the business terms.

TDM is used to prepare test data and mask it for lower environments.

BDM and IICS are used to migrate all the data in Teradata to Hortonworks Hadoop.

As an Informatica Senior ETL Consultant

Populated materialized views built on the POSSE framework.

Multiplexed Informatica to various environments to support parallel development.

Implemented automated code migration using Deployment Groups.

Built the data model for the EDW using Kimball methodology.

Incorporated LDAP synchronization to define and modify user roles in the Administrator console.

Designed the Extraction, Transformation and Loading (ETL) using Informatica Power Center 10.2.

Target definitions include PostgreSQL, XML, and flat files.

Created java Transformations to perform custom functions.

Used deployment groups to promote code across environments.

Informatica PowerExchange CDC is used to implement SCD Type 2 from Oracle.

Implemented SCD Type 4.

Created reusable mappings, worklets, and source/target definitions; used pre-session and post-session variable assignments to workflow variables to pass values between sessions.

Used parameter files to declare global variables and initialize persistent values.
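
A sketch of how such a parameter file could be generated from a shell wrapper is shown below; the folder, workflow, session, and parameter names are purely illustrative.

#!/bin/sh
# Hypothetical generator for a PowerCenter session parameter file.
# Folder, workflow, session, and parameter names are placeholders.
LOAD_DATE=$(date +%Y-%m-%d)

cat > /etl/params/wf_daily_load.parm <<EOF
[DWH_FOLDER.WF:wf_daily_load]
\$\$BATCH_ID=1042
[DWH_FOLDER.WF:wf_daily_load.ST:s_m_load_customer]
\$\$LOAD_DATE=$LOAD_DATE
\$\$SOURCE_SYSTEM=POLICY_ADMIN
EOF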

Worked on UNIX shell scripting for executing Informatica workflows through Tidal.
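
A minimal scheduler-callable wrapper around pmcmd might look like this; the service, domain, folder, and workflow names are placeholders.

#!/bin/sh
# Minimal wrapper a scheduler (Tidal/Autosys) could call to run a
# PowerCenter workflow via pmcmd. All names below are placeholders.
INT_SVC=IS_ETL
DOMAIN=Domain_ETL
FOLDER=DWH_FOLDER
WORKFLOW=wf_daily_load

pmcmd startworkflow \
  -sv "$INT_SVC" -d "$DOMAIN" \
  -u etl_user -p "$ETL_PWD" \
  -f "$FOLDER" -wait "$WORKFLOW"
rc=$?

if [ $rc -ne 0 ]; then
  echo "Workflow $WORKFLOW failed with return code $rc" >&2
fi
exit $rc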

Defined the Daily and weekly Archival mechanism for the Interface data.

Deployment activities are carried out by generating XML exports for every workflow.

GitHub is used for version control.

Implemented Informatica Big Data Management (BDM), EDC, and Axon clusters in the cloud (AWS) and on-premises with the HA and GRID features enabled to execute jobs on the Hive, Spark, and Blaze engines.

Architected server sizing based on the volume of data handled and designed ETL for initial and incremental loads using Hive, Impala, and Kafka for real-time streaming.

Developed Java programs to invoke jobs exposed through the Informatica Web Services Hub for transactional and batch processing, giving the business control over jobs.

Used web services to invoke the rules engine for processing data as part of the SOA services.

Used pushdown optimization to distribute processing to the database and improve performance.

Installed and configured Metadata Manager and Business Glossary 9.6.1.

Successfully migrated a major project from 9.6.1 to 10.4.

Installed and configured IDR (Informatica Data Replication) for real-time data replication.

Implemented DB2 CDC on mainframe z/OS to capture data from the mainframe in real time and load the target in DB2 UDB.

Created support tickets with Informatica to resolve bugs across the Informatica products.

Support BI development using Cognos.

Installed GRID with high availability using GPFS and NFS file systems.

Automated the code deployment process.

Set ETL standards and frameworks for the ETL jobs.

Aviva, Canada (Sep 2013 – Nov 2018)

As a leading insurance provider, Aviva has a new EDW built on Teradata using Informatica 9.5.1 as the ETL server.

The old data warehouse runs on DB2 9.1 using Informatica 9.1 for various data integration and transformation needs.

IDQ is used for all data cleansing tasks and AddressDoctor for address validation.

Metadata Manager and Business Glossary 10.2 are used to publish all the business terms.

The Master Data Management tool is used to integrate data from various systems to provide a single golden view of customer information.

TDM is used to prepare test data and mask it for lower environments.

BDM and IICS are used to migrate all the data in Teradata to Cloudera Hadoop.

As an Informatica Senior ETL Consultant

Implemented Informatica Big Data Management (BDM), EDC, and Axon clusters in the cloud (AWS) and on-premises with the HA and GRID features enabled to execute jobs on the Hive, Spark, and Blaze engines.

Architected server sizing based on the volume of data handled and designed ETL for initial and incremental loads using Hive, Impala, and Kafka for real-time streaming.

Installed Informatica 9.5.1 HotFix 3, PowerExchange 9.5.1 for DB2 on the mainframe, data maps for the mainframe, and DB2 and Oracle CDC.

Improved the performance by capturing data in real-time for Oracle and DB2.

Developed Java programs to invoke jobs exposed through the Informatica Web Services Hub for transactional and batch processing, giving the business control over jobs.

Used web services to invoke the rules engine for processing data as part of the SOA services.

Used pushdown optimization to distribute processing to the database and improve performance.

Installed the Teradata client on the Informatica server to load data into the Teradata database.

Installed and configured Metadata Manager and Business Glossary 9.6.1.

Successfully migrated a major project from 9.1 to 9.6.1.

Installed and configured IDR (Informatica Data Replication) for real-time data replication.

Implemented DB2 CDC on mainframe z/OS to capture data from the mainframe in real time and load the target in DB2 UDB.

Installed and configured the SAS component to read SAS datasets and process them on the Informatica ETL servers.

Support BI development using Cognos.

Configured Informatica MQ plug-in to read data from Message Queues.

Installed and configured Informatica Master Data Management MDM.

Created support tickets with Informatica to resolve bugs across the Informatica products.

Installed GRID with high availability using GPFS and NFS file systems.

Automated the code deployment process.

Performed both in-place and parallel upgrades across various Informatica versions.

Set ETL standards and frameworks for the ETL jobs.

Federated Cooperatives Limited, Canada (August 2012 – Sep 2013)

Built an EDW for the Food, Agri, and Petroleum lines of business. The warehouse, built on a star schema, enables analysts to slice and dice Member, Item, Product, Promotion, and Sales data. MDM is being set up as a hub for various vendor applications to provide a single golden view of the data.

Data integration: migrate data from the mainframe to the new JDA supply chain management system built on Oracle.

As an Informatica Administrator & Senior ETL Consultant

Installed Informatica 9.5.1 HotFix 1, PowerExchange 9.5.1 for Adabas on the mainframe, data maps for the mainframe, and Oracle/DB2 CDC.

Installed Netezza plug-ins for Informatica to load data into the Netezza database.

Developed scripts to load data into Netezza 3-fin and 6-fin appliances using nzload.
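
An illustrative nzload call of this kind is sketched below; host, database, table, and file names are placeholders, and the exact options should be checked against the installed Netezza client.

#!/bin/sh
# Sketch of an nzload bulk load; all names and paths are placeholders.
nzload -host nzhost01 -db EDW_STG -u etl_user -pw "$NZ_PWD" \
  -t STG_SALES \
  -df /data/extracts/sales_20130901.dat \
  -delim '|' \
  -maxErrors 10 \
  -lf /logs/stg_sales_nzload.log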

Worked with IBM on the ETL server sizing and Oracle Database server Sizing.

Built data models for the ODS, EDW, and MDM.

Improved the performance by capturing data in real-time for Oracle and DB2.

Developed Java programs to invoke jobs exposed through the Informatica Web Services Hub for transactional and batch processing, giving the business control over jobs.

Used web services to invoke the rules engine for processing data as part of the SOA services.

Used pushdown optimization to distribute processing to the database and improve performance.

Created support tickets with Informatica to resolve bugs in PowerExchange.

Set ETL standards and frameworks for the ETL jobs.

Tufts (November 2010 – August 2012)

Built an EDW for Membership and Provider data. The warehouse, built on a star schema, enables analysts to slice and dice member and provider data. A future enhancement is to bring claims into the warehouse, which will have various facts based on subject area (Member, Provider, Claims).

As an Informatica Administrator & Senior ETL Consultant

Installed Informatica 9.0.1 HotFix 1 and PowerExchange 9.0.1 with Oracle CDC and data maps for the mainframe.

Upgraded the Informatica repository from oracle 10g to 11g.

Populated materialized views built on the POSSE framework.

Multiplexed Informatica to various environments to support parallel development.

Installed the Informatica ILM data masking tool and trained 15 developers on it.

Implemented automated code migration using Deployment Groups.

Built the data model for the TDW using Kimball methodology.

Incorporated LDAP synchronization to define and modify user roles in the Administrator console.

Designed the Extraction, Transformation and Loading (ETL) using Informatica Power Center 9.1.

Target definitions include Oracle 10g/11g, XML, and flat files.

Created java Transformations to perform custom functions.

Involved in writing PL/SQL stored procedures for efficient processing and movement of data.

Used deployment groups to promote code across environments.

Informatica PowerExchange CDC is used to implement SCD Type 2.

Implemented SCD Type 4.

Support BI development using Cognos.

Created reusable mappings, worklets, and source/target definitions; used pre-session and post-session variable assignments to workflow variables to pass values between sessions.

Used parameter files to declare global variables and initialize persistent values.

Worked on UNIX shell scripting for executing Informatica workflows through Tidal.

Defined the Daily and weekly Archival mechanism for the Interface data.

Deployment activities are carried out by generating XML exports for every workflow.

WinCVS is used for version control.

American Automobile Association, AZ (Jan. 2009 – Nov. 2010)

The Enterprise Billing and Payment (EBP) solution processes insurance (CA Select, AZ Auto) and membership information for AAA. EBP has various interfaces, and data must be migrated from these interfaces to the Exigen system, which processes billing and payment information for AAA. The data arrives in various formats such as XML, Oracle tables, COBOL, and flat files. Informatica is used to load the TEMS database, which is later used to extract data for Exigen.

As an Informatica Administrator and ETL Lead Consultant

Upgraded Informatica 8.1 to 8.6.

Managed user/project folders, code migration, and source/target imports in shared folders.

Installed the Web Services Hub for SOA services.

Developed a framework in Informatica to handle logging of control information.

Designed the tables to store batch event information for the dashboard.

Designed the Extraction, Transformation and Loading (ETL) using Informatica PowerCenter 8.6.

Target definitions include Oracle 11g, XML, and flat files.

Involved in writing PL/SQL stored procedures for efficient processing and movement of data.

Defined the table structures and relationships for the TEMS database to store the input data.

XSDs for XML and copybooks for COBOL files are imported using PowerCenter 8.6.

The SOA services are used to log control information by calling web services, whose WSDLs are imported into PowerCenter 8.6.

Read SAS and VSAM files using PowerExchange 8.1.

Worked as the data modeler for the control job routines and TEMS tables.

Created reusable mappings, worklets, and source/target definitions; used pre-session and post-session variable assignments to workflow variables to pass values between sessions.

Used parameter files to declare global variables and initialize persistent values.

Worked on UNIX shell scripting for executing Informatica workflows through Autosys.

Defined the Daily and weekly Archival mechanism for the Interface data.

Deployment activities are carried out by generating XML exports for every workflow.

ClearCase 7.1 is used for version control.

An automated restart/recovery mechanism is implemented, minimizing production support effort.

Managed a team of 12 members and involved in formulating Design and coding standards as per the organization’s needs.

Independence BlueCross, PA (Sep. 2006 – Dec. 2008)

The STARS application system supports the Marketing Account Reporting (MAR) department's claim analysis and client reporting activities. The STARS application has two major components: MARS (Marketing Account Reporting System) and MRDM (Marketing Reporting Data Mart). The data for MARS and MRDM is obtained from COD (Claims Operations Data), a system built on the mainframe and SAS.

As an Informatica Administrator and Senior ETL Consultant

Installed and maintained Informatica 7.1 and DataStage 7.5.2 servers.

Performed code migrations across DEV/QA/PROD and regular server maintenance activities.

Created folders and managed user and user group access to objects based on LDAP settings.

ETL system maintenance involving monitoring, restarting, and rescheduling ETL processes and applying bug fixes in Informatica 7.1.

Designed the ETL process to migrate the data to the new EDW in DataStage 7.5.2.

Converted the ETL code base from Informatica 7.1 to DataStage 7.5.2 (Server and Parallel Extender).

Imported COBOL copybooks for VSAM and flat files to load data generated from the mainframe.

Architected and developed FastLoad and MultiLoad control scripts and developed BTEQ scripts to process the data on the staging server.
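
For illustration, a BTEQ step of this kind, wrapped in a shell script, might look like the sketch below; the TDPID, credentials, and table names are placeholders.

#!/bin/sh
# Illustrative BTEQ step to summarize staged data; all names are placeholders.
bteq <<EOF
.LOGON tdprod/etl_user,${TD_PWD};

INSERT INTO MAR_STG.CLAIM_SUMMARY
SELECT member_id,
       COUNT(*)     AS claim_cnt,
       SUM(paid_amt) AS paid_amt
FROM   MAR_STG.CLAIM_DETAIL
GROUP  BY member_id;

.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF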

Used OCI, Change Capture/Apply, Hashed File, Transformer, and Sort stages.

Modified the configuration file to improve pipeline and partition parallelism.

Tuned stage partitioning mechanisms such as hash, round robin, and DB2 partitioning to improve performance.

Eliminated the mainframe programs that created flat files from the DB2 database by pulling data from mainframe DB2 directly using the DB2 stage.

The SAS stage is used to read SAS files, eliminating the existing process of exporting SAS files to flat files. Used SAS scripts for reconciliation.

Used SQL*Loader for bulk load operations.
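
A sketch of such a SQL*Loader bulk load follows; the control file, table, and column names are illustrative placeholders.

#!/bin/sh
# Sketch of a SQL*Loader bulk load; control file contents and connection
# details are placeholders.
cat > stg_claims.ctl <<'EOF'
LOAD DATA
INFILE 'claims_extract.dat'
APPEND INTO TABLE stg_claims
FIELDS TERMINATED BY '|' TRAILING NULLCOLS
(
  claim_id,
  member_id,
  paid_amt,
  service_dt DATE 'YYYY-MM-DD'
)
EOF

sqlldr userid=stg_user/"$STG_PWD"@MRDM \
  control=stg_claims.ctl \
  log=stg_claims.log \
  bad=stg_claims.bad \
  errors=50 direct=true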

Wrote UNIX scripts and Wrapper scripts to control the workflow execution.

Control-M is used as scheduler for managing the Script execution.

PVCS is used as the version controller.

Coded COBOL/DB2 programs on the mainframe and executed them through JCL.

Trained by SAS on SAS-BI 9.0

Managed a team of 7 members.

ABN AMRO, IL (July 2005- April 2006)

ABN AMRO offers a wide variety of retail banking products to its customers, including checking and savings accounts. The Infopoint application handles transactions received at the branch teller and processed at the automated clearing house. The EDW is built to bring all financial transactions and customer information into the data warehouse on a daily basis; once the warehouse load completes, the data mart load starts.

As an Informatica ETL Consultant

Analyzed the functional specs provided by the architect and created technical specs for mappings.

Designed, developed and enhanced Mappings using Informatica PowerCenter 7.1 to load data from Source systems to ODS and then to Data Mart.

Designed and developed PL/SQL scripts for Data Import/Export in oracle 9i.

Performance tuning of the Informatica mappings using various components like Parameter files, Variables and Dynamic Cache.

Developed reusable sessions and mappings.

Performance tuning using round robin, hash auto key, Key range partitioning.

Used shell scripts for automating the execution of maps.

Trained the new team members on the Business functionality and transitioned the Application knowledge.

JPMorgan Chase, NY (August 2003 - June 2005)

Chase Automotive Finance uses the Automated Credit Application Processing System (ACAPS), an Automated Financial Systems (AFS) application that processes loan applications received from various sources, performs auto-decisioning, sends the application status to the dealer, and then books the deal. The Auto Finance warehouse is built to capture auto loans and to build various marts that satisfy auto loan and home loan analytical reports.

As an Informatica ETL Consultant

Prepared design document for ETL process to build new datamarts.

Extensively involved in extracting data from VSAM and flat files by writing COBOL/DB2 programs on the mainframe.

Implemented materialized views using PL/SQL to load extended tables in Oracle 9i.
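
An illustrative materialized view definition and refresh of this kind is sketched below; the owner, view, and table names are placeholders.

#!/bin/sh
# Illustrative materialized view creation and on-demand refresh via sqlplus;
# schema, view, and table names are placeholders.
sqlplus -s dw_owner/"$DW_PWD"@AFWDB <<'EOF'
CREATE MATERIALIZED VIEW mv_auto_loan_summary
  BUILD IMMEDIATE
  REFRESH COMPLETE ON DEMAND
AS
SELECT dealer_id,
       loan_status,
       COUNT(*)      AS loan_cnt,
       SUM(loan_amt) AS total_loan_amt
  FROM fct_auto_loan
 GROUP BY dealer_id, loan_status;

-- Complete refresh invoked from the nightly batch
EXEC DBMS_MVIEW.REFRESH('MV_AUTO_LOAN_SUMMARY', 'C');
EOF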

Developed various mappings using Aggregator, Joiner, Lookup, Filter, Router, and Update Strategy transformations.

Optimized various Mappings, Sessions, Sources and Target Databases.

Incorporated repeated functionalities into Worklets.

Developed simple & complex mappings using Informatica to load Dimension & Fact tables as per STAR Schema techniques.

Improved the performance by incorporating complex transformations at the Database Level using PL/SQL Stored Procedures.

Designed and configured sessions with effective caching and logging.

TJ X, Boston (August 2002 – July 2003)

TJX is one of the largest retailers of apparel and home fashions in the world, operating six divisions: T.J. Maxx, Marshalls, HomeGoods, and A.J. Wright in the United States, Winners in Canada, and T.K. Maxx in Europe. The Purchase Order Warehouse system is built to track product quantities at various stores for future order placement.

As an Informatica ETL Consultant,

Involved in analyzing ETL functionality on the mainframe to extract business rules and in preparing design documents for development in PowerCenter 6.1.

Implemented documentation standards and other practices to make mappings easier to maintain.

Extracted data from legacy systems such as flat files, with DB2 as the target.

Involved in preparation of Data Staging area and Staging Tables.

Designed and developed various mappings using transformations such as Aggregator, Filter, Sequence Generator, and Lookup to build business rules.

Implemented various Data Transformations using Slowly Changing Dimensions

Implemented performance strategies such as partition parallelism, replaced lookups with joins based on data volume, and simplified mappings by removing unneeded transformations.

Prepared optimized scheduling to avoid deadlocks.

Crystal Connexions, India (August 2000 – July 2002)

Crystal Connexions is a leading fashion design house creating crystal work for various costumes. The warehouse is built on a star schema data model to store customer information, designer information, design details, and online transactions.

As an Informatica ETL Consultant,

Involved in creating a near-real-time data warehouse using PowerCenter 1.6; the target database is Oracle 8i.

Created Informatica mappings that sourced from Flat Files and Oracle Database.

Implemented various Data Transformations using Slowly Changing Dimensions.

Debugged and Tested mappings.

Developed PL/SQL, Perl, Shell, SQL and Informatica Scripts as needed.

Created an archival process to keep storage usage at an optimum level.

Converted the ETL code-base to PowerCenter 5.1

Used Warehouse Architect to Maintain Data Dictionary.

References

To be supplied upon request.


