
INFORMATICA DEVELOPER

Location: Jonesboro, AR
Posted: February 08, 2021


Kishore M

Phone: 214-***-****

Mail: adj0un@r.postjobfree.com

PROFESSIONAL SUMMARY

8+ years of IT experience in design, analysis, development, documentation, coding, and implementation, including databases, data warehousing, ETL design, Oracle, PL/SQL, SQL Server, SSIS, Informatica MDM (versions 10.3, 10.2), Informatica PowerCenter 9.x/8.x/7.x, and Informatica Data Quality.

Expertise in Master Data Management concepts and methodologies, with the ability to apply this knowledge to building MDM solutions.

Experience in installation and configuration of core Informatica MDM Hub components such as Hub Console, Hub Store, Hub Server, Cleanse Match Server, and Cleanse Adapter on Windows.

Expertise in the Informatica ETL tool, with extensive experience in PowerCenter client tools including Designer, Repository Manager, Workflow Manager, and Workflow Monitor.

Extensively worked with complex mappings using various transformations such as Filter, Joiner, Router, Source Qualifier, Expression, Union, Unconnected/Connected Lookup, Aggregator, Stored Procedure, XML Parser, Normalizer, Sequence Generator, Update Strategy, reusable transformations, and user-defined functions.

Manage and administer JIRA/Confluence/Bitbucket add-ons, plugins, and extensions.

Strong understanding of SDLC processes and the QA lifecycle methodology

Provide configuration update support and functionality checking within the development cycle of ServiceNow.

Educate developers on how to commit their work and how to make use of the CI/CD pipelines that are in place.

Set up full CI/CD pipelines so that each commit a developer makes goes through the standard software lifecycle and is tested thoroughly before it reaches production.

Experience with Snowflake Virtual Warehouses.
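
For context, a minimal sketch of the kind of virtual warehouse definition this refers to; the warehouse name and sizing values are hypothetical placeholders, not settings from any client environment:

CREATE WAREHOUSE IF NOT EXISTS analytics_wh
  WITH WAREHOUSE_SIZE = 'MEDIUM'
       MIN_CLUSTER_COUNT = 1          -- multi-cluster scaling requires Enterprise edition or higher
       MAX_CLUSTER_COUNT = 3
       SCALING_POLICY = 'STANDARD'
       AUTO_SUSPEND = 300             -- suspend after 5 idle minutes to limit credit usage
       AUTO_RESUME = TRUE
       INITIALLY_SUSPENDED = TRUE;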

Created complex mappings in Talend using components such as tMap, tJoin, tReplicate, tParallelize, tAggregateRow, tDie, tUnique, tFlowToIterate, tSort, tFilterRow, tDBInput, tFTPConnection, tFTPGet, tSendMail, tFlowMeterCatcher, tJavaRow, tFileList, tFileCopy, and tFTPClose.

Advanced knowledge and experience in decision support systems (DSS), querying and analysis, Talend, and data warehousing.

Participated in the development, improvement, and maintenance of Snowflake database applications.

Evaluated Snowflake design considerations for any change in the application; built the logical and physical data models for Snowflake as per the required changes; defined the roles and privileges required to access different database objects.

Experience in building Snowpipe.

Experience in using Snowflake Clone and Time Travel.
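
For illustration, a minimal Snowflake sketch of zero-copy cloning and Time Travel; the database, schema, and table names (edw, edw_dev, orders) are hypothetical placeholders:

-- Zero-copy clone of a production database for testing
CREATE OR REPLACE DATABASE edw_dev CLONE edw;

-- Time Travel: query a table as it existed one hour (3,600 seconds) ago
SELECT COUNT(*) FROM edw.public.orders AT (OFFSET => -3600);

-- Recover a table dropped within the retention period
UNDROP TABLE edw.public.orders;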

In-depth understanding of Snowflake multi-cluster warehouse sizing and credit usage.

In-depth understanding of Snowflake cloud technology.

In-depth knowledge of Snowflake database, schema, and table structures.

Expertise in data warehousing concepts such as OLTP/OLAP system study, analysis, and E-R modeling, and in developing database schemas such as star schema and snowflake schema used in relational, dimensional, and multidimensional data modeling.
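
As a small illustration of the star-schema modeling mentioned above, a hedged SQL sketch with hypothetical table and column names; a snowflake schema would further normalize the dimension, for example splitting city and state into their own lookup tables:

-- Denormalized dimension table (star schema)
CREATE TABLE dim_customer (
    customer_key  INTEGER PRIMARY KEY,
    customer_id   VARCHAR(20),
    customer_name VARCHAR(100),
    city          VARCHAR(50),
    state         VARCHAR(2)
);

-- Fact table referencing the dimension by surrogate key
CREATE TABLE fact_sales (
    sale_id      INTEGER,
    customer_key INTEGER REFERENCES dim_customer (customer_key),
    date_key     INTEGER,
    sale_amount  DECIMAL(12,2)
);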

Built several reusable components in IDQ using Standardizers and Reference tables that can be applied directly to standardize and enrich address information.

Understanding of monitoring services (Process, Task, Server) and scheduling ActiveVOS automatic processes.

Experience installing the ActiveVOS Server on WebLogic. Experience in the healthcare and insurance industries. Strong written and verbal collaboration skills.

Expertise in using JIRA with Jenkins and GitHub for real-time bug tracking and issue management.

Helped individual teams set up their repositories in Bitbucket, maintain their code, and set up jobs that make use of the CI/CD environment. Worked with the Informatica Data Quality 9.6.1 (IDQ) toolkit for analysis, data cleansing, data matching, data conversion, exception handling, and reporting and monitoring.

Worked on integration of various data sources with multiple relational databases such as Oracle and SQL Server, and on integrating data from flat files (fixed-width and delimited).

Proficient in data warehouse ETL activities using SQL, PL/SQL, Pro*C, SQL*Loader, C, data structures in C, UNIX scripting, Python scripting, and Perl scripting.

Experienced in mapping techniques for Type 1, Type 2, and Type 3 Slowly Changing Dimensions
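
A minimal, Oracle-style SQL sketch of the Type 2 logic (expire the changed current row, then insert the new version); the table and column names (stg_customer, dim_customer, current_flag) are hypothetical, and in practice the same logic is built with Informatica Lookup/Update Strategy transformations rather than hand-written SQL:

-- Step 1: expire the current row when a tracked attribute has changed
UPDATE dim_customer d
   SET current_flag = 'N',
       end_date     = CURRENT_DATE
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_customer s
                WHERE s.customer_id = d.customer_id
                  AND (s.address <> d.address OR s.phone <> d.phone));

-- Step 2: insert a new current version for new and changed customers
INSERT INTO dim_customer (customer_id, address, phone, start_date, end_date, current_flag)
SELECT s.customer_id, s.address, s.phone, CURRENT_DATE, NULL, 'Y'
  FROM stg_customer s
 WHERE NOT EXISTS (SELECT 1
                     FROM dim_customer d
                    WHERE d.customer_id = s.customer_id
                      AND d.current_flag = 'Y');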

Strong experience in SQL, PL/SQL, Tables, Database Links, Materialized Views, Synonyms, Sequences, Stored Procedures, Functions, Packages, Triggers, Joins, Unions, Cursors, Collections, and Indexes in Oracle

Sound knowledge of Linux/UNIX, Shell scripting, experience in command line utilities like pmcmd to execute workflows in non-windows environments.

Proficiency in working with Teradata utilities like (BTEQ, FASTLOAD, FASTEXPORT, MULTILOAD, Teradata Administrator, SQL Assistant, PMON, Visual Explain).

Implemented change data capture (CDC) using Informatica PowerExchange to load data from the Clarity database into the Teradata warehouse.
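
A hedged, ANSI-style MERGE sketch of how captured changes can be applied to the warehouse target (the actual load used Informatica PowerExchange CDC; stg.member_cdc and edw.member are hypothetical names, and Teradata expects the ON condition to cover the target table's primary index):

MERGE INTO edw.member AS tgt
USING stg.member_cdc AS src
   ON tgt.member_id = src.member_id        -- member_id assumed to be the primary index
WHEN MATCHED THEN UPDATE SET
     member_name = src.member_name,
     status      = src.status,
     update_ts   = CURRENT_TIMESTAMP
WHEN NOT MATCHED THEN INSERT (member_id, member_name, status, update_ts)
     VALUES (src.member_id, src.member_name, src.status, CURRENT_TIMESTAMP);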

Experience in complex quality rule and index design, development and implementation patterns with cleanse, parse, standardization, validation, scorecard, exception, notification and reporting with ETL and Real-Time consideration

Experience in integration of various data sources with Multiple Relational Databases like DB2, Oracle, SQL Server, MS Access, Teradata, Flat Files, XML files and other sources like Salesforce, etc.

Experience in Data profiling and Scorecard preparation by using Informatica Analyst.

Experience in Migrating Data from Legacy systems to Oracle database using SQL*Loader.

Experienced in scheduling sequence and parallel jobs using DataStage Director, UNIX/Linux scripts, and scheduling tools such as Control-M v7/v8 and CA WA Workstation (ESP).

TECHNICAL SKILLS

Operating System: UNIX, Linux, Windows

Programming and Scripting: C, C++, Java, Python, .NET, Perl scripting, shell scripting, XSLT, PL/SQL, T-SQL

Specialist Applications & Software: Informatica MDM (10.3, 10.2, 10.1), Informatica PowerCenter 10.1/10/9.6.1/9.5/9.1/8.6/8.1/7.1, Informatica PowerExchange, Metadata Manager, Informatica Data Quality (IDQ), Informatica Data Explorer (IDE), MDM, SSIS, Salesforce, DataStage, Talend

Data Modelling (working knowledge): Relational modelling, dimensional modelling (star schema, snowflake, facts, dimensions), physical and logical data modelling, ER diagrams

Database tools: SQL Server Management Studio (2008), Oracle SQL Developer (3.0), Toad 11.6 (Oracle), Teradata, AQT v9 (Advanced Query Tool) (Oracle/Netezza), DB Artisan 9.0 (Sybase), SQL Browser (Oracle/Sybase), Visio, Erwin

Scheduling tools: Informatica Scheduler, CA Scheduler (Autosys), ESP, Maestro, Control-M

Conversion/Transformation tools: Informatica Data Transformation, Oxygen XML Editor (v9, v10)

Software Development Methodology: Agile, Waterfall

Domain Expertise: Publishing, Insurance/Finance, Healthcare

Others (working knowledge on some): OBIEE RPD creation, OBIEE, ECM, Informatica Data Transformation XMap, DAC, Rational ClearCase, WS_FTP Pro, DTD

RDBMS: SQL, SQL*Plus, SQL*Loader, Oracle 11g/10g/9i/8i, Teradata, IBM DB2 UDB 8.1/7.0, Sybase 12.5, Netezza v9, MS SQL Server 2000/2005/2008, MS Access 7.0/2000

Cloud Technologies: Snowflake, SnowSQL, Snowpipe, AWS

Data Warehousing: Snowflake, Redshift, Teradata

EDUCATION DETAILS

Bachelor's Degree in Computer Science, JNTU Hyderabad.

PROFESSIONAL EXPERIENCE

Client: Broadridge Financial Solutions Feb 2019 - Present

Role: Informatica Developer

Responsibilities:

Manage and administer JIRA/Confluence/Bitbucket add-ons, plugins, and extensions.

Strong understanding of SDLC processes and the QA lifecycle methodology

Provide configuration update support and functionality checking within the development cycle of ServiceNow.

Educate developers on how to commit their work and how to make use of the CI/CD pipelines that are in place.

Set up full CI/CD pipelines so that each commit a developer makes goes through the standard software lifecycle and is tested thoroughly before it reaches production.

Experience with Snowflake Virtual Warehouses. Migrated servers from 1&1 to Amazon Elastic Compute Cloud (EC2) and databases to Amazon RDS.

Experience in building Snowpipe.
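
A minimal Snowpipe sketch for continuously loading newly arrived files into a landing table; the stage, storage integration, bucket, and table names are hypothetical placeholders:

-- External stage pointing at the landing bucket (assumes an existing storage integration)
CREATE OR REPLACE STAGE edw.raw.member_stage
  URL = 's3://example-bucket/member/'
  STORAGE_INTEGRATION = s3_int;

-- Pipe that auto-ingests files from the stage as they land
CREATE OR REPLACE PIPE edw.raw.member_pipe
  AUTO_INGEST = TRUE
AS
  COPY INTO edw.raw.member_landing
  FROM @edw.raw.member_stage
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);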

Implement Snowflake database security changes as per requirements.

Created groups, roles, and process IDs in Snowflake.

Experience in migrating data into Snowflake on AWS using Talend.

Granted access to existing Snowflake user accounts; implemented user groups, roles, and process IDs. Supported connection issues as requested by application users.

Created stored procedures (automated procedures for user creation, roles, and grants).

Created user-defined functions in Snowflake.
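
For illustration, a minimal sketch of the role, user, and grant DDL behind these bullets; the warehouse, database, role, and user names are hypothetical, and the statements assume the appropriate USERADMIN/SECURITYADMIN privileges:

CREATE ROLE IF NOT EXISTS etl_read;
CREATE USER IF NOT EXISTS report_svc DEFAULT_ROLE = etl_read DEFAULT_WAREHOUSE = analytics_wh;

GRANT ROLE etl_read TO USER report_svc;
GRANT USAGE ON WAREHOUSE analytics_wh TO ROLE etl_read;
GRANT USAGE ON DATABASE edw TO ROLE etl_read;
GRANT USAGE ON SCHEMA edw.public TO ROLE etl_read;
GRANT SELECT ON ALL TABLES IN SCHEMA edw.public TO ROLE etl_read;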

Expertise in architecting, designing, and operationalizing large-scale data and analytics solutions on the Snowflake Cloud Data Warehouse. Actively involved in migrating the data warehouse from on-premises to Snowflake.

Built reusable components in IDQ using Standardizers and Reference tables that can be applied directly to standardize and enrich address information.

Effectively worked in Informatica version-based environment and used deployment groups to migrate the objects.

Used debugger in identifying bugs in existing mappings by analyzing data flow, evaluating transformations.

Extensively worked on Labeler, Parser, Key Generator, Match, Merge, and Consolidation transformations to identify the duplicate records.

Worked on CDC (Change Data Capture) to implement SCD (Slowly Changing Dimensions) Type 1 and Type 2

Extensively used various active and passive transformations like Filter Transformation, Router Transformation, Expression Transformation, Source Qualifier Transformation, Joiner Transformation, and Look up Transformation, Update Strategy Transformation, Sequence Generator Transformation, and Aggregator Transformation.

Worked with Informatica Data Quality (IDQ) toolkit, Analysis, data cleansing, data matching, data conversion, exception handling, and reporting and monitoring capabilities of IDQ.

Designed and coded the automated balancing process for the feeds that go out from the data warehouse.

Implemented the automated balancing and control process, enabling audit, balance, and control for the ETL code.
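
A hedged SQL sketch of the balancing idea: compare source and target row counts and totals and write the result to a control table; the table and column names (stg_claims, dw_claims, etl_balance_log) are hypothetical:

INSERT INTO etl_balance_log (feed_name, run_date, src_rows, tgt_rows, src_amount, tgt_amount)
SELECT 'claims_feed',
       CURRENT_DATE,
       s.row_cnt, t.row_cnt,
       s.total_amt, t.total_amt
  FROM (SELECT COUNT(*) AS row_cnt, SUM(claim_amount) AS total_amt
          FROM stg_claims) s
 CROSS JOIN
       (SELECT COUNT(*) AS row_cnt, SUM(claim_amount) AS total_amt
          FROM dw_claims
         WHERE load_date = CURRENT_DATE) t;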

Wrote UNIX Shell Scripts for Informatica Pre-Session, Post-Session and Autosys scripts for scheduling the jobs (workflows)

Performed tuning of queries, targets, sources, mappings, and sessions.

Used Linux scripts and necessary Test Plans to ensure the successful execution of the data loading process.

Worked with the Quality Assurance team to build the test cases to perform unit, Integration, functional and performance Testing.

Used EME extensively for version control; involved in code reviews and performance tuning strategies at the Ab Initio and database level.

Prepared the error-handling document to maintain the error-handling process, created test cases for the mappings developed, and then created the integration testing document.

Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads, and process flow of the mappings.

Provided 24x7 production support for business users and documented problems and solutions for running the workflows.

Environment: Snowflake, SnowSQL, Teradata SQL Assistant, Informatica MDM 10.3 HF1, Informatica IDQ 10.2 HF1, SQL, UNIX, IDE, Talend, Oracle 12c, CDC, Linux, Perl, AWS, WinSCP, Shell, PL/SQL, Netezza, Teradata, Collibra, Microsoft SQL Server 2008, and Microsoft Visual Studio

Client: Bayview Asset Management, LLC, Miami, FL Dec 2017 - Jan 2019

Role: Sr. Informatica Developer/MDM Consultant

Responsibilities:

Involved in gathering and analyzing business requirements, writing requirement specification documents, and identifying data sources and targets.

Communicated with business customers to discuss the issues and requirements.

Designed, documented, and configured the Informatica MDM Hub to support loading, cleansing of data.

Involved in implementing the Land Process of loading the customer/product Data Set into Informatica MDM from various source systems.

Set up full CI/CD pipelines so that each commit a developer makes goes through the standard software lifecycle and is tested thoroughly before it reaches production.

Experience with Snowflake Virtual Warehouses.

Experience in building Snowpipe.

In-depth knowledge of Snowflake database, schema, and table structures. Built several reusable components. Migrated servers from 1&1 to Amazon Elastic Compute Cloud (EC2) and databases to Amazon RDS.

Built the logical and physical data models for Snowflake as per the required changes.

Refactored Java ETL code to provide several new features such as redundancy, error handling, automation, image manipulation (SCALR), and the addition of the AWS Java SDK to handle the transfer of files to S3.

Imported the IDQ address standardization mappings into Informatica Designer as mapplets.

Utilized Informatica IDQ to complete initial data profiling and to match/remove duplicate data.

Used relational SQL wherever possible to minimize the data transfer over the network.

Identified and validated the Critical Data Elements in IDQ.

Built several reusable components on IDQ using Standardizers and Reference tables which can be applied directly to standardize and enrich Address information.

Effectively worked in Informatica version-based environment and used deployment groups to migrate the objects.

Extensively used transformations such as Filter Transformation, Router Transformation, Expression Transformation, Source Qualifier Transformation, Joiner Transformation, Lookup Transformation, Update Strategy Transformation, Sequence Generator Transformation, and Aggregator Transformation.

Provided support and quality validation through test cases for all stages of Unit and integration testing.

Created, Deployed & Scheduled jobs in Tidal scheduler for integration, User acceptance testing and Production region.

Designed workflows with many sessions using Decision, Assignment, Event Wait, and Event Raise tasks, and used the Informatica scheduler to schedule jobs.

Used the Teradata FastLoad utility to load data into tables.

Used SQL tools like TOAD to run SQL queries and validate the data.

Converted all the jobs scheduled in Maestro to the Autosys scheduler as per the requirements.

Worked on maintaining the master data using Informatica MDM.

Wrote UNIX Shell Scripts for Informatica Pre-Session, Post-Session and Autosys scripts for scheduling the jobs (workflows)

Performed tuning of queries, targets, sources, mappings, and sessions.

Used Linux scripts and necessary Test Plans to ensure the successful execution of the data loading process.

Worked with the Quality Assurance team to build the test cases to perform unit, Integration, functional and performance Testing.

Used EME extensively for version control; involved in code reviews and performance tuning strategies at the Ab Initio and database level.

Prepared the error-handling document to maintain the error-handling process, created test cases for the mappings developed, and then created the integration testing document.

Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads, and process flow of the mappings.

Provided 24x7 production support for business users and documented problems and solutions for running the workflows.

Environment: Snowflake, Redshift, SQL Server, Informatica PowerCenter 10.1, UNIX, SQL, IDQ, IDE, CDC, MDM, Java, Linux, Perl, AWS, WinSCP, Shell, PL/SQL, Netezza, Teradata, Collibra, Microsoft SQL Server 2008, and Microsoft Visual Studio.

Client: Charter Communications, Charlotte, NC Feb 2017 - Dec 2017

Role: Sr. Informatica Developer / Analyst

Responsibilities:

Involved in all phases of SDLC from requirement gathering, design, development, testing, Production, and support for production environment.

Defined, configured, and optimized various MDM processes including landing, staging, base objects, foreign-key relationships, lookups, query groups, queries/custom queries, cleanse functions, batch groups, and packages using the Informatica MDM Hub Console.

Used Informatica Power Center for (ETL) extraction, transformation and loading data from heterogeneous source systems into target database.

Involved in extracting the data from the Flat Files and Relational databases into staging area.

Created Stored Procedures for data transformation purpose.

Involved in dimensional data modeling and in populating the business rules into the repository using mappings for data management.

Worked on Informatica PowerCenter 9.x tools: Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformations.

Used Informatica PowerCenter and Data Quality transformations (Source Qualifier, Expression, Joiner, Filter, Router, Update Strategy, Union, Sorter, Aggregator, Normalizer, Standardizer, Labeler, Parser, Address Validator (Address Doctor), Match, Merge, and Consolidation) to extract, transform, cleanse, and load the data from different sources into DB2, Oracle, Teradata, Netezza, and SQL Server targets.

Created and configured workflows, worklets & Sessions to transport the data to target warehouse Oracle tables using Informatica Workflow Manager.

Experience in ActiveVOS workflow design and creation of Human Tasks. Extensive experience in MDM 10.x with IDQ and ActiveVOS 9.2.

Experience in managing ActiveVOS Central and setting up the Identity Service on the ActiveVOS Console.

Good knowledge of ActiveVOS error and fault handling, event handling, gateways, and control flow.

Understanding of monitoring services (Process, Task, Server) and scheduling ActiveVOS automatic processes.

Experience installing the ActiveVOS Server on WebLogic. Experience in the healthcare and insurance industries. Strong written and verbal collaboration skills.

Worked on Database migration from Teradata legacy system to Netezza and Hadoop.

Worked in building Data Integration and Workflow Solutions and Extract, Transform, and Load (ETL) solutions for data warehousing using SQL Server Integration Service (SSIS).

Used Teradata Utilities (BTEQ, Multi-Load, and Fast-Load) to maintain the database.

Worked with different scheduling tools like Tidal, Tivoli, Control M, Autosys.

Created Tivoli Maestro jobs to schedule Informatica Workflows.

Built a re-usable staging area in Teradata for loading data from multiple source systems using template tables for profiling and cleansing in IDQ.

Created profiles and scorecards to review data quality.

Actively involved in Data validations and unit testing to make sure data is clean and standardized before loading into MDM landing tables.

Actively involved in the exception-handling process using the IDQ Exception transformation after loading the data into MDM, notifying the Data Stewards of all exceptions.

Worked with Autosys as the job scheduler, running the created applications and respective workflows at selected recurring times.

Generated PL/SQL and Shell scripts for scheduling periodic load processes.

Designed and developed UNIX scripts for creating and dropping tables, which are used for scheduling the jobs.

Invoked Informatica workflows using the pmcmd utility from the UNIX scripts.

Wrote pre-session shell scripts to check the session mode (enable/disable) before running/scheduling batches.

Involved in supporting a 24x7 rotation system, with a strong command of the Tivoli scheduling tool.

Involved in Production support activities like batch monitoring process in UNIX.

Prepared Unit test case documents

Environment: Informatica Power Center 9.6.1, UNIX, Oracle, Linux, Perl, Shell, MDM, IDQ, PL/SQL, Tivoli, Oracle 11g/10g, Teradata 14.0.

Client: Allstate Insurance, Southampton, PA Mar 2016 - Feb 2017

Role: Sr. ETL Developer / Analyst

Responsibilities:

Worked with Business Analysts (BA) to analyze data quality issues and find the root cause of each problem, with the proper solution to fix the issue.

Documented the process that resolves the issue, which involves analysis, design, construction, and testing for data quality issues.

Involved in doing the Data model changes and other changes in the Transformation logic in the existing Mappings according to the Business requirements for the Incremental Fixes.

Worked extensively with Informatica tools like Source Analyzer, Warehouse Designer, Transformation Developer, and Mapping Designer.

Extensively used Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ) profiling capabilities to profile various sources, generate scorecards, create and validate rules, and provide data for business analysts to create the rules.

Created Informatica workflows and IDQ mappings for batch and real time.

Extracted data from various heterogeneous sources like DB2, Salesforce, Mainframes, Teradata, and flat files using Informatica PowerCenter and loaded the data into the target database.

Created and Configured Landing Tables, Staging Tables, Base Objects, Foreign key relationships, Queries, Query Groups etc. in MDM.

Defined the Match and Merge Rules in MDM Hub by creating Match Strategy, Match columns and rules for maintaining Data Quality.

Extensively worked with different transformations such as Expression, Aggregator, Sorter, Joiner, Router, Filter, and Union in developing the mappings to migrate the data from source to target.

Used connected and Unconnected Lookup transformations and Lookup Caches in looking the data from relational and Flat Files. Used Update Strategy transformation extensively with DD_INSERT, DD_UPDATE, DD_REJECT, and DD_DELETE.

Extensively Implemented SCD TYPE 2 Mappings for CDC (Change data capture) in EDW.

Involved in doing Unit Testing, Integration Testing, and Data Validation.

Extensively involved in migration of Informatica mappings from Dev to SIT, and from UAT to the Production environment.

Developed UNIX scripts to SFTP, archive, cleanse, and process many flat files.

Created and ran Pre-existing and debug sessions in the Debugger to monitor and test the sessions prior to their normal run in the Workflow Manager.

Extensively worked in migrating the mappings, worklets and workflows within the repository from one folder to another folder as well as among the different repositories.

Created mapping parameters and variables and wrote parameter files.

Used SQL queries and database programming using PL/SQL (writing Packages, Stored Procedures/Functions, and Database Triggers).

Worked on Scheduling Jobs and monitoring them through Control M and CA scheduler tool (Autosys).

Used SQL tools like TOAD to run SQL queries and validate the data in warehouse.

Worked with the SCM code management tool to move the code to Production.

Extensively worked with session logs and workflow logs for error handling and troubleshooting.

Environment: Informatica PowerCenter 9.0.1, Erwin, Teradata, Tidal, SQL Assistant, DB2, XML, Oracle 9i/10g/11g, MQ Series, OBIEE 10.1.3.2, IDQ, MDM, TOAD, and UNIX shell scripts.

Client: MVP Health Care, Schenectady, NY Nov 2014 – Mar 2016

Role: Sr. Informatica Developer/IDQ/MDM

Responsibilities:

Designing the dimensional model and data load process using SCD Type 2 for the quarterly membership reporting purposes.

Derived the dimensions and facts for the given data and loaded them on a regular interval as per the business requirement.

Generating the data feeds from analytical warehouse using required ETL logic to handle data transformations and business constraints while loading from source to target layout.

Worked on Master Data Management (MDM), Hub Development, extract, transform, cleansing, loading the data onto the staging and base object tables.

Extracted data from multiple sources such as Oracle, XML, and Flat Files and loaded the transformed data into targets in Oracle, Flat Files.

Wrote Shell Scripts for Data loading and DDL Scripts.

Worked with Informatica Data Quality (IDQ) toolkit, Analysis, data cleansing, data matching, data conversion, exception handling, and reporting and monitoring capabilities of IDQ.

Designed and coded the automated balancing process for the feeds that go out from the data warehouse.

Implemented the automated balancing and control process, enabling audit, balance, and control for the ETL code.

Hands-on experience with HIPAA transactions such as 270, 271, 272, 273, 274, 275, 276, 277, 834, 835, and 837.

Improved database access performance by tuning the DB access methods, such as creating partitions, using SQL hints, and using proper indexes.
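
Hypothetical Oracle examples of the three techniques mentioned (range partitioning, a local index, and an optimizer hint); the object names are placeholders only:

-- Range-partitioned history table
CREATE TABLE claims_hist (
    claim_id     NUMBER,
    service_date DATE,
    claim_amount NUMBER(12,2)
)
PARTITION BY RANGE (service_date) (
    PARTITION p2015 VALUES LESS THAN (DATE '2016-01-01'),
    PARTITION p2016 VALUES LESS THAN (DATE '2017-01-01')
);

-- Local index on the partition key
CREATE INDEX idx_claims_hist_dt ON claims_hist (service_date) LOCAL;

-- Hint steering the optimizer to the index for a selective date-range scan
SELECT /*+ INDEX(c idx_claims_hist_dt) */ claim_id, claim_amount
  FROM claims_hist c
 WHERE service_date >= DATE '2016-06-01';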

All the jobs are integrated using complex Mappings including Mapplets and Workflows using Informatica power center designer and workflow manager.

Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.

Created ETL mappings, sessions, workflows by extracting data from MDM system, FLASH DC & REM source systems.

Designed and created complex source to target mappings using various transformations inclusive of but not limited to Aggregator, Look Up, Joiner, Source Qualifier, Expression, Sequence Generator, and Router Transformations.

Mapped client processes/databases/data sources/reporting software to HPE’s XIX X12 processing systems (BizTalk/Visual Studio/Oracle SQL/MS SQL/C#/.Net/WSDL/SOAP/Rest/API/XML/XSLT).

Used the debugger to identify bugs in existing mappings by analyzing data flow and evaluating transformations, and created mapplets that provide reusability in mappings.

Analyzing the impact and required changes to incorporate the standards in the existing data warehousing design.

Followed the PDLC process to move the code across the environments through proper approvals and source control environments.

Source control using SCM.

Environment: Informatica Power Center 9.0.1, Erwin 7.2/4.5, Business Objects XI, Unix Shell Scripting, XML, Oracle 11g/10g, DB2 8.0, IDQ, MDM, TOAD, MS Excel, Flat Files, SQL Server 2008/2005, PL/SQL, Windows NT 4.0.

Client: Verizon, Indianapolis, IN Sep 2011 - Sep 2014

Role: ETL Developer / Analyst

Responsibilities:

Involved in the requirement definition and analysis support for Data warehouse efforts.

Documented and translated user requirements into system solutions, developed implementation plan and schedule.

Designed fact and dimension tables for Star Schema to develop the Data warehouse.

Extracted the data from Teradata, SQL Server, Oracle, Files, and Access into Data warehouse.

Created dimensions and facts in physical data model using ERWIN tool.

Used Informatica Designer to create complex mappings using different transformations to move data to a Data Warehouse.

Developed mappings in Informatica to load the data from various sources into the Data Warehouse, using different transformations like Source Qualifier, Look up, Aggregator, Stored Procedure, Update Strategy, Joiner, Filter.

Scheduled the sessions to extract, transform, and load data into the warehouse database based on business requirements.

Loaded the flat files data using Informatica to data warehouse.

Created the global repository, groups, and users, and assigned privileges using the Repository Manager.

Setting up Batches and sessions to schedule the loads at required frequency using Power Center Server Manager.

Handled common data warehousing problems like tracking dimension change using SCD type2 mapping.

Used the Email task for on-success and on-failure notifications.

Used decision task for running different tasks in the same workflow.

Assisted team members with their various Informatica needs.

Developed and maintained technical documentation regarding the extract, transformation, and load process.

Responsible for the development of system test plans, test case creation, monitoring progress of specific testing activities against plan, and successfully completing testing activities within the requisite project timeframes.

Environment: Informatica PowerCenter 8.1, Erwin, Oracle 9i, UNIX, Sybase, MS SQL Server, Windows 2000.


