Yashwanth
Sr ETL Developer
**********@*********.***
Professional Summary:
●Around 8 years of experience in Information Technology with a strong background in developing BI applications across verticals such as Healthcare, Insurance, Finance, Banking, and Pharmacy.
●Experience in Analysis, Design, and Development of various business applications on different data warehousing platforms using Informatica 9.x/8.x/7.1, Oracle, Sybase, Teradata, and SQL Server.
●Extensive experience in using Informatica tool for implementation of ETL methodology in Data Extraction, Transformation and Loading.
●Expertise in Data Warehousing, Data Migration, and Data Integration using Business Intelligence (BI) tools such as Informatica Power Center, B2B Data Transformation, Informatica Data Quality, MDM, SSIS, and OBIEE.
●Experience in Business Intelligence application design, development, and maintenance using SQL Server Integration Services (SSIS) and SQL Server Reporting Services (SSRS).
●Extensive knowledge of the SDLC; involved in all phases from Requirement Gathering to Production Deployment.
●Extensive knowledge of relational and dimensional data modeling, star schema/snowflake schema, fact and dimension tables, and process mapping using top-down and bottom-up approaches.
●Experience in using Informatica Client Tools - Designer, Source Analyzer, Target Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Workflow Manager and Workflow Monitor, Analyst, Developer tools.
●Developed various mappings using different transformations like Source Qualifier, Expression, Filter, Joiner, Router, Union, Unconnected / Connected Lookups and Aggregator.
●Experience in creating High Level Design and Detailed Design in the Design phase.
●Experience in integration of various data sources like Oracle, DB2, MS SQL Server and Flat Files.
●Extensive experience in requirements gathering, analysis, impact analysis, design, development, Quality Assurance/Quality Control, mainframe testing (unit, system, and integration testing, plus UAT support), QA testing, implementation, project management, defect tracking, causal analysis of defects, and project status reporting.
●Hands-on experience in SAS Base, SAS Macros, UNIX, SQL, and Teradata; worked with business users daily to resolve their technical queries.
●Experience in identifying Bottlenecks in ETL Processes and Performance tuning of the production applications using Database Tuning, Partitioning, Index Usage, Aggregate Tables, Session partitioning, Load strategies, commit intervals and transformation tuning.
●Supported MDM initiatives using Oracle databases and Informatica Data Quality (IDQ).
●Experience in SQL and PL/SQL - Stored Procedures, Triggers, Packages and Functions.
●Extracted data from multiple operational sources to load the staging area, Data Warehouse, and data marts using CDC/SCD (Type 1/Type 2) loads (a minimal SQL sketch of an SCD Type 2 load follows this summary).
●Extensively worked on Dimensional Modeling, Data Migration, Data Cleansing, and Data staging for operational sources using ETL and data mining features for data warehouses.
●Extensively involved in UNIX shell programming.
●Experience in creating Reusable Tasks (Sessions, Command, Email) and Non-Reusable Tasks (Decision, Event Wait, Event Raise, Timer, Assignment, Worklet, Control).
●Investigated Infosphere Datastage job issues to identify root causes and applied fixes to keep the Infosphere Datastage jobs running smoothly.
●Implemented performance-tuning techniques at application, database and system levels.
●Provided production support to the system and handled production queries/defects.
●Skilled in developing Test Plans, Creating and Executing Test Cases.
●Experience working in Scrum & agile methodology and ability to manage change effectively.
●Experience in replication using Oracle GoldenGate and materialized views.
●Experience in Informatica B2B Data Exchange using Unstructured, Structured Data sets.
●Excellent communication and interpersonal skills; quickly assimilates the latest technology concepts and ideas.
●Create, maintain, and monitor TIDAL jobs for QNXT and UNIX, SQL, FTP, and SFTP operations; respond to requests and escalate to the appropriate support group when necessary.
●Create and debug TIDAL jobs on a weekly basis to automate new processes in UNIX, SQL, FTP and SFTP based on business requests or internal Information Technology (IT) requests.
●Worked with various scheduling solutions including Control-M, Autosys, Tidal, Enterprise scheduler.
●Experience with cloud stacks such as Amazon AWS.
●Involved in migrating physical Linux/Windows servers to cloud (AWS) and testing.
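The SCD Type 2 loads noted above were built as Informatica mappings; as a rough illustration of the underlying pattern, here is a minimal SQL sketch assuming hypothetical STG_CUSTOMER and DIM_CUSTOMER tables and tracked columns (address, status).

```sql
-- Illustrative SCD Type 2 analogue only; table and column names are hypothetical.
-- Step 1: expire the current dimension row when a tracked attribute changes.
UPDATE dim_customer d
   SET d.current_flag     = 'N',
       d.effective_end_dt = TRUNC(SYSDATE) - 1
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_customer s
                WHERE s.customer_id = d.customer_id
                  AND (s.address <> d.address OR s.status <> d.status));

-- Step 2: insert a new current row for changed or brand-new customers.
INSERT INTO dim_customer
       (customer_key, customer_id, address, status,
        effective_start_dt, effective_end_dt, current_flag)
SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address, s.status,
       TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
  FROM stg_customer s
 WHERE NOT EXISTS (SELECT 1
                     FROM dim_customer d
                    WHERE d.customer_id = s.customer_id
                      AND d.current_flag = 'Y'
                      AND d.address = s.address
                      AND d.status  = s.status);
```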
Technical skills:
ETL Tools/Data Modeling tools
Informatica Power Center, Power Exchange 9.5/9.1/8.6/8.1/7.1 (Repository Manager, Designer, Server Manager, Workflow Monitor, Workflow Manager), IDQ, MSBI; Erwin; Fact and Dimension tables, Physical and Logical Star Schema data modeling.
Databases
Oracle 11g/10g/9i, Teradata, MS SQL Server, MS Access, SQL, PL/SQL
Tools
Toad, SQL Developer, Visio, Teradata SQL Assistant
Languages
SQL, PL/SQL, T-SQL, UNIX, Shell Scripting, Batch Scripting
Operating Systems
UNIX, Windows Server 2008/2003, Linux.
Job Scheduling
Informatica Scheduler, Tidal Enterprise Scheduler, Control M, CA Autosys
Education:
Bachelor’s in Computer Science.
Professional Experience:
Endo Pharmaceuticals, Malvern, PA Jan 2016 – Present
Role: Sr. ETL Developer
Endo is a leading health provider company in Malvern committed to helping Medicare and Medicaid beneficiaries. The goal of this project is to enable access to business-determined encounter-related attributes and associated code sets in the user interfaces. Data is obtained directly from the appropriate source systems (EPIC and MPAC) and maintained in a single source of truth (HDM) before populating individual data marts.
Responsibilities:
Participated in requirement meetings with Business Analysts to understand analytical content needs.
Assisted in documenting these requirements, resolving ambiguities and conflicts, and ensuring requirements are complete.
Worked directly with Client Operations and the Client Tech team to understand business scope; performed requirement analysis and converted business requirements into technical terms; worked with the offshore team on coding and unit testing.
Responsible for development, support, and maintenance of the ETL (Extract, Transform and Load) processes using Informatica Power Center.
Used Oracle Data Integrator (ODI) 12.1.3 to develop processes for extracting, cleansing, transforming, integrating and loading data into Data Warehouses.
Integrated usage tracking with OBIEE and handled metrics automation and scheduling.
Extensive ETL experience using DTS/SSIS for Data Extractions, Transformations and Loads.
Design database table structures for transactional and reference data sources.
Expertise in Merging data from various Heterogeneous Data sources, Populating Dimension and Fact tables in Data warehouses and Data Marts, Cleaning and Standardizing data loaded into OLTP and OLAP databases using SSIS.
Transformed data from one server to other servers using tools like Bulk Copy Program (BCP), Bulk Insert, Data Replication, Data Transformation Services (DTS) and SQL Server Integration Services (SSIS)
Created mappings for HL7 files integration projects.
Involved in creating Jobs, SQL Mail Agent, Alerts, and scheduling SSIS packages.
Built the Dimension and Fact table load processes and reporting processes using Informatica.
Involved in the data analysis for source and target systems and good understanding of Data Warehousing concepts, staging tables, Dimensions, Facts and Star Schema, Snowflake Schema.
Initiate the data modeling sessions, to design and build/append appropriate data mart models to support the reporting needs of applications.
Extracted data from various data sources such as Oracle, SQL Server, Flat files, transformed, and loaded into targets using Informatica.
Created users, groups and gave read/write permissions on the respective Folders of repository.
Designed Mappings by including the logic of restart.
Created source and Target Definitions, Reusable transformations, mapplets and worklets.
Created Mappings and used transformations like Source Qualifier, Filter, Update Strategy, Lookup, Expression, Router, Joiner, Normalizer, Aggregator, Sequence Generator, and Address Validator.
Developed mappings to load Fact and Dimension tables, SCD Type 1 and SCD Type 2 dimensions and Incremental loading and unit tested the mappings.
Configure and design File Gateway accounts utilizing HTTPS, FTPS, SFTP protocols
Design routing channels, monitor file exchange activity, and troubleshoot data errors in File Gateway
Assign permissions for folder path on trading partner mailboxes in Sterling Integrator
Hands-on experience with IBM Infosphere CDC for capturing multiple changes.
Work on converting manual file transfer process to IBM Sterling File Gateway.
Create templates, routing channels and partners in Sterling File Gateway
Worked on IBM Infosphere CDC to capture changes, developed subscriptions to map the source and the target table to write the changed data into the file or the table using the direct connect.
Used Informatica B2B and Autosys to create jobs to run the workflows.
Exported the workflows in B2B and created endpoints in B2B to trigger the workflows.
Worked with Informatica Data Quality 9.6 (IDQ) toolkit, Analysis, data cleansing, data matching, data conversion, exception handling, and reporting and monitoring capabilities of IDQ 9.6.
Identified and eliminated duplicates in datasets through IDQ 9.6 components such as Edit Distance, Jaro Distance, and Mixed Field Matcher, enabling a single view of customers and helping control mailing-list costs by preventing duplicate mailings (a SQL analogue of this similarity scoring is sketched at the end of this section).
Working with Informatica Developer (IDQ) tool to ensure data quality to the consumers.
Involved in massive data profiling using IDQ (Analyst tool) prior to data staging.
Created profiles and score cards for the users using IDQ.
Created Technical Specification Documents and Solution Design Documents to outline the implementation plans for the requirements.
Extensively worked on Data Cleansing, Data Analysis and Monitoring Using Data Quality.
Created complex, reusable SAS macros and SAS data sets, performed analysis, and generated graphical financial reports.
Created UNIX shell scripts to handle pre- and post-session tasks and to SFTP files to the Informatica server and database servers.
MFT (managed file transfer) was used for the transmission of the extracted files to the vendor.
Interacted with technical, functional, and business audiences across different phases of the project life cycle.
Involved in the Performance Tuning of the Database and Informatica. Improved performance by identifying and rectifying performance bottlenecks.
Managed Scheduling of Tasks to run any time without any operator intervention.
Prepared Unit/ Systems Test Plan and the test cases for the developed mappings.
Replicated data by extracting changes and applying them to production using Oracle GoldenGate.
Managed Capture (Extract), Data Pump, and Replicat processes for data replication in GoldenGate.
Created DEV/QA/Test environments and trained IT employees on building jobs, following best practices, and utilizing Tidal's features in their development.
Worked with Job Alerts, Job Classes, Queues, Job Actions, Variables, Calendars, File Events, and on-demand email jobs, among the features deployed in our TIDAL environment.
MDM Developer:
Defined system trust and validation rules for base object columns.
Involved in defining match rule sets and match columns for the Match and Merge process, improving data quality through batch analysis.
Implemented the pre-land and land process of loading the dataset into Informatica MDM Hub.
Configured and documented the Informatica MDM Hub to perform loading, cleansing, matching, merging and publication of MDM data.
Executed batch jobs and batch job group using Batch Viewer.
Implemented parallelism for executing batch jobs using Merge Manager.
Performed Data Steward activities like manual merge/unmerge and uploading data.
Implemented auto merge and manual merge for the Best Version of the Truth.
Created the Data Validation document, Unit Test Case document, Technical Design document, Migration Request document, and Knowledge Transfer document.
Environment: Informatica Power Center 9.5/9.1, IDQ, SQL Server, UNIX, Toad, SQL Developer, SSIS, Putty, SFTP/FTP
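The duplicate matching mentioned in this section was done with IDQ components; as a hedged SQL analogue only, the sketch below scores name similarity with Oracle's UTL_MATCH package against a hypothetical CUSTOMER_STG table.

```sql
-- Illustrative only: flag likely duplicate customers by name similarity,
-- analogous to the Edit Distance / Jaro Distance matchers used in IDQ.
SELECT a.customer_id                                                AS survivor_id,
       b.customer_id                                                AS candidate_id,
       UTL_MATCH.EDIT_DISTANCE(a.full_name, b.full_name)            AS edit_dist,
       UTL_MATCH.JARO_WINKLER_SIMILARITY(a.full_name, b.full_name)  AS jw_score
  FROM customer_stg a
  JOIN customer_stg b
    ON a.customer_id < b.customer_id        -- compare each pair only once
   AND a.postal_code = b.postal_code        -- cheap blocking key to limit comparisons
 WHERE UTL_MATCH.JARO_WINKLER_SIMILARITY(a.full_name, b.full_name) >= 90;
```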
DaVita Health, Dallas, TX Dec 2013 – Dec 2015
Role: Sr. ETL Developer
DaVita Health Analytics is a multinational Health Care company. The Company delivers analytic tools, benchmarks, research, and services to the healthcare industry, including hospitals, government agencies, employers, health plans, clinicians, pharmaceutical, biotech and medical device companies.
Responsibilities:
Worked closely with Development managers to evaluate the overall project timeline.
Interacted with the users and made changes to Informatica mappings according to the business requirements.
Experienced in Database programming for Data Warehouses (Schemas), proficient in dimensional modeling (Star Schema modeling, and Snowflake modeling)
Developed the Informatica Mappings by usage of Aggregator, SQL overrides usage in Lookups, source filter usage in Source qualifiers and data flow management into multiple targets using Router.
Worked on designing, creating, and deploying SSIS packages for transferring data from flat files, Excel spreadsheets, and heterogeneous data to and from SQL Server.
Created SSIS packages to load data from flat files to data warehouse databases using Lookup, Fuzzy Lookup, Derived Column, Conditional Split, Term Extraction, Aggregate, and Pivot transformations.
Configured SSIS packages using the Package Configuration Wizard to allow packages to run in different environments.
Implemented Event Handlers and Error Handling in SSIS packages and notified process results to various user groups.
Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Lookup, Router and Aggregator to create robust mappings in the Informatica Power Center Designer.
Extensively used various Data Cleansing and Data Conversion functions like LTRIM, RTRIM, ISNULL, ISDATE, TO_DATE, Decode, Substr, Instr, and IIF functions in Expression Transformation.
Responsible for best practices like naming conventions, Performance tuning, and Error Handling
Responsible for Performance Tuning at the Source level, Target level, Mapping Level and Session Level.
Configured the Infosphere Datastage 11.5 suite on various environments and resolved installation and upgrade issues whenever they were encountered.
Created new Infosphere Datastage projects on Infosphere Datastage 11.5.0.1 alongside existing projects on Infosphere Datastage 8.7, which were moved to the new version without executables, compiled in the Development environment, and promoted to INT and then to the other non-production and production environments.
Used Router, joiner, stored procedures, look up, filter, source qualifier and update strategy transformations extensively.
Involved in data standardization, such as changing a reference data set to a new standard.
Checked data validated by third parties for accuracy (DQ) before providing it to the internal transformations.
Used Address validator transformation in IDQ.
Involved in massive data profiling using IDQ (Analyst tool) prior to data staging.
Created partitioned tables and partitioned indexes for manageability and scalability of the application (see the DDL sketch at the end of this section). Made use of Post-Session Success and Post-Session Failure commands in the session task to execute scripts needed for cleanup and update purposes.
Developed various mapping and tuning using Oracle and SQL*Plus in the ETL process.
Implemented slowly changing dimensions Type 2 using ETL Informatica tool.
Designed best practices on Process Sequence, Dictionaries, Data Quality Lifecycles, Naming Convention, and Version Control.
Created Use-Case Documents to explain and outline data behavior.
Working with Informatica Developer (IDQ) tool to ensure data quality to the consumers.
Used the Address Validator transformation to validate customer addresses from various countries through the SOAP interface.
Created PL/SQL programs such as procedures, functions, packages, and cursors to extract data from the target system.
MFT (managed file transfer) was used for the transmission of the extracted files to the vendor.
Set up SFTP connections to various vendors for receiving and sending of encrypted data as part of MFT Integration team support.
Performed analysis, design, and implementation of batch processing workflows using Cisco Tidal Enterprise Scheduler and monitored daily cycles.
Scheduled batch jobs through the Tidal Scheduler or the UNIX batch server to retrieve output files and send them to the requesting parties.
Utilized Tidal Enterprise Scheduler functions to establish job streams with complex dependencies, manage intricate calendar schedules, and perform Tidal agent installations with specific deliverables for 50-plus application teams throughout the environment.
Involved in Unit testing and system integration testing (SIT) of Informatica and MFT projects.
Designed the metadata, workflow, and models in Excel with great transparency and used SAS macros to process the Excel file and generate ETL, model scoring, and cash flow analysis SAS code.
Used PROC SQL/PROC IMPORT to import data from mainframe Oracle Clinical databases and MS Excel sheets.
Involved in deployment of IDQ mappings to application and to different environments.
Logged defects and submitted change requests using the defects module of Test Director.
Worked with different Informatica tuning issues and fine-tuned the transformations to make them more efficient in terms of performance.
Environment: Informatica Power Center 9.1, IDQ, SAS, Oracle 11g, UNIX, PLSQL, SQL* PLUS, SQL SERVER 2008 R2, SSIS, TOAD, MS Excel 2007.
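A minimal DDL sketch of the range-partitioned table and local index pattern referenced in this section, using a hypothetical CLAIM_FACT table; the actual partition keys, boundaries, and names would differ.

```sql
-- Hypothetical range-partitioned fact table with a local (per-partition) index.
CREATE TABLE claim_fact (
    claim_id    NUMBER        NOT NULL,
    member_id   NUMBER        NOT NULL,
    service_dt  DATE          NOT NULL,
    paid_amt    NUMBER(12,2)
)
PARTITION BY RANGE (service_dt) (
    PARTITION p2014 VALUES LESS THAN (DATE '2015-01-01'),
    PARTITION p2015 VALUES LESS THAN (DATE '2016-01-01'),
    PARTITION pmax  VALUES LESS THAN (MAXVALUE)
);

CREATE INDEX claim_fact_member_ix
    ON claim_fact (member_id)
    LOCAL;   -- one index segment per partition simplifies maintenance per load
```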
Sutter Health, Walnut Creek, CA Aug 2012 – Nov 2013
Role: Informatica Developer
This Provisioning and Billing Management project involves integrating the Order Management Systems and the Support Tracking system defined for user agents and customer representatives, who can find the provisioned order status for different products. The system integrates with different third-party systems for validation, process management, and order completion.
Responsibilities:
Visualize a data architecture design from high level to low level, and design performance objects for each level
Troubleshot database issues related to performance, queries, and stored procedures.
Develop and maintain data marts on an enterprise data warehouse to support various defined dashboards such as Imperative for Quality (IQ) program.
Designed and developed data models and database architecture by translating abstract relationships into logical structures.
Proficient in defining Key Performance Metrics (KPIs), facts, dimensions, hierarchies and developing Star and Snow Flake schemas.
Designed and developed complex Informatica mappings using expressions, aggregators, filters, lookups, and stored procedures, making extensive use of flat files, to ensure movement of the data between various applications.
Developed several forms and reports in the process. Also converted several standalone PL/SQL procedures/functions to packaged procedures for code reusability, modularity, and control.
Designed Complex SSIS ETL packages to load the data from different sources like OLEDB, EXCEL, FLAT Files, RAW Files and DB2 into various Servers and Databases.
Creating and running SSIS Packages for Import and Export of Heterogeneous Data.
Scheduling and running jobs in the background, taking regular backups of the systems and recover data during failures and Created packages in SSIS with error handling.
Scheduled Jobs for executing the stored SSIS packages, which were developed to update the Database on Daily basis.
Worked on SSIS script task, look up transformations and data flow tasks using T-SQL and Visual Basic (VB) scripts
Extracted data from source systems to a DataMart running on Teradata.
Worked on extracting data from legacy systems such as mainframes and Oracle into Teradata.
Create and Modify JCL Jobs as required in Test and Production.
Source data analysis and data profiling for data warehouse projects.
Design and implement all stages of data life cycle. Maintain coding and naming standards.
Developed end-to-end ETL processes for Trade Management Data Mart Using Informatica.
Implemented various loads like Daily Loads, Weekly Loads, and Quarterly Loads using Incremental Loading Strategy.
Used Informatica Version Control for checking in all versions of the objects used in creating the mappings, workflows to keep track of the changes in the development, test and production environment.
Extensively worked with Slowly Changing Dimensions Type1 and Type2.
Worked extensively on PL/SQL as part of the process to develop several scripts to handle different scenarios.
Worked extensively on Informatica Partitioning when dealing with huge volumes of data and also partitioned the tables in Oracle for optimal performance.
Developed reconciliation scripts to validate the data loaded in the tables as part of unit testing (a sample reconciliation query is sketched at the end of this section).
Prepared SQL Queries to validate the data in both source and target databases.
Prepared scripts to email the records that did not satisfy the business rules (error records) to the business users who uploaded them.
Utilized Informatica IDQ 8.6.1 to complete initial data profiling and to match/remove duplicate data.
Experienced in profiling, analyzing, standardizing, cleansing, integrating, scorecarding, and managing reference data from various source systems using the Informatica Data Quality (IDQ) toolkit. Worked with Address Doctor and different algorithms (Bigram/Jaro/Edit/Hamming/Reverse distance) in IDQ.
Used Informatica data quality (IDQ) in cleaning and formatting customer master data.
Built logical data objects (LDO) and developed various mappings, Mapplet/rules using Informatica data quality (IDQ) based on requirements to profile, validate and cleanse the data.
Used the unstructured data option (PDF files, spreadsheets, Word documents, legacy formats, and print streams) to get normalized data using Informatica B2B Data Exchange.
Used Informatica B2B Data Exchange for structured data such as XML.
Prepared UNIX shell scripts to process file uploads, one of the sources of the data, through the different stages (Landing, Staging, and Target tables).
Worked on TOAD and Oracle SQL to develop queries and create procedures.
Knowledge in working with Informatica Power Exchange.
Automated ETLs to apply complex business rules and loaded the data into the core tables from various data sources like excel files, flat files, and SQL Server using Informatica.
Created the mapping specification, workflow specification, and operations guide for the Informatica projects and the MFT run book as part of end user training.
Experience working in agile methodology and ability to manage change effectively.
Assign work and responsible for providing technical expertise for the design and execution of ETL projects to onshore and offshore developers.
Environment: Informatica Power Center 8.6 (Repository Manager, Designer, Workflow Manager, Workflow Monitor), Informatica Power Exchange, SQL, SSIS, Oracle 11g, Flat Files, UNIX Shell Scripting.
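A hedged sketch of the kind of reconciliation query used during unit testing to compare source and target, assuming hypothetical SRC_ORDERS and DW_ORDER_FACT tables; the real scripts compared project-specific tables and measures.

```sql
-- Illustrative source-vs-target reconciliation by day: row counts and totals.
SELECT COALESCE(s.load_dt, t.load_dt) AS load_dt,
       s.src_rows, t.tgt_rows,
       s.src_amt,  t.tgt_amt,
       CASE WHEN s.src_rows = t.tgt_rows AND s.src_amt = t.tgt_amt
            THEN 'MATCH' ELSE 'MISMATCH' END AS recon_status
  FROM (SELECT TRUNC(order_dt) AS load_dt,
               COUNT(*)        AS src_rows,
               SUM(order_amt)  AS src_amt
          FROM src_orders
         GROUP BY TRUNC(order_dt)) s
  FULL OUTER JOIN
       (SELECT TRUNC(order_dt) AS load_dt,
               COUNT(*)        AS tgt_rows,
               SUM(order_amt)  AS tgt_amt
          FROM dw_order_fact
         GROUP BY TRUNC(order_dt)) t
    ON s.load_dt = t.load_dt
 ORDER BY 1;
```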
IT Elegant Stade Pvt. Ltd March 2011 – Jun 2012
Role: DW Engineer
IT Elegant Stade Pvt. Ltd is a multinational banking and financial services holding company, which is headquartered in Chennai, India. This project mainly involves integrating data from multiple source systems, enabling a central view across the enterprise and presenting the organization's information consistently.
Responsibilities:
Implement procedures to maintain, monitor, backup and recovery operations for ETL environment.
Conduct ETL optimization, troubleshooting and debugging.
Extensively used Informatica Designer to create and manipulate source and target definitions, mappings, mapplets, transformations, re-usable transformations.
Wrote complex SQL overrides for Source Qualifier and Lookup transformations in mappings (see the override sketch at the end of this section).
Planned, defined and designed data flow processes for data migration to the Data Warehouse using SSIS.
Export or Import data from other data sources like flat files using Import/Export through SSIS.
Created SSIS package to load data from Flat File to Data warehouse using Lookup, Derived Columns, Sort, Aggregate, Pivot Transformation and Slowly Changing Dimension.
Designed and developed validation scripts based on business rules to check the Quality of data loaded into EBS.
Created the Data Flow Diagrams for the full run and the reprocess partial run for the workflows to be created in Informatica taking into point the dependencies using Microsoft Visio.
Implemented best practices in ETL Design and development and ability to load data into highly normalized tables and star schemas.
Designed and developed mappings making use of transformations like Source Qualifier, Joiner, Update Strategy, Connected Lookup and unconnected Lookup, Rank, Expression, Router, Filter, Aggregator and Sequence Generator, Web services Consumer, XML Generator Transformations.
Wrote UNIX shell scripts for Informatica ETL tool to run the Sessions.
Stored reformatted data from relational, flat file, XML files using Informatica (ETL).
Developed mapping to load the data in slowly changing dimension.
Involved in Design Review, code review, test review, and gave valuable suggestions.
Worked with different Caches such as Index cache, Data cache, Lookup cache (Static, Dynamic and Persistence) and Join cache while developing the Mappings.
Worked on CDC (Change Data Capture) to implement SCD (Slowly Changing Dimensions) Type 1 and Type 2.
Responsible for offshore Code delivery and review process
Used Informatica to extract data from DB2, XML, and flat files and load the data into Teradata.
Prepared SQL Queries to validate the data in both source and target databases.
Extracted data from various data sources transformed and loaded into targets using Informatica.
Involved in creation of Informatica users and repository backup using Server Manager.
Generated reports for end client using various Query tools like Cognos.
Involved in creating report models and applying row level security filters for report model.
Conducted and participated in process improvement discussions, recommending possible outcomes and focusing on production application stability and enhancements.
Environment: Informatica Power Center 8.6, Oracle 9i, DB2, Sybase, Rapid Sql Server, SSIS, Erwin, UNIX.
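An illustrative Source Qualifier SQL override of the kind referenced in this section: an incremental extract joined to a reference table. The ACCOUNT and BRANCH tables and the $$LAST_RUN_DT mapping parameter are assumptions for the sketch, not the actual project objects.

```sql
-- Illustrative SQL override: incremental pull driven by a mapping parameter.
SELECT a.account_id,
       a.branch_id,
       b.branch_name,
       a.balance_amt,
       a.last_update_dt
  FROM account a
  JOIN branch  b
    ON b.branch_id = a.branch_id
 WHERE a.last_update_dt > TO_DATE('$$LAST_RUN_DT', 'YYYY-MM-DD HH24:MI:SS')
 ORDER BY a.account_id
```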
Kenexa Systems Pvt. Ltd, Hyderabad, India June 2008- Feb 2011
Role: ETL Developer
Kenexa Systems Pvt. Ltd provides the backbone Information Technology support to the ‘XL Capital group’ of companies. As part of providing financial solutions, XL Global Services Inc generates various reports presenting a comprehensive Credit and Risk analysis for its customers. The project was designed to develop and maintain Data Marts, uploading data from various centers, with the data held in different systems, using ETL tools.
Responsibilities:
Involved in Analysis, Design, Development, testing, and implementation of Informatica transformations and workflows for extracting data from multiple systems.
Performed logical and physical data modeling for the data warehouse database in a star schema.
Analyzed the requirements to identify the necessary tables that need to be populated into the staging database.
Prepared the DDL’s for the staging/work tables and coordinated with DBA for creating the development environment and data models.
Used the pmcmd command to start, stop, and ping the server from UNIX and created UNIX shell and Perl scripts to automate the process.
Refreshed reports using Scheduler.
Created Workflow, Worklets and Tasks to schedule the loads at required frequency using Workflow Manager.
Extensively worked with aggregate functions like AVG, MIN, MAX, FIRST, LAST, and COUNT in the Aggregator transformation (a SQL sketch of the equivalent aggregation appears at the end of this section).
Extensively used Normal Join, Full Outer Join, Detail Outer Join, and Master Outer Join in the Joiner transformation.
Applied caching optimization techniques in the Aggregator, Lookup, and Joiner transformations.
Extensive hands-on experience in schema design, XML import/export, scheduling, and system management using Data Analyzer.
Creating Test cases and detailed documentation for Unit Test, System, Integration Test and UAT to check the data quality.
Used Parameter Files defining variable values for mapping parameters and variables.
Created FTP scripts and Conversion scripts to convert data into flat files to be used for Informatica sessions.
Wrote SQL scripts and PL/SQL scripts to extract data from databases.
Extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.
Environment: Informatica Power Center 7.1, Oracle 8i, SQL Server 2000, PL/SQL, SSIS, TOAD, SQL*Plus, SQL*Loader, UNIX.
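A SQL sketch of the aggregation that the Aggregator transformation logic above corresponds to, assuming a hypothetical POLICY_TXN table; Informatica's FIRST/LAST functions have no single-function SQL equivalent, so MIN/MAX on the transaction date stand in for them here.

```sql
-- Illustrative aggregation, roughly equivalent to the Aggregator transformation.
SELECT policy_id,
       COUNT(*)      AS txn_count,
       MIN(txn_dt)   AS first_txn_dt,   -- stand-in for Informatica FIRST
       MAX(txn_dt)   AS last_txn_dt,    -- stand-in for Informatica LAST
       AVG(txn_amt)  AS avg_txn_amt,
       MIN(txn_amt)  AS min_txn_amt,
       MAX(txn_amt)  AS max_txn_amt
  FROM policy_txn
 GROUP BY policy_id;
```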