Data Manager

Location:
Frisco, Texas, United States
Posted:
January 30, 2018


Kishore M

682-***-****

Email: ac39kn@r.postjobfree.com

Summary

Over eight years of experience in Information Technology, including data warehousing, RDBMS, and client/server applications.

Over six years of data warehousing experience using Informatica PowerExchange, Informatica PowerCenter 9.x/8.x/7.x, and ETL concepts.

Experience in all the phases of Data warehouse life cycle involving Requirement Analysis, Design, Coding, Testing, and Deployment.

Involved in dimensional data modeling, creating fact tables and dimension tables.

Experienced in OLTP/OLAP system study, analysis and ER modeling.

Experienced in developing Database schemas like Star schema and Snowflake schema used in relational and multidimensional modeling.

Extensive experience in extraction, transformation and loading of data directly from various data sources like flat files, Excel, SQL Server, Oracle, DB2.

Extensively worked on Informatica Designer Components - Source Analyzer, Warehousing Designer, Transformations Developer, Mapplet and Mapping Designer.

Extensive experience in implementing CDC using Informatica Power Exchange 8.x/7.x

Extensively worked on the ETL mappings, analysis and documentation of OLAP reports requirements. Solid understanding of OLAP concepts and challenges, especially with large data sets.

Well versed in OLTP Data Modeling, Data warehousing, Datamart and OLAP concepts.

Experienced with complex mappings from varied transformation logics like Unconnected and Connected lookups, Router, Aggregator, Joiner, Update Strategy and re-usable transformations.

Strong Experience on Workflow Manager Tools - Task Developer, Workflow & Worklet Designer.

Working experience using Oracle 11g/10g/9i/8i, SQL Server 2008/2005/2000, PL/SQL, SQL*Loader, TOAD, ERWIN.

A good understanding and experience with Business Objects Reports.

Experience with Oracle Data Integrator, ensuring the right data is loaded.

Experience using different kinds of partitions for better performance and faster retrieval of data from the database.

Extensively used Partition by Range, Partition by Hash, Subpartitions, and List Partitions.
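As a hedged illustration of the range partitioning mentioned above, the sketch below writes out an Oracle DDL script; the table, column, and partition names are hypothetical examples, not from any particular project.

```shell
#!/bin/sh
# Sketch: generate an Oracle range-partition DDL script.
# Table, column, and partition names are hypothetical examples.
cat > create_sales_part.sql <<'EOF'
CREATE TABLE sales_fact (
  sale_id   NUMBER,
  sale_date DATE,
  region    VARCHAR2(10),
  amount    NUMBER
)
PARTITION BY RANGE (sale_date) (
  PARTITION p2017 VALUES LESS THAN (DATE '2018-01-01'),
  PARTITION p2018 VALUES LESS THAN (DATE '2019-01-01'),
  PARTITION pmax  VALUES LESS THAN (MAXVALUE)
);
EOF
echo "Wrote create_sales_part.sql"
```

Range partitioning on a date column like this lets the optimizer prune partitions when queries filter on the date, which is where the faster retrieval comes from.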

Extensive experience in writing stored procedures (PL/SQL), triggers, functions and packages.

Experienced in using Informatica command line utilities like PMCMD to control workflows in non-Windows environments.
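A minimal sketch of such a pmcmd wrapper is shown below; the service, domain, user, folder, and workflow names are all hypothetical, and the command is echoed as a dry run rather than executed, since it needs a live Informatica Integration Service.

```shell
#!/bin/sh
# Sketch: wrap pmcmd to start a workflow from UNIX.
# Service, domain, user, folder, and workflow names are hypothetical.
INFA_USER="etl_user"
INT_SERVICE="INT_SVC_DEV"
INFA_DOMAIN="DOM_DEV"
FOLDER="SALES_DW"
WORKFLOW="wf_daily_load"

# Build the pmcmd call; echo it (dry run) instead of executing,
# since it requires a reachable Integration Service.
CMD="pmcmd startworkflow -sv $INT_SERVICE -d $INFA_DOMAIN -u $INFA_USER -f $FOLDER -wait $WORKFLOW"
echo "$CMD"
```

The -wait flag makes the script block until the workflow finishes, so a scheduler can chain jobs on the exit status.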

Tuned performance of Informatica session for large data files by increasing block size, data cache size, sequence buffer length and target based commit interval.

Extensive use of Persistent cache to reduce session Processing time.

Experienced in creating Jobs, Alerts using SQL Mail Agent. Well versed in various high availability solutions like clustering, mirroring, log shipping.

Experienced in devising Disaster recovery strategies and effectively testing them. Experience in managing security, creating security policies and rules.

Moved data from external sources into Oracle databases using SQL*Loader.
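As a sketch of that SQL*Loader flow (the file, table, and connect-string names are hypothetical), a script might build a control file and invoke sqlldr; here the sqlldr command is only printed as a dry run, since it needs a reachable Oracle instance.

```shell
#!/bin/sh
# Sketch: build a SQL*Loader control file for a comma-delimited
# flat file; table and file names are hypothetical examples.
cat > customers.ctl <<'EOF'
LOAD DATA
INFILE 'customers.dat'
APPEND INTO TABLE stg_customers
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(cust_id, cust_name, cust_city)
EOF

# Dry run: print the sqlldr invocation rather than executing it,
# since it requires a live Oracle database.
echo "sqlldr userid=etl_user@ORCL control=customers.ctl log=customers.log"
```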

Experience with Bulk Insert and BCP utilities and scripts for data transformation and loading.

Experienced in implementing different types of Replication models like Snapshot, Merge and Transactional.

Experience of designing strategies to maintain audit tables, load balance, exception handling and high data volumes.

Good exposure to development, testing, debugging, implementation, documentation, end-user training and production support.

Experience with software development life cycle (SDLC) and project management methodologies.

An excellent team member with an ability to perform individually, good interpersonal relations, strong communication skills, and high level of motivation.

TECHNICAL SKILLS:

Languages

C++, Shell Scripting (K-Shell, C-Shell), PL/SQL, PERL, FORTRAN, JAVA (Eclipse IDE and Net Beans IDE), HTML.

Databases

Oracle 11g/10g/9i/8i/7.x, MS SQL Server 2008/2005/2000, Sybase 12.0/11.x/10, DB2 UDB 7.2, MySQL 5.0/4.1, MS Access 2003/2000/97/7.0. Editors: SQL Navigator, Toad

Operating Systems

UNIX, HP-UX, Sun Solaris 2.x/7/8, Windows 7/XP/2000/98, Windows NT, IBM AIX, Exceed (editor)

ETL

Informatica PowerCenter 9.x/8.x/7.x/6.x/5.x, PowerExchange 8.x/7.x, Metadata Manager, Informatica PowerMart 7.x/6.x

Web Technologies/ Tools

XML, XSL, XHTML, HTML, CSS, JavaScript, VBScript

Data Modeling

ERWIN 4.x/3.x, Ralph Kimball Methodology, Bill Inmon Methodology, Star Schema, Snowflake Schema, Extended Star Schema, Physical and Logical Modeling, Dimensional Data Modeling, Fact Tables, Dimension Tables, Normalization, Denormalization

Reporting Tools

Cognos 8.4/7.1/6.0 (Impromptu, Power Play, Transformer), MS SQL Server Reporting services 2005, Business Objects XI, Crystal Reports 10, Crystal Reports 2008, Oracle Reports 2.5.

EDUCATION:

Bachelor of Technology from Jawaharlal Nehru Technological University, India.

Vizient, Inc., Irving, Texas Jan 2017 - Present

Role: Informatica Developer

Vizient, Inc. operates a network of not-for-profit health care organizations to improve performance and efficiency in clinical, financial, and operational management. It offers analytics, contracting, consulting, and network development services to help members and customers achieve their strategic objectives. The company also provides supply chain management services for the non-acute care market as well as the government, education, and business sectors; revenue management services; and online direct marketing services for local contracting. It serves not-for-profit and non-acute health care organizations, and health system members and affiliates, in the United States.

Responsibilities:

Extracts data from flat files and Oracle databases and applies business logic to load it into the central Oracle database.

Develops mappings, reusable objects, transformations, and mapplets using Mapping Designer, Transformation Developer and Mapplet Designer in Informatica PowerCenter 9.1.

Experienced in OLTP/OLAP system study, analysis and ER modeling, developing database schemas like Star schema and Snowflake schema used in relational and multidimensional modeling

Upgraded Informatica from version 9.1 to 9.6.1.

Creates reusable transformations and Mapplets and uses them in mappings.

Uses Informatica PowerCenter 9.1 for extraction, transformation and loading (ETL) of data into the data warehouse.

Works with data modelers to prepare logical and physical data models and adds and deletes necessary fields using Erwin.

Implements Informatica recommendations, methodologies and best practices.

Implemented slowly changing dimensions to maintain both current and historical information in dimension tables.

Uses Informatica PowerCenter Workflow manager to create sessions and batches to run with logic embedded in the mappings.

Involved in performing data validation for various applications such as Salesforce, Annuities and Investments.

Creates folders, users, repositories, and deployment group using Repository Manager.

Works with different data sources such as Oracle, SQL Server, and Flat Files.

Creates complex mappings in PowerCenter Designer using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, Union, Lookup, Joiner, XML Source Qualifier and Stored Procedure transformations.

Uses transformations such as Router, Aggregator, Normalizer, Joiner, Expression, Lookup, Update Strategy, Sequence Generator and Stored Procedure.

Creates SSIS packages to migrate slowly changing dimensions.

Develops PL/SQL and UNIX shell/Perl scripts to schedule sessions in Informatica.

Creates e-mail notification tasks using post-session scripts.

Works with command line program PMCMD to interact with server to start/stop sessions and batches, stop Informatica server and recover the sessions.

Writes SQL, PL/SQL, stored procedures, triggers and cursors to implement business rules and transformations.

Creates procedures to drop and recreate indexes in the target data warehouse before and after sessions.

Creates deployment groups and migrates code into different environments.

Involved in migrating several ETL applications from Control-M to the Tivoli Workload Scheduler (TWS).

Writes documentation to describe program development, logic, coding, testing, changes and corrections.

Manages UNIX servers, defining file systems and directory structures and determining disk space and memory requirements.

Provides support to develop the entire warehouse architecture and plan the ETL process.

Re-designed ETL mappings to improve data quality and performance.

Implemented performance tuning of Sources, Targets, Mappings and Sessions by identifying bottlenecks and used Debugger to debug the complex mappings and fix them.

Documents technical design documents based on the BRD and SRS.

Environment: Informatica PowerCenter 9.x, SQL Server 2008, Oracle 11g/10g, DB2, PL/SQL, PowerExchange, TOAD, Windows XP, ERWIN 4.2, ServiceNow, UNIX, XML, TWS, Control-M.

American International Group, NJ Jul 2014 - Dec 2016

Role: Informatica Developer

AIG’s Enterprise Data Warehouse was developed to support the organization’s growing needs for BI reports. The Data Warehouse combines disparate data sources and presents users with multidimensional cubes that can be sliced and diced to view information quickly and efficiently.

Responsibilities:

Member of core ETL team involved in gathering requirements, performing source system analysis and development of ETL jobs to migrate data from the source to the target DW.

Analyzed the business requirement document and created functional requirement document mapping for all the business requirements.

Extensive use of Datamart and OLAP concepts.

Designed and developed ETL mappings using transformation logic to extract data from various source systems.

Developed complex ETL mappings making use of transformations like Source Qualifier, Joiner, Update Strategy, Connected Lookup and unconnected Lookup, Rank, Expression, Java, Java Expressions, Router, Filter, Aggregator and Sequence Generator transformations.

Automated the load process using UNIX shell scripts.

Involved in the data migration from Control-M to the TWS scheduler.

Involved in performing data validation for various applications such as Salesforce, Annuities and Investments.

Used features such as Workload Designer and Viewpoint Manager.

Used features such as Server Manager and the Repository Server.

Used heterogeneous sources including flat files, spreadsheets, and Oracle.

Created, monitored and troubleshot sessions for daily extracts.

Involved in tuning SQL statements to optimize performance.

Disabled the jobs scheduled through Control-M after two weeks of successful runs in TWS.

Used parallel processing capabilities, Session-Partitioning and Target Table partitioning utilities.

Created reusable objects in Informatica for easy maintainability and reusability.

Performed the data validations and control checks to ensure the data integrity and consistency.

Extensively used debugger to trace errors in the mapping.

Extensively involved in coding business rules through PL/SQL using functions, cursors and stored procedures.

Involved in developing test plans and test scripts to test the data based on the business requirements.

Created source, target, transformations, sessions, batches and defined schedules for the sessions.

Re-designed ETL mappings to improve data quality.

Developed standard and re-usable mappings and mapplets using various transformations like expression, aggregator, joiner, source qualifier, router, lookup Connected/Unconnected, and filter.

Created Workflows and used various tasks like Email, Timer, Scheduler, Control, Decision, and Session in the workflow manager.

Modified shell/Perl scripts as per business requirements.

Tuned Informatica mappings and sessions for optimum performance.

Implemented performance tuning of Sources, Targets, Mappings and Sessions by identifying bottlenecks and used Debugger to debug the complex mappings and fix them.

Partitioned sessions for concurrent loading of data into the target tables.

Prepared ETL mapping Documents for every mapping and Data Migration document for smooth transfer of project from development to testing environment and then to production environment.

Used Informatica features to implement Type I, II, and III changes in slowly changing dimension tables.
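The Type II pattern mentioned above can be sketched as plain SQL, written out by the script below; the table and column names are hypothetical, and in practice Informatica's Update Strategy transformations generate the equivalent insert/update logic rather than hand-written SQL.

```shell
#!/bin/sh
# Sketch: the SQL behind a Type II slowly changing dimension --
# expire the current row, then insert the new version.
# Table and column names are hypothetical examples.
cat > scd2_customer.sql <<'EOF'
-- Close out the current version when a tracked attribute changed
UPDATE dim_customer
   SET eff_end_date = SYSDATE,
       current_flag = 'N'
 WHERE cust_id = :cust_id
   AND current_flag = 'Y'
   AND cust_city <> :new_city;

-- Insert the new current version of the row
INSERT INTO dim_customer
  (cust_key, cust_id, cust_city, eff_start_date, eff_end_date, current_flag)
VALUES
  (dim_customer_seq.NEXTVAL, :cust_id, :new_city, SYSDATE, NULL, 'Y');
EOF
echo "Wrote scd2_customer.sql"
```

Type I would simply overwrite the attribute in place, and Type III would keep the prior value in a dedicated "previous" column instead of inserting a new row.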

Used Shell Scripting to automate the loading process.

Actively involved in production support. Implemented fixes/solutions to issues/tickets raised by user community.

Coordinated between different teams across the circle and organization to resolve release-related issues.

Environment: Informatica PowerCenter 8.6, Power Exchange 8.1, Oracle 10g, DB2, Query Analyzer, SQL*Plus, XML, flat files, Windows, IBM UNIX (AIX), TOAD.

Benefit Harbor LLC, Dallas Jan 2011 - Jun 2013

Informatica Developer

Benefit Harbor is an insurance company specializing in providing various personal, commercial and auto insurance products around the world. The main objective of this project is to deliver the data warehouse as a central repository, making data from all segments available for analysis and reporting. It focuses on providing information for analysis of new business growth. The Strategic Package entails capturing data from identified sources of Quote, Issue and Distributor data and loading it into a data warehouse: extracting the necessary data from various enterprise data sources; defining, standardizing and measuring its quality as it is loaded into the DWH; and then providing a common toolset for business users to measure results, solve problems, analyze, and take actions to build a world-class BI program.

Responsibilities:

Analyzed Business and Systems Requirements documents and other documents as a preparation for the creation of required System Design Documents.

Involved in Dimensional Data Modeling design and populating the Business rules into integrated data warehouse.

Created logical and physical data models for Business users to determine data definitions and establish referential integrity of the system.

Worked on fact tables and dimension tables.

Implemented Slowly Changing Dimensions using various transformations, mainly Update Strategy.

Used Informatica to build the data warehouse. Performed various activities related to Informatica Server Administration (Installing and configuring Informatica Server and Client, backup, restore, Data migration & Repository Promotion).

Worked on data migration between various repositories.

Extensively used and created Complex Mappings using Source Qualifier, Aggregator, Expression, Joiner, Lookup, Router, Filter, JAVA and Stored Procedure transformations.

Used Star schema for the data warehouse.

Created and Monitored Sessions using Informatica Power Center. Designed Sequential and concurrent workflows for whole process.

Migrated the objects between various repositories and maintained the repository.

Involved in working with heterogeneous data sources like flat files, DB2, Oracle, and SQL Server.

Used PMCMD command to start, stop and ping server from UNIX and created UNIX Shell scripts to automate the activities.

Tuned the SQL queries used in the SQL overrides of the Source Qualifiers.

Wrote SQL, PL/SQL, stored procedures, triggers and packages.

Used Partition by Range and List Partitions for better performance.

Constantly assisted Business Users in data validations for any data issues observed and mismatches.

Involved in unit testing and system testing to check whether the data loaded into the target, extracted from different source systems, is accurate according to the user requirements.

Performed Unit tests on the mappings.

Used SQL*Loader to load MS Excel data into Oracle.

Responsible for tuning ETL procedures and schemas to optimize load and query performance.

Worked with the Autosys scheduling team to schedule jobs daily and weekly based on feedback from Business Analysts.

Environment: Informatica PowerCenter 8.6.1, DB2 V2R3, SQL Server 2008, Oracle 11g/10g, Business Objects XIR2/6.5/5.0, PL/SQL, Power Exchange 8.1, TOAD, Windows XP, UNIX Maestro, ERWIN 4.2, Sun Solaris 10, Hyperion BRIO reporting, XML, Autosys.

Capital One, VA Jul 2008 - Dec 2010

Informatica Developer

The goal of the project is to provide clients the ability to view their account information online, including net positions (assets minus liabilities), assets and liabilities as of the previous day’s close, and account transactions (debits and credits) since the client’s last [official print] statement cutoff. The present work involved the design and development of a Data Warehouse of all the investment data and the returns on the investments made by the investment management system at Private Banking. Due to market fluctuations this investment data varies constantly and needs to be updated monthly, quarterly and yearly for customers to have an idea of the returns from their investments. Since most data is stored in heterogeneous sources like Sybase, Oracle, DB2, and COBOL flat files, Informatica PowerCenter 8.1 was used for migrating data to an Oracle database (target).

Responsibilities:

Involved in system analysis & design of the Enterprise data warehouse implementation, Requirements gathering and understanding the business flow.

Worked with BAs for requirements gathering, documentation, business analysis, testing, and project coordination.

Involved in the requirements definition and analysis in support of Data Warehousing efforts.

Created the mapping documents based on the data model and the client requirements.

Worked on Power Center client tools like Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformations Developer.

Developed Informatica mappings and also tuned them for better performance. Involved in designing the procedures for getting the data from all systems to Data Warehousing system.

Created Informatica mappings with PL/SQL Procedures/Functions to build business rules to load data. Used most of the transformations such as the Source qualifier, Aggregators, Lookups, Filters, Sequence and Update strategy.

Created pre- and post-session commands and shell scripts, which create the parameter file dynamically.
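A minimal sketch of such a pre-session script is shown below; the workflow name, parameter names, and paths are hypothetical examples of the kind of values generated at run time.

```shell
#!/bin/sh
# Sketch: build an Informatica parameter file dynamically before
# a session runs. Parameter names and paths are hypothetical.
RUN_DATE=$(date +%Y%m%d)
PARAM_FILE="wf_daily_load.par"

# Unescaped $RUN_DATE expands here; escaped \$\$ stays literal,
# matching Informatica's $$ mapping-parameter syntax.
cat > "$PARAM_FILE" <<EOF
[Global]
\$\$RUN_DATE=$RUN_DATE
\$\$SRC_DIR=/data/incoming
\$\$TGT_SCHEMA=EDW
EOF

echo "Wrote $PARAM_FILE"
```

Regenerating the file on each run lets the same workflow pick up a fresh load date and source directory without editing the session.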

Used ETL to extract and load data from Oracle, MS SQL Server and flat files to Oracle 10g. Wrote PL/SQL Program units for data extracting, transforming and loading. Created and ran Sessions to load the data into the Data warehouse components.

Developed and documented data Mappings/Transformations and Sessions.

Interacted with users, troubleshooting problems involving the development of stored procedures and triggers and problems related to privileges.

Responsible for tuning ETL procedures and schemas to optimize load and query performance.

Developed shell scripts for daily and weekly loads and coordinated with the scheduling team.

Involved in Unit testing, system testing to check whether the data loads into target are accurate, which was extracted from different source systems according to the user requirements.

Created Unit test cases and worked with system test team for QA Environment testing, test plan creation.

Worked with Informatica Admin team to migrate repository objects across different Environments.

Extensively used SQL*Plus for data validations and unit testing of applications.

Worked with Production support team and implemented fixes/solutions to issues/tickets raised by business users.

Worked with Business users to validate Reports.

Environment: Informatica PowerCenter 8.1.1, Oracle 10g, MS SQL Server 2005, DB2, MS Visio, Windows 2000, Toad 7.0, Business Objects 6.5/6.0, Erwin Designer 4.1, Hyperion BRIO reporting, Power Exchange 7.1

Bank of Baroda, INDIA Jun 2008 - Jul 2009

Role: ETL Developer

The bank offers a full suite of retail and commercial banking services, from checking, savings and loans for retail customers to private banking client services and treasury, corporate and trade finance for wholesale customers. This system consisted of New Depositor, Interest Warrant, Maturity, Premature & Renewal, and Enquire modules. The system was capable of calculating the annual percentage rate, interest period, and amount due on the maturity date. The banking maintenance system generated fixed deposit receipts, duplicate fixed deposit receipts, interest warrants, duplicate interest warrants, premature withdrawal receipts, maturity receipts and monthly statements.

Responsibilities:

Studied the project requirement and designed the project.

Created tables, views, synonyms and sequences.

Created mappings and mapplets for transforming the data.

Created database triggers, stored procedures, functions and packages.

Optimized queries using rule-based and cost-based approaches.

Used SQL*Loader for data loading.

Used various performance tuning techniques to improve the load.

Involved in Testing the Informatica mappings and creating QA Test Plans.

Modified existing forms, reports, and graphs as per the enhancement.

Tested all the new and modified program units.

Environment: Informatica 5.1, Oracle 8i, PL/SQL, UNIX, Microsoft Windows Server 2000.


