KAMALAKAR
Sr. ETL Informatica Developer
Email ID: ************@*****.***
Phone No: 818-***-****
SUMMARY:
* ***** ** ** ********** in Environment Setup, Design, Development and Implementation of Data Warehouses using Informatica PowerCenter (ETL).
Strong data warehousing experience using Informatica, with extensive experience designing Workflows, Worklets and Mappings, configuring the Informatica Server, and scheduling Workflows and Sessions using Informatica PowerCenter 9.x/8.x/7.x/6.x.
Good experience in all phases of software development life cycle (SDLC) including system design, development, integration, testing, deployment and delivery of applications.
Extensive Knowledge in architecture design of Extract, Transform and Load environment using Informatica Power Center.
Good knowledge in Data Modeling (physical and logical) using data modeling tools such as Erwin.
Good expertise in Data Analysis, Data Cleansing and Data Validation.
Experience in integration of various data sources like Oracle, Teradata, SQL Server, DB2, XML files, and fixed-width and delimited flat files.
Created complex and reusable mappings using Lookup, Joiner, Aggregator, Stored procedure, Update Strategy and various other transformations.
Experience in Performance Tuning of sources, targets, mappings, transformations and sessions.
Good working knowledge of Teradata utilities such as FastLoad (FLOAD), MultiLoad (MLOAD), BTEQ and FastExport.
Involved in Data Profiling using Informatica Data Quality (IDQ).
Hands-on experience in Star and Snowflake schema design.
Successfully implemented Slowly Changing Dimensions (SCD Type 1/2/3); a sketch of the Type 2 load logic appears at the end of this summary.
Expertise in identifying and designing facts and dimensions.
Expertise in testing data warehousing strategies for ETL mechanisms developed in Informatica.
Extensive experience in UNIX shell scripting and job scheduling using Control-M and AutoSys.
Good experience in writing SQL Queries and creating stored procedures, PL/SQL Packages, Triggers and Functions. Strong knowledge in Oracle cursor management and exception handling.
Expertise in databases like Teradata, Oracle 9i, 10g, SQL Server, MySQL and DB2.
Expertise in TOAD (Tool for Oracle Application Developers) and PVCS (version control).
Experience working on different operating systems like Windows 7/Vista/XP/2003/2000 and UNIX.
Good understanding in database and data warehousing concepts (OLTP & OLAP).
Skilled in applying various forms of testing to ensure product quality including: Unit Testing, Integration Testing and Quality Assurance Testing.
Experience interacting with business users to analyze business process requirements and translate them into documented, designed and delivered solutions.
Excellent organizational skills and ability to prioritize workload.
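A minimal sketch of the SCD Type 2 logic referenced above, written as SQL run from a shell wrapper through SQL*Plus. The CUSTOMER_DIM and CUSTOMER_STG tables and their columns are illustrative placeholders, not from any specific engagement:

    #!/bin/ksh
    # SCD Type 2 sketch: expire changed rows, then insert new versions.
    # Table, column and connection names are hypothetical placeholders.
    sqlplus -s "$DB_USER/$DB_PASS@$DB_TNS" <<'EOF'
    -- Close out current dimension rows whose source attributes changed
    UPDATE customer_dim d
       SET d.eff_end_date = SYSDATE, d.current_flag = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM customer_stg s
                    WHERE s.customer_id = d.customer_id
                      AND (s.name <> d.name OR s.city <> d.city));

    -- Insert a new current version for every changed or brand-new customer
    INSERT INTO customer_dim
          (customer_key, customer_id, name, city,
           eff_start_date, eff_end_date, current_flag)
    SELECT customer_dim_seq.NEXTVAL, s.customer_id, s.name, s.city,
           SYSDATE, TO_DATE('9999-12-31','YYYY-MM-DD'), 'Y'
      FROM customer_stg s
     WHERE NOT EXISTS (SELECT 1
                         FROM customer_dim d
                        WHERE d.customer_id = s.customer_id
                          AND d.current_flag = 'Y');

    COMMIT;
    EOF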
TECHNICAL SUMMARY:
ETL Tools: Informatica PowerCenter 9.5/9.1/8.6.1/8.5/8.1/7.x/6.x, Source Analyzer, Mapping Designer, Workflow Monitor, Workflow Manager, Data Cleansing, Data Quality, Informatica Analyst 9.1, Informatica Developer 9.1, Repository, Metadata, Data Mart, OLAP, OLTP, Web Services, PowerMart 5.1/4.7.2, Informatica Data Analyzer 9.1, PowerExchange 8.
Data modeling tools: Erwin 7.x/4.2/3.x, Microsoft Visio
RDBMS: Oracle 11g/10g/9i/8i, IBM DB2, MS SQL Server, Teradata, Netezza 6.0.4, MS Access
Web Tools: XML, Java Script
Other tools & utilities: SQL*Plus, TOAD, AutoSys
Languages: SQL, PL/SQL, C and C++
Scripting languages: UNIX Shell Scripting, Perl Scripting
Operating systems: Windows 2000/2003/NT/XP, UNIX (Sun Solaris, AIX), LINUX
EDUCATION:
Bachelor of Technology, JNTU, India.
PROFESSIONAL EXPERIENCE:
Client: Walmart, Bentonville, AR
Sep 2014 – Present
Role: Sr. ETL Informatica Developer
Responsibilities:
Designed complex mappings by extensively using Informatica transformations.
Worked with RDBMS sources to extract data and load it into different target databases.
Wrote UNIX shell scripts for running Informatica workflows and sessions (pmcmd commands), file manipulation, housekeeping functions and FTP transfers; a minimal pmcmd wrapper sketch appears after this list.
Experience in UNIX shell scripting and Job scheduling using AutoSys.
Used Informatica IDQ to qualify data content, and MDM to filter duplicate data, deploy the project and manage metadata.
Wrote PL/SQL packages, stored procedures and functions using PL/SQL features such as collections, objects, object tables, nested tables, external tables, MERGE, INTERSECT, MINUS, BULK COLLECT INTO and dynamic SQL.
Extracted data from various sources such as DB2, Teradata, flat files, Oracle and XML; transformed and loaded the data into the target database using Informatica PowerCenter.
Upgraded Informatica 9.5.1 Standard Edition to Informatica 9.6.0 Advanced Edition across environments without any defects or issues. Configured the Repository Service, Integration Service, Web Services Hub, Model Repository Service, Data Integration Service, Analyst Service, Content Management Service and Metadata Manager Service.
Developed database monitoring and data validation reports in SQL Server Reporting Service (SSRS).
Configured and created mailboxes (MFT objects) with the proper user ID and password.
Built data movement processes that load data from databases and files into Teradata by developing Korn shell scripts using Teradata SQL and utilities such as BTEQ, FastLoad, FastExport and MultiLoad (see the BTEQ sketch after this list).
Developed mappings to process files from file partners and to extract data from Oracle, MS SQL Server, MySQL, MS Access and XML and load it into the Oracle warehouse.
Recommended Best Practice for ETL coding standards, ETL naming standards.
Cleansed and standardized data using Informatica IDQ transformations such as Standardizer, Parser, Match, Merge and Consolidation, and loaded the results into stage tables.
Performed the code migrations (Deployment Group and Import/export).
Coordinated with Oracle DBA and UNIX admins on Oracle upgrade and server related issues respectively.
Used the Teradata utilities BTEQ, FastLoad, TPump and MultiLoad for loading bulk data.
Created the users and user groups as per the request and customized groups for migration verification.
Created ETL connection strings and folders, and provided the appropriate privileges to users.
Handled application release activities and production release for the application users.
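A minimal sketch of the kind of pmcmd wrapper script described above; the domain, service, folder, workflow and credential names are hypothetical placeholders:

    #!/bin/ksh
    # Hypothetical pmcmd wrapper: start a workflow and wait for completion.
    INFA_USER="etl_user"          # placeholder account
    INFA_PWD="$ETL_PWD"           # password taken from the environment
    pmcmd startworkflow \
      -sv INT_SVC_DEV -d DOMAIN_DEV \
      -u "$INFA_USER" -p "$INFA_PWD" \
      -f WALMART_EDW -wait wf_load_daily_sales
    RC=$?
    if [ $RC -ne 0 ]; then
      echo "Workflow wf_load_daily_sales failed with return code $RC" >&2
      exit $RC
    fi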
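A sketch of a Korn shell step that runs BTEQ against Teradata, as described above; the logon details and table names are placeholders, not from any actual system:

    #!/bin/ksh
    # Hypothetical BTEQ step: move staged rows into a target table.
    bteq <<'EOF'
    .LOGON tdprod/etl_user,etl_password;
    INSERT INTO edw.daily_sales
    SELECT * FROM stg.daily_sales_stg;
    .IF ERRORCODE <> 0 THEN .QUIT 8;
    .LOGOFF;
    .QUIT 0;
    EOF
    [ $? -ne 0 ] && { echo "BTEQ load failed" >&2; exit 1; }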
Environment: Informatica 9.6.0 & 9.5.1, Windows XP (client), Informatica Data Quality 9.6.0 & 9.5.1, Oracle 11g, UNIX, MS SQL Server, MDM, Teradata, AutoSys, MS Access, PL/SQL, MySQL and flat files
Client: Xerox, Los Angeles, CA
May 2013 – Aug 2014
Role: Sr. ETL Informatica Developer
Responsibilities:
Involved in design of database and created Data marts extensively using Star Schema.
Worked extensively on SQL, PL/SQL, and UNIX shell scripting.
Involved in implementing the data integrity validation checks through constraints and triggers.
Involved in developing packages for implementing business logic through procedures and functions.
Responsible for developing complex Informatica mappings using different types of transformations like UNION transformation, Connected and Unconnected LOOKUP transformations, Router, Filter, Aggregator, Expression and Update strategy transformations for Large volumes of Data.
Extensively involved in application tuning, SQL tuning, memory tuning and I/O tuning using Explain Plan and SQL trace facilities.
Created high-level and low-level functional and technical specification documents for application development.
Identified and eliminated duplicates in datasets through the IDQ 9.x Edit Distance and Mixed Field Matcher components, enabling the creation of a single view of the customer.
Worked the full project life cycle from analysis to production implementation, with emphasis on identifying sources and validating source data, developing logic and transformations per requirements, and creating mappings to load the data into different targets.
Developed shell scripts for job automation in the UNIX environment to avoid manual intervention.
Wrote UNIX shell scripts for the FTP utility program used to transfer files.
Created scripts to schedule the processing of inbound feeds and outbound extracts using AutoSys (a sample JIL definition appears after this list).
Performed dimensional data modeling: Star and Snowflake schemas, fact and dimension tables, and physical and logical data models.
Used the Teradata utilities BTEQ, FastLoad, TPump and MultiLoad for loading bulk data.
Enhanced query performance through optimization techniques such as index creation, table partitioning and stored procedures.
Used TOAD to run SQL queries and validate the data in the warehouse.
Performed load and integration tests on all programs created and applied version control procedures to ensure that programs are properly implemented in production.
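An illustrative AutoSys JIL definition for scheduling an inbound feed, loaded through the jil utility; the job, host, owner and script names are hypothetical, and the exact job_type keyword may vary by AutoSys version:

    #!/bin/ksh
    # Hypothetical JIL definition piped into the AutoSys jil utility.
    jil <<'EOF'
    insert_job: XRX_INBOUND_FEED   job_type: CMD
    command: /apps/etl/scripts/run_inbound_feed.ksh
    machine: etlhost01
    owner: etluser
    start_times: "02:00"
    std_out_file: /apps/etl/logs/inbound_feed.out
    std_err_file: /apps/etl/logs/inbound_feed.err
    alarm_if_fail: 1
    EOF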
Environment: Informatica PowerCenter 9.1, Windows XP (client), Informatica Data Quality (IDQ), PowerDesigner, PowerExchange, workflows, ETL, flat files, Oracle 11g, Teradata, XML, SSRS, web services, MS SQL Server, TOAD 9.6.1, UNIX shell scripts, AutoSys
Client: CIGNA Healthcare, Denver, CO
Feb 2012 - April 2013
Role: Informatica Developer
Responsibilities:
Analysis of requirements and finding the gap between use cases, User interface documents and ETL specifications.
Creation of mappings according to the use cases by following the ETL specifications.
Creation of Informatica mappings to build business rules for loading data, using transformations like Source Qualifier, Expression, Lookup, Filter, Joiner, Union, Aggregator, Sorter, Sequence Generator, Router, Update Strategy and Stored Procedure.
Extracted data from different types of flat files and from Oracle.
Implemented Reusable transformations, mappings, User Defined Functions, Sessions.
Created and tested Power Exchange Data maps for Oracle CDC capture.
Involved in identifying the Power Exchange Oracle CDC limitations and supplementary solutions.
Creation of Unit, Functional, Integration and System test cases based on requirement specification documents, use case documents, the PDM and user interface specifications.
Involved in Unit, Functional, Integration and System testing and prepared review documents for the same.
Scheduled jobs in Control-M using UNIX scripts and developed PL/SQL stored procedures.
Involved in loading all Batch Scheduling tables with the help of UNIX script.
Maintained all documents in ClearCase and CVS for version control.
Created User defined functions (UDF) to reuse the logic in different mappings.
Created and tested shell scripts to automate Job scheduling by using commands like pmcmd.
Investigated and analyzed incoming data from the various source systems, documented data anomalies and generated data quality reports.
Developed and tested stored procedures, functions and packages in PL/SQL.
Performed database testing, writing complex SQL queries to verify transactions and business logic, such as identifying duplicate rows using SQL Developer and PL/SQL Developer (see the sketch below).
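A sketch of the kind of duplicate-row check used during database testing, run through SQL*Plus; the table and key columns are illustrative placeholders:

    #!/bin/ksh
    # Hypothetical duplicate-row check: any row returned is a duplicate business key.
    sqlplus -s "$DB_USER/$DB_PASS@$DB_TNS" <<'EOF'
    SELECT claim_id, member_id, COUNT(*) AS dup_count
      FROM claims_stg
     GROUP BY claim_id, member_id
    HAVING COUNT(*) > 1;
    EOF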
Environment: Informatica PowerCenter 9.0.1, Oracle 10g/9i, Tidal Scheduler, Informatica servers on UNIX, Netezza, PL/SQL Developer.
Client: Johnson & Johnson, Princeton, NJ
Sep 2010 - Jan 2012
Role: Informatica Developer
Responsibilities:
Gathered user requirements and discussed them with business analysts to acquire functional and technical specifications.
Analyzed the source data coming from Oracle ERP system to create the Source to Target Data Mapping.
Interacted with the Data Architect in Creating and modifying Logical and Physical data model.
Used Informatica Repository Manager to create Repositories and Users and to give permissions to users.
Worked with DBAs to optimize major SQL queries in the process of performance tuning.
Generated reports with parameters, sub reports, cross tabs, charts using Crystal Reports.
Responsible for identifying the bottlenecks and tuning the performance of the Informatica mappings/sessions.
Used parallel processing capabilities, Session-Partitioning and Target Table partitioning utilities.
Designed and developed mapping using various transformations like Source Qualifier, Sequence Generator, Expression, Lookup, Aggregator, Router, Rank, Filter, Update Strategy and Stored Procedure.
Worked with Tidal Enterprise Scheduler to schedule jobs and batch processes.
Worked extensively with Informatica PowerExchange to extract data from mainframes.
Used Autosys and Informatica Scheduler to schedule jobs for the files and other sources to be extracted and load to target EDW on a daily/weekly/monthly basis.
Created Unit Test plans and involved in primary unit testing of mappings.
Created UNIX shell scripts for Informatica pre- and post-session operations (a sample appears after this list).
Provided data loading, monitoring, system support and general troubleshooting for all workflows in the application during its production support phase.
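A minimal sketch of a post-session housekeeping script of the kind described above; the directory and file names are hypothetical placeholders:

    #!/bin/ksh
    # Hypothetical post-session step: archive the processed source file
    # and write an indicator file for the downstream job.
    SRC_DIR=/apps/etl/inbound
    ARC_DIR=/apps/etl/archive
    FILE="$SRC_DIR/sales_extract.dat"
    if [ -f "$FILE" ]; then
      mv "$FILE" "$ARC_DIR/sales_extract.$(date +%Y%m%d).dat"
      touch "$ARC_DIR/sales_extract.done"
    else
      echo "Expected source file not found: $FILE" >&2
      exit 1
    fi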
Environment: Informatica PowerCenter 7.4/8.5, Oracle 10g, Teradata, Windows 2000, FastExport, flat files, TOAD 8.6, UNIX.
Client: WebSoft Technologies, India
August 2009 – July 2010
Role: Informatica Developer
Responsibilities:
Analyzed business requirements and worked closely with various application teams and business teams to develop ETL procedures that are consistent across all applications and system.
Wrote Informatica ETL design documents, established ETL coding standards and performed Informatica mapping reviews.
Extensively worked on Power Center Client Tools like Repository Admin Console, Repository Manager, Designer, Workflow Manager, and Workflow Monitor.
Extensively worked on Power Center Designer client tools like Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformation Developer.
Analyzed the source data coming from different sources (Oracle, DB2, XML, Flat files) and worked on developing ETL mappings.
Installed Informatica PowerExchange.
Developed complex Informatica Mappings, reusable Mapplets and Transformations for different types of tests in research studies on daily and monthly basis.
Implemented mapping level optimization with best route possible without compromising with business requirements.
Used Teradata FastLoad for truncate-and-load tables and MultiLoad for insert, update and upsert operations (a sample FastLoad script appears after this list).
Created Sessions, reusable worklets and workflows in Workflow Manager and Scheduled workflows and sessions at specified frequency.
Responsible for the Performance tuning at the Source Level, Target Level, Mapping Level and Session Level.
Worked extensively on SQL, PL/SQL, and UNIX shell scripting.
Generated XML files to deliver to Thomson Reuters.
Performed Data profiling for data quality purposes.
Maintained accountability through professional documentation and weekly status reports.
Performed Quantitative and Qualitative Data Testing.
Documented flowcharts for the ETL (Extract, Transform and Load) data flow using Microsoft Visio, created metadata documents for the reports and mappings developed, and wrote unit test scenario documentation for the same.
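A sketch of a Teradata FastLoad run for a truncate-and-load stage table, as mentioned above; the logon, table and file names are placeholders, and the exact command layout is one common pattern rather than the only valid one:

    #!/bin/ksh
    # Hypothetical FastLoad of a pipe-delimited flat file into an emptied stage table.
    fastload <<'EOF'
    LOGON tdprod/etl_user,etl_password;
    DROP TABLE stg.study_results_err1;
    DROP TABLE stg.study_results_err2;
    DELETE FROM stg.study_results;   /* FastLoad requires an empty target */
    SET RECORD VARTEXT "|";
    DEFINE study_id  (VARCHAR(20)),
           result_cd (VARCHAR(10)),
           result_dt (VARCHAR(10))
    FILE = /apps/etl/inbound/study_results.dat;
    BEGIN LOADING stg.study_results
          ERRORFILES stg.study_results_err1, stg.study_results_err2;
    INSERT INTO stg.study_results
    VALUES (:study_id, :result_cd, :result_dt);
    END LOADING;
    LOGOFF;
    EOF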
Environment: PowerCenter 7.4, DB2, Oracle 10g, UNIX, Windows XP Pro, TOAD, AutoSys, Cognos 7.2
Client: ISpace Global, India
June 2008- July 2009
Role: ETL Consultant
Responsibilities:
Worked with various transformations in Informatica, such as Lookup, Update Strategy, Stored Procedure, Joiner, Filter, Aggregator, Rank, Router and XML Source Qualifier, to bring in data from databases such as DB2 UDB, Sybase and Oracle.
Created mapplets to reduce development time and mapping complexity and to improve maintainability.
Reviewed, analyzed and modified programs, including coding, testing, debugging and documentation.
Developed Complex mappings by extensively using Informatica Transformations.
Developed reusable transformations and aggregations, and created target mappings that contain business rules.
Built, ran and scheduled workflows and worklets using the Workflow Manager.
Involved in the phases of unit and system testing.
Extensively used Informatica debugger to validate mappings and to gain troubleshooting information about data and error conditions.
Worked on performance tuning by creating indexes, using hints and explain plans, and analyzing the database.
Ran SQL scripts from TOAD and created Oracle objects such as tables, views, indexes, sequences and synonyms (an illustrative DDL script appears after this list).
Worked on a combination of relational and dimensional data modeling.
Worked with the offshore team, helping them understand the requirements and coordinating with them on maintaining the different databases.
Extensively used UNIX shell scripts and called the shell script in the workflow manger through command task.
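An illustrative set of Oracle object definitions of the kind run from TOAD or SQL*Plus; all object and connection names are hypothetical placeholders:

    #!/bin/ksh
    # Hypothetical Oracle DDL: table, index, sequence and synonym.
    sqlplus -s "$DB_USER/$DB_PASS@$DB_TNS" <<'EOF'
    CREATE TABLE etl_audit (
      audit_id   NUMBER        PRIMARY KEY,
      job_name   VARCHAR2(60)  NOT NULL,
      run_date   DATE          DEFAULT SYSDATE,
      row_count  NUMBER
    );
    CREATE INDEX etl_audit_ix1 ON etl_audit (job_name, run_date);
    CREATE SEQUENCE etl_audit_seq START WITH 1 INCREMENT BY 1;
    CREATE SYNONYM audit_log FOR etl_audit;
    EOF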
Environment: Informatica PowerCenter 7/8.1.1, Business Objects, MS SQL Server 2000, DB2 UDB, TOAD, Oracle 9i, UNIX, Windows 2000, AutoSys, SQL*Loader.