Anil Kumar
(Senior Certified ETL Informatica Developer)
(Contact: 732-***-**** - email: **************@*****.***)
Profile:
* ***** ** ***** ** experience
* ***** ** *** **** Informatica experience
9 years of Oracle experience
Summary:
Nine (9) Years of IT experience in the Analysis, Design, Development, Testing and Implementation of business application systems for Financial, Insurance, Banking, Manufacturing and Telecom Sectors.
Nine (9) Years of strong Data Warehousing ETL experience using Informatica Power Center Designer 9.0.1/8.6/8.5 (Source Analyzer, Repository Manager, Data Warehousing Designer, Mapping Designer, Mapplets, Transformations, Workflow Manager, Task Manager), Power Connect, Power Analyzer, Autosys 4.5, Control M, TWS (Tivoli Workload Scheduler), Maestro, OLAP, OLTP, MOLAP, ROLAP, ETL, ETI, Data marts, Data mining.
Dimensional data experience: Star/Snowflake schemas, FACT & dimension tables, and physical & logical data modeling. Hands-on experience in Star and Snowflake schema design in relational, dimensional and multidimensional modeling and de-normalization techniques. Thorough understanding of Kimball and Inmon methodologies.
Database experience using Oracle 11g/10g/9i/8i/7.x, Sybase, IBM DB2 UDB 8.0/7.0, MS SQL Server 6.5/7.0/2000, MS Access 7.0/2000, flat files, SQL*Plus, SQL*Loader and Developer 2000, on Windows 3.x/95/98/2000, Win NT 4.0, Unix, AIX and Sun Solaris 2.x.
Worked with the testing team to create test cases for unit and integration testing. Expertise in WinSQL and TOAD (Tool for Oracle Application Developers).
Experience with report and dashboard development using tools like Pentaho Report Designer and Pentaho CTools.
Implemented Informatica MDM, including data profiling, configuration specification, coding, match-rule tuning and migration.
Experience in transforming unstructured data from various data sources using SSIS transformations such as Conditional Split, Lookup, Merge Join and Sort.
Knowledge of importing the ORS data source into the application server environment for Informatica MDM Hub.
Experience in debugging the SSIS packages using breakpoints and checkpoints.
Extensive experience in leading offshore team.
Extensively worked on Actimize Reports.
Proficient in Analyzing Business processes requirements and translating them into technical requirements.
Upgraded Informatica v7.1.3 to v8.6.1 by applying hotfix patches.
Expertise in User Acceptance Testing in consultation with SMEs.
Extracted data from various source systems.
Excellent understanding of client/server architecture, combined with excellent logical, analytical, communication and organizational skills.
Excellent team player and can work on both development and maintenance phases of the project.
Highly skilled in providing prompt, workable solutions to business problems.
Efficient team player with excellent communication skills and good interaction with users.
Education & certifications
Bachelor of Computer Science, Osmania University, India (2007)
Brainbench certified Informatica Professional.
Technical skills:
Data Warehousing
Informatica Power Center 9.0.1/8.6/8.5 (Source Analyzer, Repository Manager, Data Warehousing Designer, Mapping Designer, Mapplets, Transformations, Workflow Manager, Task Manager), MS SQL Server 2000 Analysis Services, Autosys, Control M, TWS (Tivoli Workload Scheduler), Maestro, OLAP, OLTP, MOLAP, ROLAP, ETL, ETI, Data marts, Data mining, Data Cleansing, Data Profiling, Data Lineage.
Data Modeling
Star/Snowflake schema modeling, FACT & dimension tables, physical & logical data modeling, relational, dimensional and multidimensional modeling, de-normalization techniques, Kimball & Inmon methodologies.
Business Intelligence
Tools/Utilities
SQL*Loader, TOAD 7.x/8.x/9.x, WinSQL, SQL Navigator, UltraEdit-32, WinSCP, Harvest, TFS, MS Project 2000, Microsoft Visio, MS SourceSafe, Brio 6.2, ERP, WinRunner, LoadRunner, Test Director.
Databases
Oracle 11g/10g/9i/8i/7.x, MS Access 7.0/2000, flat files, SQL Server 2005.
Languages/Web
SQL*Plus, HTML, ASP, Java, Perl, Unix Korn shell scripting, MS Excel, PowerBuilder, XML, XSLT, Angular.
Environment
HP-UX 10.20/11, AIX 4.5, Sun Solaris 2.x, UNIX SVR 4.0, Red Hat Linux 7.2, SCO UNIX, MS Windows, Windows NT.
Professional Experience:
Client: Ally Financial, Troy, MI & Mt. Laurel, NJ
Duration: August 2015 – Present
Job Title: Senior ETL Informatica Programmer and Analyst
Description:
Working as an ETL Developer with Ally Financial on the Ally Auto Advantage program. The project, named Reporting and Data Services (RADS), uses the Agile methodology. Its main goal was to analyze and represent the needs of the entire AF community for quality reporting data from the consolidated Ally Finance data marts and Auto Finance retail. Data comes from source systems, passes through net change to staging, has conversions applied in the operational layer, and is loaded into various data marts, then transformed and loaded into a centralized data warehouse for various strategic business reports. The company's data comes from different source systems such as Oracle, DB2-UDB, SQL Server, flat files and XML, and is loaded into an Oracle target database using Informatica Power Center.
Responsibilities:
Analyzed business needs and implemented them in a functional database design.
Extracted data from the Sales department to flat files and loaded the data into the target database.
Extensively used IDQ (Informatica Data Quality), Informatica Real Time suite for Data Cleansing and Data Analysis purposes and maintained warehouse metadata, naming standards and warehouse standards using Metadata Manager for future application development.
Upgraded Informatica v7.1.3 to v8.6.1 by applying hotfix patches.
Translated high-level design specifications into simple ETL coding and mapping standards.
Developed Pentaho ETL jobs to load data from different sources to the ODS.
Worked on Informatica Power Center real time tool - Source Analyzer, Data warehousing designer, Mapping & Mapplet Designer and Transformation Designer.
Developing processes using C# and Visual Studio 2012.
Developed IDQ, IDE plans for Data Consistency and Data Migration purposes.
Using IDQ (Informatica Data Quality) identified relationships between data records to eliminate duplicates before consolidation.
Involved in integrating different Database Applications using SSIS packages.
Created SSIS packages for migration from legacy systems to a centralized database.
Involved in IDE (Informatica Data Explorer) Profiling for cleansing the migration data.
Developed new and maintained existing Informatica mappings and workflows based on specifications.
Used Informatica power exchange CDC (change data capture) option to capture the updates to the existing data.
Defined, configured and optimized various MDM processes including landing, staging, base objects, foreign-key relationships, lookups, query groups, queries, custom queries, cleanse functions, batch groups and packages using the Informatica MDM Hub console.
Experience in Installing Informatica MDM HUB server/Cleanse Server.
Experience with MDM Hub configurations - Data modeling & Data Mappings, Data validation, cleansing, Match and Merge rules.
Developed, implemented and maintained unit tests and user acceptance testing (UAT) for applications.
Leading the design of new or changing mappings and workflows and producing requisite specifications.
Configured Pentaho (reporting tool) and installed it on WebLogic 10.0.
Extensively used explain plans to get statistical data for performance tuning of Oracle SQL queries and PL/SQL stored procedures.
Developed PL/SQL procedures to drop indexes, build indexes, truncate tables, analyze tables and trigger the BO reports.
Participated in problem solving and troubleshooting for the applications implemented with Informatica.
Developed UNIX Korn shell scripts for archiving, zipping and cleaning up bad files and data from logs.
Involved in administrative tasks within Informatica platform such as stopping/starting Informatica Repository and Server processes, performing code migration, backup and restore of Informatica Repository, and security administration.
Participated in cross-functional efforts to support other teams - such as ETL and database tuning to support Business Objects reporting performance
Sharing knowledge by effectively documenting work.
Maintained expert level of knowledge in area of expertise.
Worked with the team to ensure the quality of coordinated business functions.
Responding quickly and effectively to production issues and taking responsibility for seeing those issues through to resolution.
Responsible for Lead Role and Mentoring Junior Developers.
Environment:
Informatica Power Center 8.6.1/7.13, Informatica Power Exchange (Change Data Capture), Informatica Real time suite, Oracle 10g, DB2-UDB, Toad, C++, SQL, MDM, PL/SQL, IDQ, Pentaho, SSIS, IDE, Visio, Business Objects XI, Winsql, Sql Navigator, AIX 5.2, Harvest, Autosys, Unix Korn Shell scripting, HP Quality Center, Windows XP.
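The Korn-shell log housekeeping described above can be sketched as a minimal script; the directory layout, file patterns, and retention windows are assumptions for illustration, not project specifics:

```shell
#!/bin/ksh
# Sketch of a log archive/cleanup script.
# LOG_DIR, the 7-day and 30-day windows, and the file patterns are assumed.
LOG_DIR=${1:-/tmp/infa_logs_demo}
ARCHIVE_DIR="$LOG_DIR/archive"

mkdir -p "$ARCHIVE_DIR"

# Compress session/workflow logs older than 7 days into the archive area.
find "$LOG_DIR" -maxdepth 1 -type f -name '*.log' -mtime +7 |
while read -r f; do
    gzip -c "$f" > "$ARCHIVE_DIR/$(basename "$f").gz" && rm -f "$f"
done

# Purge *.bad reject files older than 30 days.
find "$LOG_DIR" -maxdepth 1 -type f -name '*.bad' -mtime +30 -exec rm -f {} +

echo "cleanup complete: $LOG_DIR"
```

A script like this would typically run from cron or as a post-session command, keeping the Informatica log directory from filling the filesystem.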
Client: Medtronic, Minneapolis, MN
Duration: Sep 2013 – July 2015
Job Title: ETL Informatica Designer, Programmer and Analyst
Description:
The project involves planning, development, implementation, maintenance and support of BI & ETL applications for the enterprise-wide Data Warehouse (EDW) section. Involved in the data-warehousing project, which consolidated the company's data from different operational sources and data marts, then transformed and loaded it into a centralized data warehouse for various strategic business reports.
The company's data comes from different source systems such as Oracle, DB2-UDB, SQL Server, flat files and XML, and is then loaded into an Oracle target database using Informatica Power Center. Various business rules and business processes were applied to extract, transform and load the data into the data warehouse.
Responsibilities:
Involved in Analysis, Requirements Gathering and documenting Functional & Technical specifications.
Worked with Financial end users to gather the Functional Specifications.
Involved in analyzing existing logical and physical data modeling with STAR and SNOWFLAKE schema techniques using Erwin in Data warehouse.
Partitioned tables which have frequent inserts, deletes and updates to reduce the contention and to improve the performance.
Maintained development, test and production repositories using Repository Manager. Also used Repository Manager and Metadata Manager to maintain metadata, security, backups and locks.
Designed the Data Warehousing ETL procedures for extracting the data from all source systems to the target system.
Extensively used Transformations like Router, Lookup (connected and unconnected), Update Strategy, Source Qualifier, Joiner, Expression, Stored Procedures, Aggregator and Sequence generator Transformations.
Worked on IDQ file configuration at user’s machines and resolved the issues.
Worked extensively with dynamic cache with the connected lookup Transformations.
Designed and optimized the Mapping to load the data in slowly changing dimensions.
Implemented Slowly Changing Dimensions - Type I & II in different mappings as per the requirements.
Designed SSIS packages using various control flow task like Data Flow Task, For Each Loop, Execute SQL Task. Actively involved in deploying and scheduling packages.
Effectively implemented Data Extraction and Loading from OLTP onto staging with SSIS Packages.
Responsible for designing and developing ASP.NET web pages using ASP.NET, Java script, ADO.Net, SQL, HTML, CSS and AJAX Web services.
Understanding the business and MDM requirements and taking it forward for the Estimation.
Worked with other teams on Facets Data model design and Facets batch processing.
Created, scheduled, and monitored workflow sessions on a run-on-demand or run-at-time basis using Informatica Power Center Workflow Manager.
Used workflow manager for session management, database connection management and scheduling of jobs.
Developed batch scripts, using the PMCMD utility, to automate the execution of OWB (Oracle Warehouse Builder) sessions.
Designed and created Oracle Database objects, tables, Indices, Views, Procedures, Packages, and Functions.
Configured the session so that Power Center Server sends an Email when the session completes or fails.
Debugged and sorted out the errors and problems encountered in the production environment.
Determined various bottlenecks and successfully eliminated them to a great extent.
Extensively worked in the performance tuning of the programs, PL/SQL Procedures and processes.
Wrote and modified Korn Shell scripts to handle dependencies between workflows and log the failure information.
Scheduling and monitoring automated daily and weekly jobs in Maestro.
Developed UNIX Korn shell scripts to execute workflows sequentially.
Involved in performing unit testing and integration testing.
Environment:
Informatica Power Center 8.5/7.1.3,PL/SQL, Unix Korn Shell Scripting, Business Objects, Crystal Reports, Maestro 4.0, Toad, Trillium, IDQ, Oracle 10g/9i, MDM, SSIS, DB2-UDB, SQL Server 2000, XML, Asp.net, Unix AIX 5.2, Windows XP.
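The pmcmd-driven batch execution and sequential workflow scripts mentioned above can be sketched as follows; the integration service, domain, folder, and workflow names are placeholders, not taken from the project:

```shell
#!/bin/ksh
# Sketch: run Informatica workflows sequentially via pmcmd.
# Service, domain, folder, and workflow names are invented placeholders.
INFA_SV="int_svc_dev"
INFA_DOMAIN="dom_dev"
INFA_FOLDER="EDW"

run_wf() {
    # -wait blocks until the workflow completes, so the calls below chain
    # sequentially; pmcmd's non-zero exit code stops the chain on failure.
    pmcmd startworkflow -sv "$INFA_SV" -d "$INFA_DOMAIN" \
        -uv INFA_USER -pv INFA_PASSWD \
        -f "$INFA_FOLDER" -wait "$1"
}

# Only attempt the run where the Informatica client is installed.
if command -v pmcmd >/dev/null 2>&1; then
    run_wf wf_stage_load && run_wf wf_ods_load && run_wf wf_mart_load
else
    echo "pmcmd not available; skipping run"
fi
```

The `&&` chaining gives a simple dependency order without a scheduler; in practice the same dependencies were often expressed in Maestro job streams.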
Client: AmeriCredit (GM Financial), Dallas, TX
Duration: Oct 2012 – Aug 2013
Job Title: Informatica Software Developer
Description:
The AmeriCredit data warehouse team, referred to as the ETL and DQ teams, takes care of support activities related to Informatica processes that are mission-critical for AmeriCredit. The ETL team is responsible for supporting these processes to make sure that issues encountered are addressed and resolved. As part of this work, the ETL team is also responsible for taking care of Remedies, CRs and AS400 migrations. Various enhancements have been made to increase performance. The DQ team is responsible for table load delays, raising conflicts to ETL regarding the table loads, sending mails to users on table load status, and sending the DQ Verifications Daily Status Report.
Responsibilities:
Received knowledge transfer (KT) when the project started on Dec 28th.
Knowledge transfer on various support procedures followed by AmeriCredit.
Studying the various AmeriCredit KT support documents.
Served as backup for both the ETL and DQ (Data Quality) teams.
Worked on Informatica 8.6 creating various mappings, sessions, worklets, and workflows.
Extensively used Informatica Power Center and Talend Open Source and created mappings using transformations like Source Qualifier, Joiner, Normalizer, Aggregator, Expression, Filter, Router, Lookup, Update Strategy, and Sequence Generator.
Performed migrations from AS400 to Informatica: some processes running in AS400 had poor performance and took long hours; these were identified and their AS400 RPG code was migrated to Informatica, where they now run with better performance and complete in acceptable time.
Involved in performance tuning and optimization of Informatica mappings and sessions using features like partitions and data/index cache to manage very large volume of data.
Performed DQ work such as preparing CPM/Everest reports, sending the delay mails, handling conflicts occurring in table loads and mailing the ETL team about them, and sending the DQ daily verifications report. Performed performance tuning of SQL queries and stored procedures using SQL Profiler and the Index Tuning Wizard.
Worked full-time on ETL from July; completed various migrations and change requests from clients and took care of Remedies.
Used various heterogeneous sources like Oracle, flat files and AS400 (via Power Exchange) to develop code in Informatica. Used various transformations like Lookup, Joiner, Expression, Filter, Update Strategy, Router, Sorter, Aggregator and Source Qualifier to develop the code.
Worked on the GM leasing project, in which 797 flat files are loaded into Oracle tables using various worklets, mappings and sessions. Reported post-production issues with the source to the client and monitored the workflows for post-production issues.
Interacted with client to clarify the queries and complete the development.
The full SDLC was implemented in the migration of the CCARE process.
Also involved in adding the missing columns to the XML .xsd source and updating the mapping with those columns.
Sent the daily Informatica support handover to the client at the end of each day, took care of weekly KPIs and documentation changes from clients, sent the monthly job report, and explained the causes of table load delays to the DQ (Data Quality) team in the table load delay email.
Acted as primary offshore support person, a role that rotated on alternate weeks.
Received client accolades for improving the performance of a critical job from 3 hours to 15 minutes, making the table loads ready by the time business users started their reporting. Developed PL/SQL procedures and functions to facilitate specific requirements.
Migrated SQL Server packages into Informatica, after which the SQL Server packages were retired.
Involved in unit testing and user acceptance testing to check whether the data extracted from different source systems loads into the target according to user requirements.
Environment:
Power Exchange 9.5, Oracle 11g/10g, MS SQL Server, XML files, SQL, PL/SQL, Windows XP, UNIX, Informatica 8.6/9.0, SQL Developer, BMC Remedy tool, PuTTY, AS400, Rapid SQL tool.
Client: GE Corporate, Hyderabad, India
Duration: June 2010 to Sep 2012
Project: Financial Data Warehouse
Responsibilities:
Analyze the client requirements and create the technical specifications.
Assessed top-down versus bottom-up approach based on the requirements and situation; decided to go ahead with the top-down approach.
Participate in data warehouse architecture; design the different components of the DWH and process of loading the DWH from different sources.
Optimized the loading process by implementing incremental loading using the change data capture feature of the Oracle source system and the MERGE feature in the Oracle target system.
Involved in discussions and decision making to identify the attributes required to trigger history for slowly changing dimensions; designed and implemented Type 2 slowly changing dimensions.
Designed a robust model to maintain version of balances to be able to report balances before and after each recast/ adjustments.
Analyzed source Oracle GL data and designed mappings from the base GL tables to the attributes of DWH.
Designed cubes in Oracle Analytic workspaces with value-based ragged hierarchies.
Responsible for developing code according to the technical design of the solution when ETL is required.
Work with the Tech Lead to manage the technical implementation of the solution.
Support the operations teams when required.
Develop, test and maintain all ETL maps /scripts and physical data models.
Load data for testing throughout the project lifecycle.
Plan and execute deployments across all environments.
Resolving design/development issues and interacting with infrastructure support partners (DBA, Sys Admins).
Implemented custom transformation using PL/SQL packages for Data reconciliation.
Extensively involved in testing by writing QA procedures for testing the target data against source data.
Environment:
Informatica Power Center 7.1, (Source Analyzer, Data Warehousing Designer, Debugger, Mapping Designer, Mapplet Designer, Transformations), Oracle 10g/11g, Toad, Oracle Discoverer 10g plus, Oracle Discoverer Viewer, SQL, PLSQL.
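The CDC-plus-MERGE incremental load described above can be sketched as a shell wrapper around SQL*Plus; the connection variables, schemas, tables, and columns below are invented for illustration, not the project's actual model:

```shell
#!/bin/ksh
# Sketch of the incremental load: apply a captured delta to the target
# warehouse table with Oracle MERGE (upsert). All names are assumptions.
run_merge() {
    sqlplus -s "$DB_USER/$DB_PASS@$DB_TNS" <<'EOF'
MERGE INTO dwh.gl_balances t
USING stg.gl_balances_delta s
   ON (t.account_id = s.account_id AND t.period_id = s.period_id)
WHEN MATCHED THEN
  UPDATE SET t.balance = s.balance, t.load_ts = SYSDATE
WHEN NOT MATCHED THEN
  INSERT (account_id, period_id, balance, load_ts)
  VALUES (s.account_id, s.period_id, s.balance, SYSDATE);
COMMIT;
EXIT
EOF
}

# Only attempt the load where an Oracle client is installed.
if command -v sqlplus >/dev/null 2>&1; then
    run_merge
else
    echo "sqlplus not available; skipping merge"
fi
```

A single MERGE touches only the changed rows delivered by CDC, which is what makes the incremental path cheaper than a full reload.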
Client: IVY Comptech, Hyderabad, India
Duration: May 2008 – May 2010
Job Title: Associate
Responsibilities:
Involved in unit testing and validating the performance of the application, and interacted with developers to check application performance across departments such as Transaction Customer Support, Poker Customer Support, Risk Management and Credit, Investigations, and Transaction Processing.
Developed ETL transformation specifications indicating the source tables, flat files, data types, transformations required/business rules, target tables, columns, and data types.
Interacted with end users to gather requirements and changes to existing requirements, and documented them accordingly.
Designed the Data Warehousing ETL procedures for extracting the data from all source systems to the target system.
Support of Production, Development and Test Environments of Oracle Databases for multiple applications.
Monitoring and improving performance of daily jobs on Oracle database.
Developed procedures and functions using PL/SQL and developed critical reports.
Environment:
Oracle DB 9i/10g, SQL, PL/SQL, Informatica 7.1, Erwin.