
Informatica Developer

Location:
Guntur, AP, India
Posted:
January 17, 2017


Resume:

Kumar

Email: acx7ye@r.postjobfree.com

Phone: 732-***-****

Professional Summary

Over 8 years of experience as a Sr. ETL/Informatica Developer.

Good experience in the analysis, design, development, testing and implementation of business application systems for the Healthcare, Insurance and Financial domains.

Experience in Business Intelligence solutions using Data Warehouse/Data Mart design, ETL and reporting tools.

Expertise in implementing complex business rules by creating robust Mappings, Mapplets, Sessions and Workflows using Informatica PowerCenter.

Experience in dimensional data modeling with Erwin and Microsoft Visio, the Ralph Kimball and Bill Inmon methodologies, Star and Snowflake schemas, Data Warehouses, Data Marts, and FACT and Dimension tables.

Experience in working with DataStage Manager, Designer, Administrator and Director.

Extensive experience in Extraction, Transformation and Loading (ETL) of data from various data sources into Data Warehouses and Data Marts using Informatica PowerCenter.

Hands-on experience working with scheduling tools like Autosys, Control-M and Maestro, as well as data cleansing tools.

Experience with Informatica Data Quality and Master Data Management products.

Experience in working with relational databases such as Oracle 11g/10g/9i/8.x, SQL Server 2008/2005, DB2 8.0/7.0, MS Access and Teradata.

Designed and developed a process that loaded source to staging and staging to destination, using many SSIS transformations at the staging table level and table partitioning to optimize the load process within SSIS.

Designed parallel jobs using various stages like Join, Merge, Lookup, Filter, Remove Duplicates, Data Set, Lookup File Set, Complex Flat File, Modify, Aggregator and XML.

Good experience in developing complex mappings using transformations like Source Qualifier, Router, Filter, Expression, Sorter, Aggregator, Normalizer, Joiner, Sequence Generator, Connected and Unconnected Lookup, Update Strategy and XML Source Qualifier.

Extensively used SQL and PL/SQL to write Stored Procedures, Functions, Packages, Cursors, Triggers, Views and Indexes in a distributed environment.

Participated in various integration, reporting, migration and enhancement engagements.

Proficient in developing SSIS packages to Extract, Transform and Load (ETL) data into the Data Warehouse from heterogeneous data sources such as Oracle, DB2, Excel, MS Access, CSV and flat files.

Experience in requirements gathering, planning, analysis, design, implementation, unit testing, code migration and production support, with expertise in developing business applications based on RDBMS.

Proficient in the integration of CRM data sources such as Salesforce, with good knowledge of insert/update operations.

Experience in leading teams and a good understanding of onsite-offshore environments.

Experience in the use of Agile methodologies with Scrum sprints.

Strong understanding of UNIX shell scripts and writing SQL scripts for development, automation of ETL processes, error handling and auditing purposes.
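As an illustration of the kind of shell scripting described above, a minimal ETL wrapper combining error handling with an audit trail might look like the sketch below. Everything here is a hypothetical stand-in: `run_load` is a placeholder for the real job launcher (an Informatica environment would typically call `pmcmd`), and the workflow name and paths are invented.

```shell
#!/bin/sh
# Sketch of an ETL wrapper: run a job, capture its outcome,
# and append a pipe-delimited audit record. All names are
# illustrative assumptions, not values from any real project.

AUDIT_LOG="${AUDIT_LOG:-$(mktemp)}"

run_load() {
  # Placeholder for the actual workflow launch (e.g. pmcmd).
  echo "loading $1"
}

run_with_audit() {
  job="$1"
  start=$(date '+%Y-%m-%d %H:%M:%S')
  # Capture all job output so failures can be diagnosed later.
  if run_load "$job" > "/tmp/${job}.out" 2>&1; then
    status=SUCCESS
  else
    status=FAILED
  fi
  # One audit line per run: start time, job name, outcome.
  echo "$start|$job|$status" >> "$AUDIT_LOG"
  [ "$status" = SUCCESS ]   # propagate failure to the caller
}

run_with_audit wf_daily_sales && echo "wf_daily_sales ok"
```

A scheduler (Autosys, Control-M) would then key its success/failure handling off the wrapper's exit code, while the audit log feeds reconciliation reports.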

Worked with cross-functional teams such as QA, DBA and Environment teams to deploy code from development to QA and from QA to the Production server.

Involved in analyzing the source data coming from different data sources such as XML, DB2, Oracle and flat files.

Excellent analytical and problem-solving skills with a strong technical background and interpersonal skills.

Technical Skills

Data Warehousing: Informatica Data Quality, Informatica PowerCenter, IBM WebSphere DataStage and QualityStage.

Reporting Tools: Business Objects.

Databases: Oracle, MySQL, MS SQL Server, Salesforce, Teradata.

Languages/Web: SQL, PL/SQL, SQL*Plus, HTML, ASP, Java, XML, XSD, XSLT, Unix Korn shell scripting, MS Excel.

GUI Tools: SQL*Loader, TOAD, Data Loader.

Modeling Tools: Erwin, Visio.

Environments: HP-UX, AIX 4.5, Solaris 2.x, MS Windows, Windows NT.

Academic Qualification

Bachelor of Technology from Osmania University.

Work Experience

Client: BBVA Compass Bank (Birmingham, Alabama) April 2015 – Till Date

Sr. ETL/Informatica Developer

Responsibilities:

Analyzed the business requirements.

Created shell scripts to fine tune the ETL flow of the Informatica workflows.

Used Informatica file watch events to poll the FTP sites for the external mainframe files.

Provided production support to resolve ongoing issues and troubleshoot problems.

Employed Oracle database to create and maintain Data Marts.

Performed performance tuning at the functional and map levels.

Designed and supervised the overall development of the Data Mart.

Worked extensively with Informatica PowerCenter mappings to develop and feed the Data Mart.

Used relational SQL wherever possible to minimize the data transfer over the network.

Used various transformations like Filter, Expression, Aggregator, Sequence Generator, Source Qualifier, Update Strategy, Joiner, Normalizer, Router, XML Generator, XML Source Qualifier, Connected Lookup, Unconnected Lookup, Stored Procedure and Union to develop mappings in the Informatica Designer.

Effectively used Informatica parameter files for defining mapping variables, workflow variables, FTP connections and relational connections.
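A parameter file of the kind referenced above typically groups values under a folder/workflow/session heading; every name below (folder, workflow, session, connections, file path, dates) is an invented example for illustration, not taken from these projects:

```ini
; Hypothetical Informatica parameter file sketch.
; Section header scopes the values to one session of one workflow.
[Folder_Sales.WF:wf_load_sales.ST:s_m_load_sales]
; $$-prefixed names are mapping parameters/variables.
$$LOAD_DATE=2017-01-15
; $-prefixed names override session-level connections and files.
$DBConnection_Source=ORA_SRC
$DBConnection_Target=ORA_DWH
$InputFile_Orders=/data/incoming/orders.dat
```

Swapping the file per environment (dev/QA/prod) lets the same workflow run unchanged against different connections.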

Involved in enhancement and maintenance activities of the data warehouse, including tuning and modifying stored procedures for code enhancements.

Maintained Data Marts up to date and in accordance with the Company's requirements and practices.

Created Mapplets for reuse in different Mappings.

Effectively worked in an Informatica version-based environment and used deployment groups to migrate objects.

Used stages like Transformer, Aggregator, Sort, Join, Merge, Lookup, Data Set, Funnel, Remove Duplicates, Copy, Modify, Filter, Change Data Capture, Change Apply, Sample, Surrogate Key, Column Generator and Row Generator to create DataStage jobs.

Developed PL/SQL packages, procedures, triggers, functions, indexes and collections to implement the business logic using SQL Navigator.

Involved in creating, maintaining and tuning views, stored procedures, user-defined functions and system functions using T-SQL.

Generated server-side PL/SQL scripts for data manipulation and validation, and materialized views for remote instances.

Designed, implemented and tuned interfaces and batch jobs using PL/SQL.

Involved in designing data warehouses and data marts using Star Schema and Snowflake Schema.

Effectively worked in an onsite-offshore work model.

Created the Shell Scripts and updated the scripts as per the requirement.

Involved in the creation of Job Sequences.

Pre- and post-session assignment variables were used to pass variable values from one session to another.

Used bash, awk, sed and perl to automate most daily activities, such as log monitoring, log rotation and purging, and proactive system monitoring.
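The daily log housekeeping described above can be sketched roughly as follows; the directory layout, file naming and the 7-day retention window are illustrative assumptions, not values from any actual environment.

```shell
#!/bin/sh
# Sketch of daily log housekeeping: monitor, rotate, purge.
# Paths and the 7-day retention window are assumptions.

housekeep_logs() {
  log_dir="$1"
  retention_days="${2:-7}"

  # 1) Proactive monitoring: surface ERROR lines from live logs.
  grep -h 'ERROR' "$log_dir"/*.log 2>/dev/null | awk '{print "ALERT:", $0}'

  # 2) Rotation: keep a timestamped copy, then truncate the live log.
  for f in "$log_dir"/*.log; do
    [ -e "$f" ] || continue
    cp "$f" "$f.$(date +%Y%m%d%H%M%S)"
    : > "$f"
  done

  # 3) Purging: drop rotated copies older than the retention window.
  find "$log_dir" -name '*.log.*' -mtime +"$retention_days" -delete
}

# Demo against a throwaway directory.
demo_dir=$(mktemp -d)
printf 'INFO start\nERROR failed row\n' > "$demo_dir/session.log"
housekeep_logs "$demo_dir"
```

In practice a cron entry would invoke such a function once a day per log directory, and the ALERT lines would be mailed to the support rotation.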

Designed workflows with many sessions using Decision, Assignment, Event Wait and Event Raise tasks, and used the Informatica scheduler to schedule jobs.

Also involved in the creation and scheduling of T-SQL jobs to run daily.

Reviewed and analyzed functional requirements and mapping documents; performed problem solving and troubleshooting.

Performed unit testing at various levels of the ETL and actively involved in team code reviews.

Identified problems in existing production data and developed one time scripts to correct them.

Fixed invalid mappings and troubleshot technical problems in the database.

Environment: Informatica PowerCenter 9.6.0, Control-M, COBOL files, Oracle 11g, TOAD, Unix shell scripting, PuTTY, WinSCP, IBM DB2 8.0, PL/SQL, SQL Server 2012.

Client: PAYPAL (San Jose, California) May 2013 – Feb 2015

ETL Developer

Responsibilities:

Involved in design, development, testing, analysis, requirements gathering, functional/technical specification and deployment.

Worked with the Business Analyst to understand the business and to design the architecture of the data flow.

Developed logical and physical data models that capture current and future state data elements and data flows.

Designed logical and physical models for star schema-based data marts using Erwin.

Tuned the Informatica mappings for optimal load performance.

Used the Teradata utilities FastLoad, MultiLoad and TPump to load data.

Optimized the performance of queries running against the data mart by creating table partitions, indexes and indexed views.

Good knowledge of Teradata Manager, TDWM, PMON, DBQL, SQL Assistant and BTEQ.

Designed and customized data models for a Data Warehouse supporting data from multiple sources (Oracle, DB2, Excel, flat files).

Cleansed and scrubbed the data into uniform data types and formats using Informatica MDM; loaded it to STAGE and HUB tables, then to the EDW, and finally to Dimension tables, rolling up/aggregating the data by business grain into the FACT tables.

Developed MDM Hub Match and Merge rules, Batch jobs and Batch groups.

Created Queries, Query Groups and packages in the MDM Hub Console.

Involved in creating Master Data to populate the Dimension table attributes using the Lookup transformation.

Used DataStage Director and its runtime engine to schedule runs of the solution, test and debug its components, and monitor the resulting executables on a scheduled basis.

Involved in import and export of the DataStage jobs by using DataStage Manager.

Used Erwin for reverse engineering to connect to the existing database and ODS, create a graphical representation in the form of entity relationships, and elicit more information.

Involved in writing BTEQ scripts to transform the data.

Created Informatica mappings for initial load and daily updates.

Designed and developed Informatica mappings to Extract, Transform and Load data into target tables.

Wrote, tested and implemented Teradata FastLoad, MultiLoad and BTEQ scripts, DML and DDL.

Modified mappings as per the changed business requirements.

Created dynamic parameter files and changed session parameters, mapping parameters and variables at run time.

Used Informatica to load data to Teradata by making various connections to load and extract data to and from Teradata efficiently.

Extensively used almost all transformations of Informatica including Lookups, Update Strategy and others.

Developed and delivered dynamic reporting solutions using MS SQL Server 2012 Reporting Services (SSRS).

Using SQL Server Reporting Services (SSRS), delivered enterprise, Web-enabled reporting to create reports that draw content from a variety of data sources.

Extensively worked on Performance Tuning of ETL Procedures and processes.

Extensively used PL/SQL programming in back-end and front-end functions, procedures and packages to implement business rules.

Tested the data and data integrity among various sources and targets. Assisted the Production Support team with various performance-related issues.

Developed mappings for Type 1, Type 2 and Type 3 Slowly Changing Dimensions (SCD) using Informatica PowerCenter.

Worked with Session Logs, Workflow Logs and the Debugger for error handling and troubleshooting in all environments.

Reviewed QA test plans and provided technical support during QA and Stage (UAT) testing.

Created and maintained the shell scripts and parameter files in UNIX for the proper execution of Informatica workflows in different environments.

Environment: Informatica PowerCenter 8.1, Informatica Master Data Management 9.1, Informatica Hub, IBM WebSphere DataStage 8.0.1, Parallel Extender, QualityStage 8.0, TOAD, Visio, Oracle 8i, Erwin r7.1, IBM Mainframes DB2 7.0, Mercury Quality Center, SSRS, UNIX, SQL Server, Teradata R12/R13, Teradata SQL Assistant, PL/SQL.

Client: State Farm Insurance (Chicago, Illinois) Feb 2012 – Mar 2013

Informatica Consultant

Responsibilities:

Involved in performing high-level risk assessments and developing migration strategies to include in project plans.

Assisted with establishing and administering the Informatica environment.

Designed and documented all the ETL Plans.

Worked on Health Care ERP solutions like FACETS and QNXT and other claim adjudication systems.

Worked closely with FACETS 4.48/4.51 and different EDI transaction files like 837, 834, 835, 270 and 271 to understand source structure and source data patterns.

Worked with different data sources such as DB2 tables, flat files and CSV files, and was also responsible for cleansing data in flat files.

Implemented Slowly Changing Dimensions (SCDs, Type 1, Type 2 and Type 3).

Experience with Medicare, Medicaid and commercial insurance in HIPAA ANSI X12 formats, including 270/271, 276/277, 835, 837 and 997, and other NSF formats for interfaces and images to third-party vendor applications.

Worked with Oracle, SQL Server and flat file sources.

Extracted Erwin physical models into repository manager using Informatica.

Involved in writing conversion scripts using PL/SQL stored procedures, functions and packages to migrate the data from the SQL Server database to the Oracle database.

Studied the existing OLTP systems and created facts, dimensions and a star schema representation for the data mart using Erwin.

Used SQL*Loader to first write the target rows to a flat file and then upload the data to the tables in the Oracle database.

Extensive ETL testing experience using Informatica 9.1/8.6.1 (PowerCenter/PowerMart: Designer, Workflow Manager, Workflow Monitor and Server Manager), Teradata and Business Objects.

Involved in writing Stored Procedures and calling them in the Informatica Workflow Manager to drop the data from stage tables.

Responsible for tuning the ETL mappings to achieve high performance.

Responsible for unit testing the developed mappings and involved in UAT sessions.

Involved in migrating the plans from Development to Production and automating the process.

Environment: Informatica 8.6, Oracle 8i/9i, SQL Server 2005, PL/SQL, IBM AIX, UNIX, MS Excel, Erwin 4.5, Autosys, Teradata.

Client: ICICI Prudential (Mumbai, India) July 2010 – Jan 2012

Jr. ETL Developer

Responsibilities:

Developed ETLs to load data from Microsoft SQL Server 2000 into a target Oracle database.

Understood the customer's requirements; performed analysis, design, development and implementation of the system; gathered and defined business requirements; and improved business processes.

Developed simple and complex mappings using Informatica to load dimension and fact tables per Star Schema techniques.

Worked with Static, Dynamic and Persistent Cache in the Lookup transformation for better session throughput.

Extensively used workflow variables, mapping parameters and mapping variables.

Created Tables, Keys and Indexes in SQL Server.

Developed DDLs, DMLs, PL/SQL stored procedures and indexes for ETL operations on Oracle and SQL Server databases.

Worked on different tasks in the Informatica Workflow Manager, like Session, Event Raise, Event Wait, Decision, Email, Command, Assignment and Timer.

Developed reusable Mapplets and Transformations.

Created SQL and PL/SQL scripts for data validation and reconciliation.

Used Stored Procedure transformations to invoke Oracle PL/SQL procedures.

Used Informatica Version Control for checking in all versions of the objects used in creating the mappings and workflows, to keep track of changes in development, test and production.

Identified performance issues in the existing sources, targets and mappings by analyzing the data flow and evaluating transformations, and tuned them accordingly for better performance.

Involved in Informatica code deployments to all the environments.

Environment: Informatica PowerCenter 7.1.1, Erwin 4.0, PL/SQL, Windows 2000, UNIX, Oracle 9i, MS SQL Server 2000.

Client: HCL Technologies Ltd (Hyderabad, India) Jan 2009 – June 2010

ETL Developer

Responsibilities:

Deployed ETL Code to Production environment based on the instructions provided by development team.

Monitored the scheduled jobs in production in 24x7 shifts using the Tivoli tool.

Developed reusable Mapplets and Transformations using Informatica PowerCenter Designer.

Gathered scheduled run statistics and tracked them for metrics purposes.

Used the Informatica Debugger to debug mappings and gain troubleshooting information about data and error conditions.

Fixed failed jobs by analyzing the issue using the session log.

Reran jobs that failed in the production environment.

Made sure that production was up and running and cleaned up unnecessary files on the UNIX box.

Tuned user queries for better performance.

Wrote complex SQL queries using inner, left outer and right outer joins.

Attended daily status calls and provided status to the onsite team.

Environment: Informatica PowerCenter 8.1, Oracle 10g, Tivoli, TOAD, UNIX, SQL Server, PuTTY, WinSCP.


