
Manager Data

Location:
Baltimore, MD, 21207
Posted:
January 05, 2018


Resume:

Summary:

*+ years of IT experience, including 1 year working as an Oracle Developer and 6+ years as an ETL Informatica Developer across the investment banking, banking, healthcare, telecommunications, mutual funds and retail domains.

Areas of experience include:

Strong experience with ETL implementation using Informatica PowerCenter.

Hands-on experience with Informatica client tools such as Repository Manager, PowerCenter Designer, Workflow Manager and Workflow Monitor, and in extracting data from different source types such as Oracle tables, Teradata tables, SQL Server tables and flat files.

Hands-on experience with the Admin Console.

Good experience in creating different types of mappings for full load, delta load and CDC (change data capture), using various transformations.

Experience in creating Slowly Changing Dimension (SCD) Type 1 and Type 2 mappings.

Experience in implementing batch jobs, working with different types of tasks and using the Informatica Debugger.

Worked extensively on creating mapplets and worklets to reduce mapping complexity and reuse transformations with their logic.

Experience in using different parameters and variables such as mapping variables, mapping parameters, session parameters, workflow parameters.

Experience in migrating code from user folders to project folders using techniques such as exporting and importing objects and creating deployment groups.

Experience in identifying bottlenecks and tuning for better performance.

Worked with different versions of Informatica PowerCenter: 9.6/9.1/8.6.

Experience in writing different types of queries, including complex queries, and working with Oracle joins, ANSI joins, Oracle functions, analytic functions, regular-expression functions, indexes and clusters.

Experience in writing PL/SQL procedures, functions, triggers and packages, and in exception handling.

Good knowledge of UNIX commands and of writing shell scripts for network communication and automation of ETL processes.

Experience in working with scheduling tools such as cron (crontab) and Informatica Scheduler.
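As a sketch of this kind of cron-based scheduling (the script path, log path and schedule below are hypothetical placeholders, not taken from the resume):

```shell
#!/bin/sh
# Hypothetical example: register a crontab entry that runs a nightly ETL
# wrapper script at 01:30 and appends its output to a log file.

CRON_LINE="30 1 * * * /opt/etl/scripts/run_nightly_load.sh >> /opt/etl/logs/nightly.log 2>&1"

# On a real server, append it to the existing crontab without clobbering it:
# ( crontab -l 2>/dev/null; echo "$CRON_LINE" ) | crontab -

echo "$CRON_LINE"
```

The five leading fields are minute, hour, day of month, month and day of week; redirecting stderr into the log keeps failed runs visible.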

Good knowledge of data warehouse concepts, of different schema types such as star and snowflake schemas, and of data modeling, dimensional modeling and normalization.

Good knowledge of SQL Server 2012/2008 R2/2005 and of using SSIS packages to implement ETL.

Experience in documenting mapping sheets, unit test cases and deployment Documents.

Excellent analytical and logical programming skills with a good understanding at the conceptual level, plus strong presentation and interpersonal skills and a strong desire to achieve specified goals.

Worked in different domains including healthcare, banking and mutual funds, with good knowledge of the healthcare payer domain.

Exceptional ability to learn new concepts; hardworking.

Ability to meet deadlines and handle multiple tasks; flexible with work schedules; good communication skills.

Technical Skill:

Methodologies: Agile (Scrum), Waterfall

Databases: Oracle, MS SQL Server

ETL Tools: Informatica PowerCenter, SSIS

Database Languages: SQL, PL/SQL

Scripting: UNIX shell

Scheduling Tools: Autosys, Informatica Scheduler

Operating Systems: Windows, Linux

Productivity Tools: MS Word, Excel, PowerPoint, Outlook

Business Modeling Tools: SharePoint, Microsoft Office

Education:

Bachelor's in Information Technology, India.

Professional Experience:

T. Rowe Price, Owings Mills, MD July 2016 – present

Position: ETL Developer

T. Rowe Price is an American publicly owned global asset management firm that offers funds, advisory services, account management, retirement plans and other services for individuals, institutions and financial intermediaries. The objective of this project was to build a data warehouse to maintain the retirement plans of all customers coming from different clients. The project involved two types of staging-mapping development before data was loaded into the facts and dimensions. In the first type, daily data arrives from two systems, Omni and McCamish, in the form of mainframe files that are read into Informatica through data maps and loaded into an Oracle database (HUB); in the second, data is read from DB2 tables, cleansed, and loaded into Oracle tables. Using the data from these two systems, the facts and dimensions are then loaded. This ETL project was developed using Informatica PowerCenter.

Responsibilities:

Involved in extracting data from different systems such as DB2, mainframe files and Oracle tables using Informatica PowerCenter.

Per the client requirements in the mapping specifications, developed mappings for full and incremental loads using transformations such as Source Qualifier, Expression, Joiner, Filter, Router, Aggregator, connected and unconnected Lookup, and Stored Procedure to load data into the dimension and fact tables.

Worked on different types of Event Facts, Snapshot Facts and dimension tables.

Worked with network communications tools like WinSCP to create different parameter files.

Developed mapplets to populate control tables.

Created mappings that generate parameter files on the server, and command tasks to copy parameter files from one location to another.
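A minimal sketch of this kind of parameter-file generation; the folder, workflow and connection names below are hypothetical placeholders, not the project's actual objects:

```shell
#!/bin/sh
# Hypothetical example: build an Informatica workflow parameter file on the
# server, stamping in the current run date.

PARM_FILE="wf_load_dim_customer.parm"
RUN_DATE=$(date +%Y-%m-%d)

cat > "$PARM_FILE" <<EOF
[PROJECT_FOLDER.WF:wf_load_dim_customer]
\$\$RUN_DATE=$RUN_DATE
\$DBConnection_SRC=ORA_SRC_CONN
\$DBConnection_TGT=ORA_TGT_CONN
EOF

echo "Wrote $PARM_FILE"
```

A command task can then copy the generated file to wherever the session expects to find it.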

Created mapping documents for the mappings I developed.

Developed mapplets to reduce mapping complexity and reuse sets of transformations with their logic.

Worked with session parameters for dynamic relational connections, worked with mapping parameters, mapping variables and workflow variables.

Used different logic in mappings to capture error records coming from source.

Migrated the code from user folder to Project folder.

Developed logic to perform sanity checks and truncate certain tables before data is loaded into the warehouse, as required by the client.

Developed mappings with multiple pipelines, using target load order to run them in the required sequence.

Wrote complex Queries with different joins to get source data.

Performed performance tuning on both the ETL code and the SQL queries to produce results in less time.

Developed code to handle failure recovery whenever a workflow stopped due to fatal errors or Informatica issues.

Performed unit testing to verify whether the expected output was achieved.

Developed integrated workflows for all different mappings developed in the project with different dependencies and link conditions.

Developed tasks such as email tasks to notify the business team if a workflow failed for any reason.

Helped the testing team write test cases and upload them to ALM.

Developed preprocessing mappings to error out if duplicate data arrived from the source systems after the joins were performed.

Scheduled jobs with different dependencies using the Autosys scheduling tool.

Involved in different team talks and was flexible and available when required.

Evicore Healthcare, Franklin, Tennessee May 2014 – June 2016

Position: ETL developer

Evicore healthcare was founded in 1994, with its head office in Bluffton, South Carolina, and branches all over the USA. It addresses the complexity of the healthcare system and is committed to advancing medical benefits management and enabling better outcomes for patients, providers and plans. The company needed a warehouse to maintain patients' information and medical records, to help build a more effective healthcare system, and to support different types of analysis and decision making.

Responsibilities:

Involved in analyzing mapping specifications to understand the given details and the stated business validations.

Involved in extracting data (referred physician details, health insurance details, patient information, types of tests to be done) from various sources such as flat files and SQL Server tables.

Created mappings implementing the validations stated in the ETL spec, using different transformations.

Used the Transformation Developer to create reusable transformations with logic, and used mapplets for reusability.

Developed complex workflows with multiple sessions in sequential and concurrent batches for the mappings, using Workflow Manager.

Made use of different tasks such as Command, Event Wait, Event Raise, Decision and Assignment, and used the Scheduler to schedule workflows as stated in the ETL specification.

Involved in creating migration documents, KT document for production support.

Created Deployment groups for migrating code from user folder to project folder.

Involved in different team talks.

TDS Telecommunications Corp, Nashville, TN Aug 2013 – May 2014

Position: ETL Developer

TDS Telecommunications is the seventh-largest local exchange telephone company in the USA. TDS provides 1.2 million connections to broadband, TV entertainment services, cellular phones and fixed landline phones in more than 150 rural, suburban and metropolitan communities. This project extracts data about cellular and fixed-phone customers from different source systems and loads it into a data warehouse to maintain customer history, support present and future analysis, generate monthly payment bills and identify the golden customer of the day.

Responsibilities:

Worked with different client tools such as PowerCenter Designer, Repository Manager, Workflow Monitor and Workflow Manager.

Extracted data from different types of source files and loaded it into the dimension and fact tables as specified in the ETL spec.

Designed different types of mappings for CDC and delta loads, making use of transformations such as Aggregator, Source Qualifier, connected and unconnected Lookup, Expression, Update Strategy, Sequence Generator, Stored Procedure and many more.

Created mapplets to reduce mapping complexity and reuse sets of transformations.

Created reusable tasks and created different types of batch jobs like sequential and concurrent batches.

Implemented different business validations as per the requirement and captured bad records.

Made use of UNIX network commands for file transfers.

Wrote UNIX shell scripts to automate workflow execution using the pmcmd command.
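A sketch of such a pmcmd wrapper, assuming the standard pmcmd startworkflow syntax; the service, domain, user, folder and workflow names are hypothetical placeholders:

```shell
#!/bin/sh
# Hypothetical example: start an Informatica workflow via pmcmd and check
# its exit status. The command is built into a variable first so it can be
# logged before it is run.

INT_SVC="IS_DEV"
DOMAIN="Domain_Dev"
INFA_USER="infa_user"
INFA_PWD="${INFA_PWD:-changeme}"   # normally read from a secured file
FOLDER="PROJECT_FOLDER"
WORKFLOW="wf_load_dim_customer"

CMD="pmcmd startworkflow -sv $INT_SVC -d $DOMAIN -u $INFA_USER -p $INFA_PWD -f $FOLDER -wait $WORKFLOW"
echo "Running: $CMD"

# Uncomment on a host with the Informatica client installed:
# $CMD
# if [ $? -ne 0 ]; then
#     echo "Workflow $WORKFLOW failed" >&2
#     exit 1
# fi
```

The -wait flag makes pmcmd block until the workflow finishes, so the script's exit status reflects the workflow result.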

Involved in different types of meetings, from requirements gathering to deployment.

Took an active part in pre-production, identified bottlenecks, and tuned mappings and sessions for better performance.

Worked with Informatica scheduler to schedule workflows.

Prepared and reviewed ETL Design Documentation, Mapping Sheet for each mapping developed.

Coordinated with the support team for a couple of weeks after the project's deployment and helped with knowledge transfer.

Renasant Bank, Franklin, TN April 2012 – July 2013

Position: ETL Developer

Renasant Bank is a retail bank in the United States of America with its head office in Tennessee and branches in different locations across the country. It serves around 3 million customers and offers different types of services: opening different types of accounts, issuing credit, debit and reward cards, and providing various loans to individual customers. This project was developed to maintain all customer details and the transactions they made, in order to generate monthly bank statements and support different types of business analysis.

Responsibilities:

Involved in extracting data from different types of source files such as CSV files, Oracle tables and Oracle views.

Worked with UNIX network communication commands to transfer files from the client system to the project folder.

Based on the ETL specification document, developed different types of mappings for full and delta loads using transformations such as Source Qualifier, Expression, Joiner, Filter, Router, Aggregator, connected and unconnected Lookup, and Stored Procedure to load data into the dimension tables.

Developed logic for the given business validations using the cleansing and other built-in functions available in Informatica, as required.

Developed logic to detect bad records and capture them in error log tables.

Developed Mapplets to avoid complexity of mapping and to make use of set of transformations with logic.

Worked with session parameters for dynamic relational connections, worked with mapping parameters, mapping variables and workflow variables.

Created different types of reusable tasks, and created batch jobs, both sequential and parallel, based on session dependencies.

Made use of Informatica Scheduler for Scheduling.

Developed pre-requisite mapping, parameter mapping and complex mappings.

Migrated the code from user folder to Project folder.

Created UNIX scripts to automate, stop and abort workflow execution using the pmcmd command.
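The stop/abort side of such scripts can be sketched with a small helper; the connection details are placeholders, and on a real server the echoed command would be executed rather than printed:

```shell
#!/bin/sh
# Hypothetical example: stop or abort a running workflow with pmcmd.
# 'stopworkflow' lets the Integration Service finish processing data it has
# already read; 'abortworkflow' kills the DTM process after a timeout.

control_workflow() {
    action="$1"     # stopworkflow or abortworkflow
    workflow="$2"
    # Echoed rather than executed so the sketch runs anywhere:
    echo "pmcmd $action -sv IS_DEV -d Domain_Dev -u infa_user -p \$INFA_PWD -f PROJECT_FOLDER $workflow"
}

control_workflow stopworkflow  wf_load_dim_customer
control_workflow abortworkflow wf_load_dim_customer
```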

Performed unit testing to verify whether the expected output was achieved.

Used the Debugger when issues arose after creating a mapping, to trace the flow of data from the source through the transformations and finally into the target.

Was involved in different types of meetings held from requirement gathering to deployment.

Involved in preparing unit test cases, mapping sheets, and migration sheets.

Involved in pre-production; identified different types of bottlenecks by studying session logs and busy percentages, and tuned the mappings for better performance.

Supported the code for a couple of weeks after the project's deployment and helped with knowledge transfer to the support team.

During the support period, worked on tickets and closed them within the given SLAs, fixing any issues that occurred.

Involved in different team talks and was flexible and available when required.

ED’S Supply Company INC, Chattanooga, TN Mar 2011 – April 2012

ED'S Supply Company was founded in 1957 in Nashville, TN. It is primarily focused on the wholesale distribution of air conditioning, heating and refrigeration parts to 30 branch locations across TN, GA and AR. This project was developed for decision support: maintaining historical data and analyzing sales across product categories, subcategories and store regions.

Responsibilities:

Involved in different team talks from requirements gathering to deployment, and in preparing the HLD and LLD.

Worked with different client tools such as Repository Manager, Workflow Manager, Designer and Workflow Monitor. Used the Designer extensively to bring data from different sources (XML files, Oracle tables, SQL Server tables) into the Source Analyzer, and used the Mapping Designer to develop code.

Created mappings using different types of reusable and non-reusable transformations; the most used were Joiner, Lookup, Router, Source Qualifier, XML Parser, XML Source Qualifier and Expression.

Developed slowly changing dimension Type 1 and type 2 mappings.

Made use of different types of parameters and variables, especially for reusability.

Used pre- and post-session SQL to drop and recreate indexes when loading targets in bulk mode.

When the source was a database table, used SQL override in the Source Qualifier and wrote different types of SQL queries using various joins and functions.

Made use of Stored Procedure Transformation to load control table.

Performed unit testing to see whether the expected output was achieved.

Took part in different team talks and was Flexible and available when required.

Heritage Fresh, Hyderabad, India June 2010 - Jan 2011

Position: Oracle Developer

Heritage Fresh is one of the largest grocery retailers in southern India, a unique chain of retail outlets with a total of 102 stores across south India. A database was needed to support day-to-day transactions. As a team member, I was involved in developing the database, which was designed in Oracle, and wrote different types of queries and PL/SQL blocks to implement business logic.

Responsibilities:

Participated in meetings and discussions with functional and Technical Leads to understand the requirements and involved in designing data model.

Involved in preparing PL/SQL requirements and design documents.

Wrote Oracle SQL and PL/SQL code, including cursors, procedures, functions and packages; wrote different queries and subqueries using various Oracle functions; and was involved in table indexing.

Participated in code reviews and ensured the code was written to company standards and policies.

Involved in Query Optimization, debugging and performance tuning.
