Sr ETL Developer

Location: Ellicott City, Maryland, United States

Posted: February 23, 2018

Prem K Peddiraju

623-***-****

ac4k7k@r.postjobfree.com

Summary:

** ***** ** ** ********** in the Software Development Life Cycle (SDLC), which includes requirements gathering, design, implementation, and testing.

Over 8 years of technical and functional experience in Decision Support Systems and data warehousing, implementing ETL (Extract, Transform, and Load) using Informatica PowerCenter 9.6/9.1/8.6.

Over 5 years of experience leading multiple ETL projects with teams of 5 to 10 members.

Worked in an onsite/offshore model and led the team.

Experience in UNIX shell scripting (file validations, file downloads, workflow executions).
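
As a minimal sketch of the kind of shell scripting referred to above (the host, directories, and file name below are hypothetical placeholders rather than details from any specific project):

#!/bin/sh
# Minimal sketch: download a source file via sftp, then validate it
# before allowing the downstream ETL workflow to proceed.
# HOST, SRC_DIR, LOCAL_DIR, and FILE_NAME are hypothetical placeholders.

HOST="sftp.example.com"
SRC_DIR="/outbound"
LOCAL_DIR="/data/inbound"
FILE_NAME="customer_feed_$(date +%Y%m%d).dat"

# File download (assumes key-based authentication is already set up)
sftp -b - "etluser@${HOST}" <<EOF
get ${SRC_DIR}/${FILE_NAME} ${LOCAL_DIR}/${FILE_NAME}
EOF

# File validation: the file must exist and be non-empty
if [ ! -s "${LOCAL_DIR}/${FILE_NAME}" ]; then
    echo "ERROR: ${FILE_NAME} is missing or empty" >&2
    exit 1
fi

# Simple record-count check: expect at least a header plus one data record
REC_COUNT=$(wc -l < "${LOCAL_DIR}/${FILE_NAME}")
if [ "${REC_COUNT}" -lt 2 ]; then
    echo "ERROR: ${FILE_NAME} contains no data records" >&2
    exit 1
fi

echo "Validation passed: ${FILE_NAME} (${REC_COUNT} lines)"
exit 0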

Good knowledge of working with Hive, Informatica Data Quality (IDQ), and Data Validation Option (DVO).

Good knowledge of Master Data Management (MDM) concepts and methodologies, and the ability to apply this knowledge in building MDM solutions.

Good knowledge of Python and MongoDB.

Worked on multiple projects across the Retail, Insurance (compliance with HIPAA regulations and requirements), and Investment domains at various client locations in the United States.

Experience in interacting with Business Managers, Product Owners, Analysts, and end users to correlate Business logic and specifications for ETL Development and documenting Source-to-Target Mappings, Unit Test cases and Deployment documents.

Worked on both Waterfall and Agile methodology projects.

Worked on tools like HPQC for defect tracking.

Involved in production support: resolving production job failures and interacting with the operations support group to resume failed jobs.

Interacted with the Business users to identify the process metrics and various key dimensions and measures. Involved in the complete life cycle of the project.

Worked with the team to ensure deliverables on time.

Presented WSR/MSR reports to client for effective effort tracking.

Effectively participated in code verification.

Involved in training graduate trainees from both a technology and a project perspective.

Prepared the project traceability matrix, tracking project status against estimates.

Troubleshot long-running sessions and fixed the underlying issues.

Prepared ETL mapping documents for every mapping, and a data migration document for the smooth transfer of the project from the development environment to the testing environment and then to production.

Involved in unit testing and system testing to verify that data extracted from different source systems and loaded into the targets was accurate and met user requirements.

Worked with the reporting team to help them understand the user requirements for the reports and their measures, and helped them create canned reports.

Migrated repository objects, services and scripts from development environment to production environment. Extensive experience in troubleshooting and solving migration issues and production issues.

Actively involved in production support. Implemented fixes/solutions to issues/tickets raised by user community.

Modified the shell scripts as per the business requirements.

Maintained naming standards and warehouse standards for future application development.

Used Workflow Manager for creating, validating, testing and running the sequential and concurrent sessions.

On-call production support.

Strong experience in data extraction and migration from legacy systems using Informatica.

Good experience in CDC (Change Data Capture) and Informatica PowerCenter Data Analyzer.

TECHNICAL SKILLS

Database: Oracle 10g/9i, SQL Server 2016, Snowflake

OS: UNIX, Windows 2000/NT/XP

Languages: Unix Shell scripts, HTML

ETL Tools: Informatica PowerCenter, Informatica Data Quality, Informatica Power Exchange, DVO, Informatica Cloud

BI Tools: MicroStrategy

Tools/Technologies: Visual Source Safe, IIS

Web: ASP, XML, VBScript, JavaScript

Design: Visio

Certification:

Agile Scrum Master

Professional Summary:

Client Name: T. Rowe Price, MD Nov '17 – Till Date

Role: Informatica Lead/Consultant

Description of the Project: T. Rowe Price is an American publicly owned investment firm. The company offers mutual funds, sub advisory services, and separate account management for individuals, institutions, retirement plans, and financial intermediaries.

This project is the first portion of the build-out of the physical component of the new Distribution Data Model (“DDM”) infrastructure, which will enable the eventual replacement of the existing Enterprise Data Mart (“EDM”) currently in use.

Specifically, this initial design, build, and populate effort will:

1. Design a USI DDM physical data model

2. Design the model to address two complementary requirement sets as most appropriate within immediate timeframes:

1. Address the data needs for transitioning a series of “Interim Reports” (developed in Tableau on an Alteryx data aggregation platform and initially sourced from the existing EDM) to the new DDM.

2. Incorporate other physical data structures needed to support additional potential data sources as made available during the immediate design effort, and otherwise build data structures as identified in the earlier defined USI BI Logical Data Model (with the assumption that these structures will be further defined in later design / build efforts).

Build the data model in a T. Rowe Price evaluation version of the Snowflake cloud-based data warehouse solution.

Responsibilities:

Worked with Informatica Cloud to create source/target connections and to monitor and synchronize data in SFDC.

Worked with Informatica Cloud to create source and target objects and developed source-to-target mappings.

Troubleshot existing code as well as new code as requirements changed.

Responsible for Testing and Validating the Informatica mappings against the pre-defined ETL design standards.

Involved in different Team review meetings.

Led the team in coding, testing, and performance tuning while developing Informatica mappings and workflows.

Extensively built IDQ mappings for customer data cleansing, exported them to PowerCenter, and used them as standard transformations in building the logic.

Identifying ETL specification based on Business Requirements/Mapping Document. Formulating and documenting ETL process design.

Analyzed business requirements and worked closely with the various application and business teams to develop ETL procedures that are consistent across all applications and systems.

Created the ETL framework and reviewed the high-level design document.

Extensively used Informatica debugger to validate mappings and to gain troubleshooting information about data and error conditions.

Created detail ETL Migration Processes for Informatica, Database, Scheduling, O/S and H/W teams.

Created data profiles for the source mappings to trap data-related issues.

Environment: Informatica 9.6, IDQ, Informatica Cloud, Snowflake, UNIX scripts, MS SQL Server 2012, flat files.

Client Name: The Renait Thomas Corporation, AZ Jun '15 – Oct '17

Role: Technical Project Manager/Informatica Lead

Description of the Project: The aim of this project is to convert customer vehicle data from the old fitment guide to the new fitment guide. A customer who comes to the POS provides the make, model, year, and style of his vehicle; internally, a web method calls an Informatica web service that returns his customer information along with the vehicle's dimensions if he is an existing customer, or otherwise returns a new customer ID with the same dimension details. Address Doctor is used to validate his address and suggest nearby stores he can visit to change his tires. A product review (tires) catalog provides a suggested list of tires that fit his vehicle, along with their ratings. If he purchases online, a shipping-details block sends him tracking information. For customer and vehicle cleansing, data quality mappings are scheduled to run on a daily basis. SAP IDOCs and BAPIs are used to load data to CAR (Customer Access Repository).

Responsibilities:

Identifying ETL specification based on Business Requirements/Mapping Document. Formulating and documenting ETL process design.

Extensively used Informatica client and PowerExchange tools to extract data from different sources such as SAP, SAP IDOC, SQL Server, and AS/400 (mainframes).

Troubleshot existing code as well as new code as requirements changed.

Responsible for Testing and Validating the Informatica mappings against the pre-defined ETL design standards.

Involved in different Team review meetings.

Led the team in coding, testing, and performance tuning while developing Informatica mappings and workflows.

Extensively built IDQ mappings for customer and vehicle data cleansing, exported them to PowerCenter, and used them as standard transformations in building the logic.

Used Informatica 9.6 to load data into the SAP CAR system using SAP IDOCs and BAPIs.

Configured Address Doctor, which can cleanse address data worldwide, and enhanced it with modifications during installation.

Involved in implementing the Land Process of loading the customer Data Set into Informatica MDM from various source systems.

Involved in technical support and troubleshooting of the data warehouse application to meet Service Level Agreements (SLAs).

Handled CRs (Change Requests) and enhancements for the existing application and followed the change management process.

Distributed work to the onsite and offshore teams and followed up on it.

Worked in close coordination with SMEs to reverse-engineer database stored procedures and convert the encapsulated business logic into Informatica mappings.

Managed risk, escalated issues as needed, and communicated project status to the Client Services and Technical Implementation teams, including providing reports as requested.

Worked with technical managers to define and schedule the resources required for implementations, upgrades, and other projects as needed.

Escalated as needed to resolve issues and roadblocks, ensuring project deliverables were met in a timely fashion.

Produced monthly project forecasts to ensure the defined scope was delivered and managed to timelines and budget; communicated project status to management and project stakeholders.

Led the team to utilize the Agile process to improve project quality and schedule.

Environment: Informatica 9.6, IDQ, MDM, MS SQL Server 2012, Mainframe, flat files.

Client Name: Toyota Motor Sales, CA Jan '15 – May '15

Role: Informatica Lead/ Data Analyst

Description of the Project: C360 NextGen is a project extended from the existing Endeca-based C360. The objective is to build a Big Data platform that provides a 360-degree view of data sourced from various systems, including unstructured data from social media. 15 systems have been identified as sources for this project. Hive/Impala will be used for data storage, and Parquet is used for columnar data compression. The aim of this exercise is to understand the data anomalies from the various source systems together with business users from all Toyota brands and various functional departments, such as Public Relations, Marketing, and Customer Relations, and to provide data to the C360 Portal.

Responsibilities:

Read through the data, presented the findings, and translated them into an understandable document.

Worked with business SMEs on developing the business rules for cleansing. Applied business rules using Informatica Data Quality (IDQ) tool to cleanse data.

Presented Data Cleansing Results and IDQ plans results to the OpCos SMEs.

Worked with Informatica to load data into various target tables after performing data quality checks.

Documented Cleansing Rules discovered from data cleansing and profiling.

Involved in gaining a complete understanding of the business requirements and in analyzing the sources to be loaded into the Oracle warehouse.

Designed Sources to Targets mappings from SQL Server, Excel/Flat files to Oracle using Informatica Power Center.

Responsible for working with DBA and Data Modellers for project coordination.

Responsible for Detail design of each interface.

Responsible for developing complex Informatica mappings using different types of transformations like UNION transformation, Connected and Unconnected LOOKUP transformations, Router, Filter, Aggregator, Expression and Update strategy transformations for Large volumes of Data.

Developed SDE mappings to load large amounts of data from different sources to the staging area, and SID mappings to load the data from the staging area to the target system.

Implemented Slowly Changing Dimension Type 1 and Type 2 for change data capture using version control.

Responsible for creating Workflows and sessions using Informatica workflow manager and monitor the workflow run and statistic properties on Informatica Workflow Monitor.

Responsible for Defining Mapping parameters and variables and Session parameters according to the requirements and performance related issues.

Created various tasks such as Event Wait, Event Raise, and Email.

Created shell scripts to automate worklets, batch processes, and session scheduling using pmcmd.
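
A minimal sketch of this kind of pmcmd automation is shown below; the integration service, domain, folder, and workflow names are hypothetical placeholders, and credentials are assumed to come from the environment:

#!/bin/sh
# Minimal sketch: start a PowerCenter workflow via pmcmd and check its outcome.
# INFA_SERVICE, INFA_DOMAIN, INFA_FOLDER, and WORKFLOW are hypothetical placeholders.
# INFA_USER is expected to be set in the environment, and the password is read
# from the INFA_PASSWD environment variable via -pv rather than hard-coded.

INFA_SERVICE="IS_DEV"
INFA_DOMAIN="Domain_Dev"
INFA_FOLDER="SALES_DW"
WORKFLOW="wf_load_daily_sales"

pmcmd startworkflow \
    -sv "${INFA_SERVICE}" -d "${INFA_DOMAIN}" \
    -u "${INFA_USER}" -pv INFA_PASSWD \
    -f "${INFA_FOLDER}" -wait "${WORKFLOW}"

RC=$?
if [ ${RC} -ne 0 ]; then
    echo "ERROR: workflow ${WORKFLOW} failed with return code ${RC}" >&2
    exit ${RC}
fi
echo "Workflow ${WORKFLOW} completed successfully"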

Tuning the Informatica mappings to increase the performance.

Wrote test plans and executed them for unit testing, and also supported system testing, volume testing, and user testing. Provided production support by monitoring the processes running daily.

Created data maps using PowerExchange to extract data from mainframes using copybooks.

Environment: Informatica 9.1, IDQ, Oracle 11g, Data Quality, Toad, PL/SQL (stored procedures, packages), Erwin, Mainframe, flat files.

Allstate Investment, IL Mar '13 – Dec '14

Role: Sr. Developer

Description of the Project: The Cross Asset Data Warehouse contains information gathered from multiple source systems into a central data store for various investments such as Taxable Fixed Income, Equities, Derivatives, and Mortgage Loans. XA is the data warehouse built with Oracle as the database on a UNIX server, with Informatica as the ETL tool. All the Informatica jobs are triggered via UNIX and Tivoli. It is used for reporting purposes and sometimes to disseminate data to downstream systems, which run their own analytical processes to produce reports. There are about 60-70 users using 200 reports from BO and 2,200 from MicroStrategy (the ad hoc reporting runs into the thousands), but the IT-generated reports (daily, weekly, and monthly) are probably around 600-700.

Responsibilities:

Responsible for requirement gathering, data analysis, design, development, testing and systems implementation.

Identifying ETL specification based on Business Requirements/Mapping Document. Formulating and documenting ETL process design.

Extensively used Informatica client tools to extract data from different sources such as flat files and Oracle.

Troubleshot existing code as well as new code as requirements changed.

Performed Unit Testing and Created UNIX Shell scripts and provided on-call support during the off hours.

Performed overnight schedule monitoring during critical month-end/quarter-end positions, source system upgrades, and integration/system tests.

Responsible for Testing and Validating the Informatica mappings against the pre-defined ETL design standards.

Involved in different Team review meetings.

Generated weekly and monthly status reports on the number of incidents handled by the support team.

Led the team in coding, testing, and performance tuning while developing Informatica mappings and workflows.

Verified data quality and performed data validation using the Informatica Data Validation Option (DVO) and Data Quality (IDQ) tools.

Environment: Informatica 9.1, Oracle, DVO, Toad, PL/SQL, MS Visio, Windows 2000, UNIX, Mainframes.

Allstate Insurance – Encompass Jan '11 – Feb '13

Role: Sr. Developer

Description of the Project: Allstate sells Encompass Insurance branded property and casualty products exclusively through Independent Agents. In the current environment, the independent-agent-facing reports are obtained from a variety of source data stores. Among them, the Independent Agent Enterprise Data Warehouse (IAEDW), the Actuarial Data Mart (ACTM), and the Agency Data Mart (ADM) are the primary sources for most of the information. There are also significant, localized reports maintained at the business unit level. The result is a disparate reporting environment supported by a variety of underlying technologies, leading to multiple versions of the truth. The recently concluded analytics and reporting strategy and data design project at Encompass laid out a roadmap to consolidate the reporting environment and improve access. The scope of this project is to build and implement the Agency Data Mart.

Responsibilities:

Provide profiling, cleansing and validation reports in preparation for a data migration.

Responsible for requirement gathering, design, development, reviewing, testing and systems implementation.

Led the team in coding, testing, and code migration.

Involved in migrating Informatica ETL application and Database objects through various environments such as Development, Testing, UAT and Production environments.

Assisted change management reviews and impact analysis with the Business team.

Identified the stakeholders and prioritized the task items that could be delivered in the next cycle as part of the engagement.

Involved in various HIPAA claims validation and verification processes to understand the source data.

Worked with healthcare data in the HIPAA ANSI X12 formats: 837, 834, 835, 277, 271, and 270.

Environment: Informatica 9.1, IDQ, SQL, Unix Scripting, Flat Files, Windows NT/2000.

American Express (Amex), AZ Aug '10 - Dec '10

Role: Informatica Sr. Developer

Description of the Project: This project was to migrate the old legacy FIRS system to the new IRIS system and decommission FIRS. Data would be fed into the new system, and key reports would be generated to support R&C activities for those accounts.

Responsibilities:

Responsible for Design and Development of ETL and the reporting application.

Communicated with the users to observe various business rules in implementing the data warehouse.

Modified existing mappings for enhancements of new business requirements.

Checked in and checked out the code, test plans, test cases, and unit test cases.

Created, configured, scheduled, and ran sessions, worklets, and workflows.

Prepared unit test cases and performed unit testing and peer reviews at the end of each phase.

Troubleshot and debugged issues that arose during project execution, providing root cause analysis.

Coordinating with the various levels of production support teams to provide quick analysis and problem resolution through data/code fixes.

Environment: Informatica 8.6, VBScript, Oracle 9i, SQL, PL/SQL.

Kohl's, CA Oct '09 – July '10

Role: Informatica Sr. Developer

Description of the Project: This project was to migrate old legacy (PL/SQL) code to new Informatica code to generate 29 incremental extracts (daily/weekly), mainly for order management/commerce-related data, from the newly built ATG 10 and Sterling databases for a third party of the client, for loading into their e-commerce data marts. Since the source system build was in progress in parallel, day-to-day requirement changes made the work more challenging. The source was Oracle 11g and the targets were flat files.

Responsibilities:

Responsible for Design and Development of ETL and the reporting application.

Modified existing mappings for enhancements of new business requirements.

Checked in and checked out the code, test plans, test cases, and unit test cases.

Created, configured, scheduled, and ran sessions, worklets, and workflows.

Modified existing UNIX shell scripts and created new ones.

Prepared unit test cases and performed unit testing and peer reviews at the end of each phase.

Troubleshot and debugged issues that arose during project execution, providing root cause analysis.

Environment: Informatica 8.6, SQL Server, UNIX scripts.

Merrill Lynch, GA Nov '08 - Sep '09

Role: Informatica Developer

Description of the Project: Merrill Lynch is one of the world's leading financial management and advisory companies, with offices in 36 countries. Through Merrill Lynch Investment Managers, the company is one of the world's largest managers of financial assets. This warehouse project assists the Financial Advisory Center in categorizing its customers based on their portfolio of services, including various types of accounts, personal loans, and geographical area. Using different ad hoc analyses, the warehouse is meant to assist in defining a strategy for each customer category; it is also designed to provide detailed reporting on each lead/advisor/team, as well as to collect and report on performance metrics. The project was executed in phases, building up from extraction and computation logic to dimension models and populating data using Informatica.

Responsibilities:

Analyzed the specifications and identified the source data that needed to be moved to the data warehouse.

Participated in the Design Team and user requirement gathering meetings.

Implemented the Slowly Changing Dimension Type 2 methodology for accessing the full history of accounts and transaction information.

Worked on PowerCenter client tools such as Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer.

Used Debugger to check the data flow in the mapping and made appropriate changes in the mappings to generate the required results.

Developed mappings to load data in slowly changing dimensions.

Created Sessions in workflow manager and monitored it using workflow monitor.

Environment: Informatica 8.6, Oracle 9i, UNIX, Sybase, Windows 2000.

Client: GE Healthcare, Sept '06 - Sept '08

Role: Informatica Developer & Production support

Description of the Project: GE Healthcare is a member of the General Electric group of companies. GEHC is one of the world's leading manufacturers of medical diagnostic imaging equipment, including conventional and digital x-ray, magnetic resonance, ultrasound, nuclear medicine, and other related areas. The business interests of the company are widely dispersed, with clients across the globe. This makes it necessary for the company to maintain information systems that provide reliable service to its clients with optimal response time and efficiency.

Responsibilities:

Designed ETL mappings based on the existing ETL logic.

Involved in the development of Informatica mappings.

Resolved tickets and built UNIX scripts for FTPing files.
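
A minimal sketch of such an FTP script is shown below; the host, directories, file name, and credential variables are hypothetical placeholders:

#!/bin/sh
# Minimal sketch: pull a file from a remote FTP server and confirm it arrived.
# FTP_HOST, REMOTE_DIR, LOCAL_DIR, and FILE_NAME are hypothetical placeholders;
# FTP_USER and FTP_PASS are expected to come from the environment.

FTP_HOST="ftp.example.com"
REMOTE_DIR="/outbound"
LOCAL_DIR="/data/staging"
FILE_NAME="daily_extract.csv"

# -i disables interactive prompting, -n suppresses auto-login, -v is verbose
ftp -inv "${FTP_HOST}" <<EOF
user ${FTP_USER} ${FTP_PASS}
cd ${REMOTE_DIR}
lcd ${LOCAL_DIR}
get ${FILE_NAME}
bye
EOF

# Confirm the file was actually downloaded and is non-empty
if [ -s "${LOCAL_DIR}/${FILE_NAME}" ]; then
    echo "Transfer complete: ${LOCAL_DIR}/${FILE_NAME}"
else
    echo "ERROR: ${FILE_NAME} was not downloaded" >&2
    exit 1
fi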

Prepared problem resolution documents.

Provided 3rd- and 4th-line support for the Enterprise Data Warehouse.

Environment: Informatica 8.1, Oracle 9i, UNIX Scripts.


