
Data Developer

Location:
Cana, VA, 24317
Posted:
June 20, 2016


Email: ***********@*****.***

Mobile: +1-919-***-****

Pradeep Muppalla

Senior ETL Conversion Developer

SUMMARY

10-plus years of IT experience in data warehousing and data integration using Informatica.

Strong experience in data warehousing, data analysis, ETL, reporting tools, testing methodologies, development, maintenance, and documentation.

Strong troubleshooting skills and extensive experience working with complex ETL in data warehouse environments.

Extensively worked with Informatica Tools – Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation Developer and Informatica Repository Manager.

Extensively used Informatica Power Exchange and Power Connect for extracting and loading data between legacy systems and Informatica Power Center.

Experience in OLTP/OLAP System Study, Analysis and E-R modeling, developing Database Schemas like Star schema and Snowflake schema, which are used in relational, dimensional and multidimensional modeling.

Experienced in loading data into Data Warehouse/Data Marts using Informatica, Oracle Scripts.

Extensive knowledge of project management and software development life cycle methodologies.

Involved in full project life cycles, including requirements analysis, design, testing, and implementation phases.

Expertise in design, development, requirements gathering, system analysis, technical documentation and flow charts, team management, test and data management, client relationships, and product delivery.

Extensively used Erwin to design logical/physical data models and for forward/reverse engineering; modeled data warehousing data marts using star schemas and published data model files.

Complete knowledge of the migration process from DEV to QA and QA to PROD environments.

Created TRDs, TDDs, and test scripts for unit testing, integration testing, and user acceptance testing (UAT). Created model diagrams using Visio.

Worked extensively on performance tuning: used OLAP functions and query optimization techniques, and analyzed performance bottlenecks using explain plans.

Involved in Performance tuning of Informatica mappings and sessions.

Extensively used Slowly Changing Dimension (SCD) technique for incremental updating of the targets.

Experience with the Agile SDLC methodology.

Strong technical skills in SQL, PL/SQL.

Experience in UNIX shell scripting.

Involved in writing UNIX shell scripts to run Informatica ETL sessions.

Experienced in coordinating and communicating with outside vendors and offshore resources in support of timelines and IT project deliverables; exposure to multiple functional domains.

Good exposure to process and quality systems, and to working and coordinating with multiple teams.

Experience with the Data Masking and Java transformations.

Set up Managed File Transfer (MFT) to securely send data files.

Experience working with source control tools such as ClearQuest, and with the Release Management team on the end-to-end Work Order process.

Prepared PA, Class D, Class C, and Class B estimates for various projects.

Handled some Informatica administrator activities, such as creating label-based deployments, creating folders, and granting permissions.

Working knowledge of Informatica MDM and Data Hub.

Experience connecting Informatica to Java via JMS messages.

Certified Informatica Developer

Education:

B.E. (Bachelor of Engineering) in Computer Science from Visvesvaraya Technological University, Karnataka.

Technical Summary:

Data Warehousing

Informatica PowerCenter 9.1.1/8.6/8.1/8.0/7.1.2/7.1/7.0, Informatica PowerMart, Informatica Data Quality 8.6/9.1.0, Address Doctor 5.8, Web Services, Informatica PowerConnect for PeopleSoft/Siebel/DB2/SAP, IMS Data (Rx Data, Plantrak, Xponent Plantrak), Syncsort, Trillium 7.6/6.0/5.0 (Converter, Parser, Geocoder, Matcher), Firstlogic, Autosys, Control-M.

Data Modeling

Dimensional Data Modeling, Data Modeling, Star Schema Modeling, Snowflake Schema Modeling, FACT and Dimension Tables, Physical and Logical Data Modeling, Erwin 7.3/7.1/4.1/4.0, Oracle Designer

BI

Business Objects XI/6.5/6.0/5.1/5.0 (Web-Intelligence 2.5, Designer 5.0, and Developer Suite & Set Analyzer 2.0), Cognos Series 7.0/6.0/5.x, Cognos Impromptu, Cognos IWR (Impromptu Web Reports), Cognos Power Play Transformer, SAS, Developer 2000, Crystal Reports

Others

Unix Shell Scripting, SQL, Java, HTML, PL/SQL, SQL Plus, C, Cold Fusion, PERL

Databases

Oracle 11G/10G/9i/8i/8.0/7.x, DB2 8.0/7.0, Teradata V2R5/V2R4, IBM DB2 UDB, MS SQL Server 6.5/7.0/2000, MS Access 7.0/’97

Environment

Sun Solaris, Sun OS, HP-UX, Novell 3.11, MS-DOS 6.22, Windows 2008/2005/2003/NT/XP, Windows 7, IBM-Compatibles, HP9000

Professional Summary:

Credit Suisse, Morrisville, NC Mar '14 – Present

Senior ETL/Database developer

Credit Suisse has launched the IB Client Reference Data program to deliver a Client Reference Data Service across the IB to source, validate, cleanse, control and distribute quality client and related data to consumers. Core components of the Reference Data Solution include front-to-back workflow, client data hub and distribution capabilities. The distribution work stream will create a centrally managed data distribution service for multiple investment banking (IB) consumers and will support the on-going strategic directive of the Client Reference Data program. The initiative will provide client data to and from the operations processing platforms (OPP) and other business systems that require reference data information using a variety of transport mechanisms.

Responsibilities:

Gathering the Business Requirements and preparing the Technical specification.

Performed regular checks that data was replicated to tables in the databases.

Created Oracle triggers to load incremental data to staging tables and audit tables.

Created Oracle views to read data from the stage tables and maintain flags for inserted, updated, and deleted data.
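
A minimal sketch of this capture pattern, assuming hypothetical CLIENT, CLIENT_STG, and CLIENT_AUDIT tables (the actual CCR schema is not shown here):

    CREATE OR REPLACE TRIGGER trg_client_capture
    AFTER INSERT OR UPDATE OR DELETE ON client
    FOR EACH ROW
    DECLARE
        v_op CHAR(1);
    BEGIN
        v_op := CASE WHEN INSERTING THEN 'I'
                     WHEN UPDATING  THEN 'U'
                     ELSE 'D' END;
        -- Stage the changed row for the incremental load.
        INSERT INTO client_stg (client_id, client_name, dml_flag, captured_ts, processed_flag)
        VALUES (NVL(:NEW.client_id, :OLD.client_id), :NEW.client_name, v_op, SYSTIMESTAMP, 'N');
        -- Keep a parallel audit trail.
        INSERT INTO client_audit (client_id, dml_flag, audit_ts, audit_user)
        VALUES (NVL(:NEW.client_id, :OLD.client_id), v_op, SYSTIMESTAMP, USER);
    END;
    /

    -- View read by the mapping; dml_flag drives insert/update/delete routing.
    CREATE OR REPLACE VIEW v_client_delta AS
    SELECT client_id, client_name, dml_flag, captured_ts
    FROM   client_stg
    WHERE  processed_flag = 'N';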

Developed the mappings, workflows to read data from the views to load incremental data to CCR tables and CCR_AUDIT tables.

Responsible for moving the code to UAT and PRODUCTION as per releases.

Applied labels in Informatica to move code from one environment to another.

Loaded the CCR data in a cyclic batch scheduled with the Control-M tool.

Implemented triggers, PL/SQL procedures, and functions, encapsulated in packages, to process each batch run and to maintain and update the run_ids and record counts for all the stage tables.
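
A minimal sketch of such a batch-control package, assuming a hypothetical BATCH_RUN control table and RUN_SEQ sequence:

    CREATE OR REPLACE PACKAGE batch_ctl AS
        FUNCTION  open_run  (p_feed_id IN NUMBER) RETURN NUMBER;
        PROCEDURE close_run (p_run_id IN NUMBER, p_row_count IN NUMBER);
    END batch_ctl;
    /
    CREATE OR REPLACE PACKAGE BODY batch_ctl AS
        -- Register a new run for a feed and return the run_id to the caller.
        FUNCTION open_run (p_feed_id IN NUMBER) RETURN NUMBER IS
            v_run_id NUMBER;
        BEGIN
            SELECT run_seq.NEXTVAL INTO v_run_id FROM dual;
            INSERT INTO batch_run (run_id, feed_id, start_ts, status)
            VALUES (v_run_id, p_feed_id, SYSTIMESTAMP, 'RUNNING');
            COMMIT;
            RETURN v_run_id;
        END open_run;

        -- Mark the run complete and record the processed row count.
        PROCEDURE close_run (p_run_id IN NUMBER, p_row_count IN NUMBER) IS
        BEGIN
            UPDATE batch_run
            SET    status    = 'COMPLETE',
                   end_ts    = SYSTIMESTAMP,
                   row_count = p_row_count
            WHERE  run_id    = p_run_id;
            COMMIT;
        END close_run;
    END batch_ctl;
    /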

Created UNIX shell scripts to run these feed_ids in parallel as workflows and to automate the job run.

Monitored the job runs in the Control-M application web portal.

Created materialized views, refreshed daily, to make data available to downstream applications.
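
For illustration, a daily-refreshed materialized view of this kind might look like the following (object names hypothetical):

    CREATE MATERIALIZED VIEW mv_ccr_client
    BUILD IMMEDIATE
    REFRESH COMPLETE
    START WITH SYSDATE NEXT SYSDATE + 1   -- refresh once a day
    AS
    SELECT client_id, client_name, status, last_update_ts
    FROM   ccr_client;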

Integrated with different downstream applications to process data as messages through Queues in Informatica.

Created wrapper scripts in UNIX to call multiple scripts and run them cyclically or at frequent intervals.

Provided production support for the application, resolving issues.

Environment:

Informatica PowerCenter 8.6/9.1.1/9.1.5, Oracle 11g, PL/SQL, Toad for Oracle 10.6, Control-M 7, UNIX, Windows 7, PuTTY, SSH Tectia, SQL Developer

Horizon BCBSNJ, Newark, NJ July '11 – Feb '14

Senior ETL Conversion Developer

Horizon Blue Cross Blue Shield of New Jersey is New Jersey's largest health insurance company, serving over 3.6 million members. Horizon BCBSNJ is a not-for-profit, tax-paying health insurer headquartered in Newark, New Jersey. ICoE is a centralized area within the Horizon account whose purpose is to provide development services for all Informatica projects at Horizon.

Project Title 1:

Horizon BCBSNJ - Address Validation Web Service Calls

Invoke the Informatica web service, passing the address data elements captured from the AddressValidationZipLookUp service. The AddressValidationZipLookUp service calls one of the two Informatica services, and the choice between them is based on the input request elements passed by clients. Created three address validation web services: 1) suggested list, 2) batch mode, and 3) ZIP lookup.

Responsibilities:

Interacted with the business team to understand business needs and gather requirements.

Prepared requirements documents to achieve business goals and meet end-user expectations.

Created WSDL source and target definitions with address parameters.

Created three Informatica Data Quality mapplets for the three web service calls using the address validation transformation, which returns the address validation status along with geo-return codes, longitude, and latitude.

Extensively worked with the Designer tools: Source Analyzer, Target Designer, Mapping Designer, Mapplet Designer, and Transformation Developer.

Performed the standardization process in Informatica Data Quality, working with components such as DB Source, To Upper, Search & Replace, Word Manager, and CSV Target.

Created mappings and extensively used transformations like Source Qualifier, Filter, Lookup, Expression, Router, Aggregator, and Sequence Generator.

Extensively involved in tuning the mappings, sessions and the Source Qualifier query.

Identified performance issues when calling web services in batch mode with multiple addresses and resolved them by increasing the idle time.

Provided extensive support to teams using the web service calls and compared address results between USPS and Address Doctor; discrepancies were reported to Informatica support and fixed in subsequent Address Doctor updates.

Environment: Informatica PowerCenter 9.1.1, Informatica Data Quality 9.1.0, Address Doctor 5.8, Web Services, SQL Server 2008, WSDL definitions, SOAP requests and responses

Project Title 2:

Pharmacy Inbound – Medco (State of NJ, NJTransit, Suburban Propane) / City of Clifton (benecard) / Caremark (Pfizer, Wyeth)

Horizon is engaged in a strategic effort to expand its capability to acquire data from third-party Pharmacy Benefit Managers (PBMs). Due to the current technology (Micro Focus COBOL), data volume, and a complicated ETL process, some jobs run for long hours: the two daily feeds run for one to three hours each, and the four bi-monthly feeds, on the 1st and 16th of every month, run for forty-four hours. There is a need to accelerate the run times to save time and increase efficiency.

Responsibilities:

Interacted with the business team to understand business needs and gather requirements.

Prepared requirements documents to achieve business goals and meet end-user expectations.

Created mapping documents for the Source-to-ODS, ODS-to-UPH, and UPH-to-IHS mappings.

Created new reusable mapplets and reusable CBR mappings for use across all the Pharmacy feeds.

Extensively worked with the Designer tools: Source Analyzer, Target Designer, Mapping Designer, Mapplet Designer, and Transformation Developer.

Performed the standardization process in Informatica Data Quality, working with components such as DB Source, To Upper, Search & Replace, Word Manager, and CSV Target.

Created PowerExchange data maps to connect to mainframe source files.

Designed mappings to extract incremental data based on the Systemmodstamp methodology.
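
The incremental filter can be sketched as a source query against a control table (table and column names hypothetical; the actual feed tables are not shown here):

    -- Pull only rows modified since the last successful extract.
    SELECT c.*
    FROM   pharmacy_claim c
    WHERE  c.systemmodstamp > (SELECT MAX(last_extract_ts)
                               FROM   etl_control
                               WHERE  feed_name = 'PHARMACY_CLAIM');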

Designed mappings incorporating restart logic and LAV logic.

Created source and target definitions, reusable transformations, mapplets, and worklets.

Created mappings and extensively used transformations like Source Qualifier, Filter, Update Strategy, Lookup, Expression, Router, Joiner, Normalizer, Aggregator, and Sequence Generator.

Extensively involved in tuning the mappings, sessions and the Source Qualifier query.

Identified performance issues in existing sources, targets, and mappings by analyzing the data flow and evaluating transformations, and tuned them accordingly for better performance.

Managed all technical aspects of the ETL mapping process with other team members.

Actively participated in Scrum meetings.

Experienced in writing complex queries.

Prepared scheduling requests, the deployment guide, the ETL runbook, and error-handling documentation.

Set up Managed File Transfer (MFT) to securely send the data files.

Deployed the code into production and provided support until the solution stabilized.

Environment: Informatica PowerCenter 9.1.1, TOAD, Oracle 11g, PL/SQL, SQL Server 2008, DB2, Tivoli, UNIX, UNIX shell scripts, IDQ, PowerExchange 9.1.0

Project Title 3:

Horizon Operation System (HOS) Wave 7 Projects - MA Palmetto, SNP Encounter, LuminX

Horizon Blue Cross Blue Shield of New Jersey (HBCBSNJ) is required by the Centers for Medicare & Medicaid Services (CMS) to replace the Risk Adjustment Processing System (RAPS) model of data collection with the new data model.

Responsibilities:

Worked with SMEs and the requirements-gathering team to gather the requirements and specifications for the client's insurance claims, benefits, and policies data.

Involved in developing High Level Design and Low Level Design Documents.

Defined and implemented the required business transformation rules and logic (ETL development).

Developed mappings to move data from legacy systems to Federated Data Warehouse.

Implemented ETL Balancing Process to compare and balance data directly from source and warehouse tables for reconciliation of data.

Worked with open connect sources like web services and EDI.

Created and worked with generic stored procedures for various purposes, such as truncating data from stage tables, inserting a record into the control table, and generating parameter files.
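
As a sketch, one such generic helper might truncate a named stage table and log a control row (object names hypothetical; DBMS_ASSERT, available in Oracle 10g and later, guards the dynamic SQL):

    CREATE OR REPLACE PROCEDURE prep_stage_table (p_table_name IN VARCHAR2) AS
    BEGIN
        -- Dynamic truncate; SIMPLE_SQL_NAME rejects anything but a plain table name.
        EXECUTE IMMEDIATE 'TRUNCATE TABLE '
                          || DBMS_ASSERT.SIMPLE_SQL_NAME(p_table_name);
        -- Record the action in the control table.
        INSERT INTO etl_control (table_name, action, action_ts)
        VALUES (p_table_name, 'TRUNCATE', SYSTIMESTAMP);
        COMMIT;
    END prep_stage_table;
    /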

Created Expression Data Masking to mask SSN and Phone Number.
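
The masking rule itself lived in an Informatica Expression transformation; expressed in SQL purely for illustration (table and column names hypothetical), it keeps only the last four digits:

    SELECT 'XXX-XX-'  || SUBSTR(ssn, -4)          AS ssn_masked,
           'XXX-XXX-' || SUBSTR(phone_number, -4) AS phone_masked
    FROM   member_stg;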

Wrote and implemented generic UNIX and FTP scripts for various purposes, such as running workflows, archiving files, executing SQL commands and procedures, and moving inbound/outbound files.

Designed and executed test scripts to validate end-to-end business scenarios.

Tuned mappings, procedures and scripts when doing complex data loads for better performance.

Identified and documented data integration issues and challenges such as duplicate, non-conformed, and unclean data.

Created session tasks and managed database connections and scheduled workflows.

Worked closely with Validation team to resolve mapping defects.

Worked with the Control-M tool to run the Informatica jobs.

Developed UNIX scripts as part of automation.

Simplified the development and maintenance of ETL by creating Mapplets, Re-usable Transformations to prevent redundancy.

Resolved complex technical and functional issues/ bugs identified during implementation, Testing and post production.

Involved in many mock runs before the production runs.

Created exception tables to capture the exception records and resolved the exception issues.
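
A minimal sketch of the exception-table pattern, with a hypothetical table and validation rule:

    CREATE TABLE claim_exception (
        run_id     NUMBER,
        claim_id   VARCHAR2(30),
        error_code VARCHAR2(10),
        error_desc VARCHAR2(400),
        logged_ts  TIMESTAMP DEFAULT SYSTIMESTAMP
    );

    -- Divert records that fail a validation rule instead of loading them.
    INSERT INTO claim_exception (run_id, claim_id, error_code, error_desc)
    SELECT 101, c.claim_id, 'E001', 'Member not found in enrollment'
    FROM   claim_stg c
    WHERE  NOT EXISTS (SELECT 1 FROM member m
                       WHERE  m.member_id = c.member_id);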

Conducted thorough unit testing and assisted team with functional and technical testing.

Involved in migrating millions of claim records through multiple production runs.

Solely responsible for development and test environment runs.

Optimized ETL performance.

Coordinated with offshore team for DMCM’s (Data Model Change Management) and DMCR’s (Data Model Change Request) implementation.

Worked on data cleansing and data profiling tools like IDE and data analysis tools like IDQ.

Environment: Informatica PowerCenter 9.1.1, DB2, Oracle 10g, TOAD 8.6.1, PL/SQL, SQL Server 2005, Group1 Enterprise Designer, PowerExchange

Project Title 4:

Dental Migration

Currently, all Horizon dental data is processed by a system called Latron. Horizon's dental data will be processed by the third-party administrator DeCare (Commercial), as Latron is going to sunset after 12/31/2012. Once the migration from Latron to DeCare is complete, data will be sent from DeCare to the Horizon Enterprise Data Warehouse (EDW) ODS. DeCare is planning to send claims, enrollment, provider, and capitation data to the Horizon Enterprise Data Warehouse (EDW) Dental Operational Data Store (ODS). The reports and extracts generated out of Latron today for internal/external business users will be generated from the Horizon Dental ODS going forward.

The creation of the EDW Dental ODS will follow a Master Data Management architecture from the data model perspective, with an Extract, Transform, and Load (ETL) framework for the extraction, transformation, and loading of data.

The ETL framework has four layers for the data load: Staging, Common Data Type (CDT), Common Record (CRF), and Target. All extracts are generated out of the Target layer.

Responsibilities:

Worked with SMEs and the requirements-gathering team to gather the requirements and specifications for delivering the solution to the NMS and NASCO outbound enrollment module.

Translated business requirements to technical requirements and design documents.

Involved in developing High Level Design and Low Level Design Documents.

Created mapping documents for the Staging, CDT, CRF, and Target layer mappings.

Worked on transformations like the SQL transformation and Lookups with dynamic caches.

Defined and implemented the required business transformation rules and logic (ETL development).

Developed mappings to move data from legacy systems to Federated Data Warehouse.

Implemented ETL Balancing Process to compare and balance data directly from source and warehouse tables for reconciliation of data.

Performed column profiling in Informatica Data Explorer.

Worked with open connect sources like web services and EDI.

Created and worked with active Lookup transformations.

Wrote and implemented generic UNIX and FTP Scripts for various purposes like running workflows, archiving files, to execute SQL commands and procedures, move inbound/outbound files.

Designed and executed test scripts to validate end-to-end business scenarios.

Tuned mappings, procedures and scripts when doing complex data loads for better performance.

Identified and documented data integration issues and challenges such as duplicate, non-conformed, and unclean data.

Created session tasks and managed database connections and scheduled workflows.

Worked closely with Validation team to resolve mapping defects.

Worked with the Tivoli tool to run the Informatica jobs.

Developed UNIX scripts as part of automation.

Simplified the development and maintenance of ETL by creating Mapplets, Re-usable Transformations to prevent redundancy.

Resolved complex technical and functional issues/ bugs identified during implementation, Testing and post production.

Involved in many mock runs before the production runs.

Created exception tables to capture the exception records and resolved the exception issues.

Conducted thorough unit testing and assisted team with functional and technical testing.

Solely responsible for development and SQA support.

Prepared migration document to move the mappings from development to testing and then to production repositories.

Environment: Informatica PowerCenter 9.1.1, DB2, Oracle 10g, UNIX, Tivoli, MFT

WellPoint – EPDS, CT Feb '10 – May '11

Senior ETL Conversion Developer

EPDS is an initiative from WellPoint to centralize all provider data that resides in their different territories. It is a system that helps to better manage provider databases. It enables the entry of provider data once into a central repository, thus streamlining and standardizing WellPoint’s business processes. EPDS is divided logically into EPDS front end (Java), middle tier (Informatica) and backend (Mainframe) processes.

EPDS is a web-based application that enables PDO users to store, view, and maintain provider information. By providing a single, standard mechanism for managing and accessing WellPoint's provider information, EPDS delivers many direct benefits, such as easy navigation, better data management, and less claim rework.

SCOPE:

The current scope of the Enterprise Provider Database Solution (EPDS) project is to capture provider information into a central repository and to streamline and standardize provider information across WellPoint's business processes.

Responsibilities in delivering the solution to our customer:

Interacted with the onshore team to gather the business requirements.

Developed ETL (Extract, Transform, Load) mappings from the STD_TRN_PRE_EPDS stages.

Created mappings with various transformation logics.

Developed a Java transformation as part of business rules validation.

Coordinated daily with the onsite-offshore team to develop source-to-target mappings and identify transformation rules.

Worked on transformations like the SQL transformation and Lookups with dynamic caches.

Implemented performance tuning techniques.

Involved in Unit Testing and prepared documents for Test Results and Review Logs.

Developed UNIX scripts for scheduling workflows.

Monitored jobs.

Environment: Informatica PowerCenter 8.6, DB2, PL/SQL, BO, Control-M, UNIX, IBM Optim, Mainframes, Windows NT

Workers' Compensation – Insurance, UK Oct '08 – Jan '10

Informatica Developer

Ovation Claims (V7.3.1) is a claims management application developed by Cambridge Solutions to enable workers' compensation claim processing in line with US regulations and processing requirements. The product is currently being enhanced to suit similar requirements for Australia, for the Australian SBU, Cambridge Australia.

Cambridge Australia currently uses an application called ASWIG to cater to its business needs. ASWIG is a product developed and supported by Employers Mutual and consists of two applications, Underwriting and Claims; the data for both applications is stored in a common SQL Server database. This data needs to be migrated to the newly built Ovation data warehouse, from which reports are designed.

Interacted with the business analyst team to gather the business requirements.

Responsible for loading CSV files into Oracle tables using mappings.

Worked on Informatica partitioning for flat files.

Developed shell scripts for handling flat files.

Generated many mappings with the most suitable transformations, and created the mappings for one-time dimension loading from flat files.

Responsible for the new CRs raised against the migration module.

Used various transformations, such as connected and unconnected Stored Procedures and connected and unconnected Lookups.

Worked on session properties such as constraint-based loading and session parameters.

Worked on lookup overrides to join more than two tables to tune the output.
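
The kind of SQL override placed on a Lookup transformation to join three tables into one lookup source (table names hypothetical); in PowerCenter the override's ORDER BY must cover the lookup condition ports, with the generated ORDER BY suppressed by a trailing comment:

    SELECT p.policy_id,
           h.holder_name,
           c.claim_status
    FROM   policy p
    JOIN   holder h ON h.holder_id = p.holder_id
    JOIN   claim  c ON c.policy_id = p.policy_id
    ORDER BY p.policy_id --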

Reviewed mapping and procedure logic and altered mappings according to CRs.

Created a few master-detail reports using Cognos for CRs.

Environment: Informatica 8.1.1, Cognos 8, Oracle 9i, SQL Server

AUTOPPK, Singapore Sep '07 – Oct '08

Informatica Developer

The objective of the Autoppk project is to migrate the existing Autoppk database and application servers to the DCC Oracle PROD servers and calculate PPK. The databases hold only HP inkjet printer manufacturing data. The entire DTS package and Hobbescpk procedure logic are being migrated to Informatica mappings, and one detail table and one summary table are being designed to replace the individual tables in the different DB instances, such as Hobbes base, Laserdb, and Tarzan base.

The ASP-based Autoppk calculation page is to be replaced by BO reports:

All existing reports are to be migrated to the BOSS environment, in which the database will be linked to the BOSS universe.

Interacted with the business analyst team to gather the business requirements.

Reviewed the DTS package and procedure logic and prepared the design document.

Generated many mappings with the most suitable transformations, and created the mappings for one-time dimension loading.

Created mappings to load the detail table from three databases (Oracle, SQL Server, Informix) and implemented CDC logic using a mapping variable.
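
A sketch of the CDC filter in the source qualifier: $$LAST_EXTRACT_TS is a persisted Informatica mapping variable, advanced each run with SETMAXVARIABLE in the mapping (table and column names hypothetical):

    -- Source-qualifier SQL override; Informatica substitutes the mapping
    -- variable before the query runs.
    SELECT printer_id, line_id, sample_size, defect_count, measure_ts
    FROM   ppk_detail
    WHERE  measure_ts > TO_DATE('$$LAST_EXTRACT_TS', 'YYYY-MM-DD HH24:MI:SS')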

Created a mapplet for the three instance databases and a reusable unconnected Lookup.

Created a mapping to load the fact table that calculates the process performance indicator.

Created a universe in BO XI and resolved loops by creating contexts and alias tables.

Generated summary and detail reports in the BOSS environment.

Created hyperlinks to navigate from summary to detail reports.

Created cascading prompts while designing the universe.

Environment: Informatica 7.1.1, BO XI, Oracle 9i, SQL Server, Informix, Autosys

Human Capital Management System, CT May '05 – Aug '07

Informatica Developer

This automated system handles the HRO (Human Resources Operations) of various clients, such as Mellon, Exxon, Amex, and Tenneco. It is divided into four modules: H&W Others, HR Ongoing, Pension Payroll Ongoing, and ANSI. The process covers implementation, maintenance, and execution, with the maintenance and development work equally balanced.

Interacted with the business analyst team to gather the business requirements.

Reviewed the business requirements document and shared the work among the team's developers based on complexity.

Generated many mappings with the most suitable transformations and mapplets, taking the necessary care over performance.

Worked with the Informatica PowerCenter tools: Mapping Designer, Workflow Manager, Repository Manager, and Workflow Monitor.

Involved in designing mappings with Expression and Filter transformations to perform data cleansing and scrubbing.

Developed Transformation logic and designed various complex Mappings in the Designer and tuned them for better performance.

Involved in designing a mapping that maintains full history using SCD Type 2.
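
Expressed as SQL purely for illustration, the SCD Type 2 pattern expires the current row and inserts a new version (dimension and staging table names hypothetical):

    -- Expire the current version of any changed row...
    UPDATE employee_dim d
    SET    d.current_flag  = 'N',
           d.effective_end = TRUNC(SYSDATE)
    WHERE  d.current_flag = 'Y'
    AND    EXISTS (SELECT 1 FROM employee_stg s
                   WHERE  s.employee_id = d.employee_id
                   AND    s.dept_code  <> d.dept_code);

    -- ...then insert the new version with an open-ended effective range.
    INSERT INTO employee_dim
        (employee_id, dept_code, effective_start, effective_end, current_flag)
    SELECT s.employee_id, s.dept_code, TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
    FROM   employee_stg s
    WHERE  NOT EXISTS (SELECT 1 FROM employee_dim d
                       WHERE  d.employee_id  = s.employee_id
                       AND    d.current_flag = 'Y');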

Used various transformations, like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Router, Aggregator, and Lookup, in mappings in the Informatica PowerCenter Designer.

Environment: Informatica 7.1.1, Oracle 9i, Windows XP


