Data Developer

Location:
North Chicago, IL
Posted:
April 05, 2021

Resume:

SKILLS SUMMARY:

**+ years of experience in the design and development of data warehouses using the ETL tools Informatica, IICS (Informatica Intelligent Cloud Services), and Teradata.

Good hands-on experience in Salesforce integration using advanced Informatica transformations such as Salesforce Lookup and Salesforce Merge.

Good hands-on experience in creating Informatica web services using advanced transformations such as Web Service Consumer, SQL Transformation, advanced multi-group Source Qualifier, and advanced XML transformations.

Extensive experience in creating SOLR indexes exposed as SOLR web services and integrated with TIBCO/Apigee URLs, surfaced as a website search option through API calls.

Good hands-on experience with Reltio MDM, loading and mastering Customer and Account data by applying match and merge rules.

Good hands-on experience in writing Teradata BTEQ scripts.

Expertise in building an end-to-end Customer-to-Territory Assignment and Rep/User-to-Territory Alignment solution using the Javelin tool.

Extensive experience in writing UNIX shell and Perl scripts, automating ETL processes with shell scripting, and sending automated exception emails via shell scripts.

Good experience working in Life Science and Financial domain projects.

Extensive experience in Informatica Power Center Repository Manager, Designer (Source Analyzer, Target Designer, Transformation Developer, Mapplet Designer, Mapping Designer), Workflow Manager (Task Developer, Worklet Designer, Workflow Designer), and Workflow Monitor.

Involved in the complete Software Development Life Cycle (SDLC) of projects, with experience in domains such as Life Science and Financial.

Worked on Exception Handling Mappings for Data Quality, Data Profiling, Data Cleansing, and Data Validation.

Experience in Data Warehouse/Data Mart, OLTP, and OLAP implementations, spanning project scoping, requirements gathering, analysis, data modeling, effort estimation, ETL design, development, unit testing, system testing, implementation, and production support.

Strong experience in dimensional modeling using Star and Snowflake schemas, identifying facts and dimensions, and physical and logical data modeling using erwin and TOAD.

Expertise in working with various data sources such as Salesforce, Oracle, Teradata, Excel, Flat files, and XML files.

Expert in writing SQL and PL/SQL, including procedures, functions, and triggers.

Experience in interpreting high-level design documents, creating mappings using Informatica Designer, and building processing tasks in Workflow Manager to move data from multiple sources into targets.

Experience in preparing Technical design document, Source to Target Data Mapping document, DDD (Detailed Design Definition), UTD (Unit Testing Document), SIQ (Software Installation Qualification), OPS Guide & Schedule-18 documents.

Experience working closely with business teams to gather requirements.

Involved in testing, debugging, bug fixing, and documentation of the system. Participated in design and code reviews and verified compliance with the project plan.

Coordinated with business users for user acceptance testing, sign-off, and implementation.

CERTIFICATIONS:

Informatica 8.6 Certified Developer.

Teradata V2R5 Certified Developer.

TECHNICAL EXPERTISE:

Data Warehousing

Informatica Power Center 10.x/9.x/8.x/7.x, IICS (Informatica Intelligent Cloud Services), Teradata, Informatica Web Services, Apache SOLR web services, Reltio MDM, Snowflake.

CRM

Salesforce (SFDC)

Data Modeling

erwin, TOAD

Databases

Teradata, Oracle 12.7/11i/10g/9i/8i

Languages

SQL, PL/SQL, Unix Shell Script, PERL Scripting

GUI Tools

TOAD, SQL Developer, Teradata SQL Assistant

Operating systems

Windows & UNIX

Reporting Tools

QlikView & BODI

Scheduling Tools

Autosys, UC4

Versioning

MKS

Education Summary:

Master of Technology (M.Tech.) from the National Institute of Technology (NIT), Warangal, India, in 2004.

Professional Summary:

Name of the Employer | Designation | From | To
Cognizant Technology Solutions US Corp | Sr Associate | 05/09/2016 | till date
Cognizant Technology Solutions India | Sr Associate | 10/04/2010 | 05/06/2016
NESS Technologies | Software Engineer | 11/03/2008 | 06/25/2010
Merrill Lynch India (Bank of America) | Associate Consultant | 10/29/2007 | 09/05/2008

Professional Experience:

AbbVie Oct’10 – till date

North Chicago, IL

Sr ETL Developer

Client Profile:

AbbVie is a global, broad-based health care company devoted to discovering new medicines, new technologies and new ways to manage health. Our products span the continuum of care, from nutritional products and laboratory diagnostics through medical devices and pharmaceutical therapies. Our comprehensive line of products encircles life itself - addressing important health needs from infancy to the golden years.

AbbVie has sales, manufacturing, research and development, and distribution facilities around the world, close to where our customers need us to be. We are recognized for our global reach and our ability to serve our customers around the world.

Key Project Descriptions:

1.ICDS Replication: This interface replicates all iRep (Salesforce Veeva CRM application) objects into an Oracle database using ICDS (Informatica Cloud Data Service) replication tasks scheduled to run every 6 hours. The tasks run near real time with update-else-insert logic. The ICDS tasks also alter the Oracle DDL automatically when a change is found in the source iRep object definition, which saves time and cost by avoiding frequent releases (a sketch of the update-else-insert pattern follows this project list).

2.iRep Call Activity: This interface reads iRep Call Activity information (activity between AbbVie medical representatives and HCPs: Health Care Professionals, or HCOs: Health Care Organizations), such as Call Header, Call Detail, Call Sample, Call Attendee, Call Key Messages, and Call Items Dropped, as delta data from ICDS and loads it into the ODS (Operational Data Store) Oracle database, implementing the business requirements with exception-reprocessing logic. The ODS Call Activity data is then audited, and the call-complete flag is set to True on audit success. ODS commercial call data is pulled into Stage and then into the SMDW (Sales and Marketing Data Warehouse) history tables using Type 2 SCD. Active records are read from SMDW history and loaded into the base tables, which are the source for analytical teams such as Qlik, MicroStrategy, and DSL to generate dashboard reports like the KPI SFA dashboard (a Type 2 SCD sketch follows this project list).

3.DPR (Drug promotion repository) & DPR FDA Reports:

Collecting hand-sample product disbursement data from internal source systems such as iRep, and ESR (Electronic Sample Request) related sample disbursement data from external fulfillment vendors such as PSI and Knipper, who are responsible for reconciling the delivery of prescription drug samples. The collected sample disbursement data is verified for accuracy and stored in a local DPR repository to generate the yearly FDA XML report, which is audited by the FDA every year.

4.RAR (Request A Rep): As part of the Allergan integration with AbbVie, we designed a new interface called RAR, which reads (every 30 minutes) requests submitted by HCPs asking to be contacted by Allergan/AbbVie reps to learn more about the products. We read and validate all requests and load the HCP request information into the RAR iRep object, which triggers alerts to the reps to contact the HCPs.

5.KPI QlikView Dashboard: Interface created to build the QlikView KPI (Key Performance Indicators) dashboard, tracking mastered HCP/HCO turnaround time, in-flight HCP/HCO turnaround time, the MRD (Moved, Retired & Deceased) metric, the count of non-prescribers with unknown ABS, and mastered and in-flight Change Requests (CRs). The interface's ETLs source all KPI data from ICDS-replicated objects.

6.1Connect User to Territory Alignment Load Interface: The interface reads the daily Javelin JAMS JRM User-to-Territory Alignment data files, loads them into ODS, and then creates/updates the user-to-territory alignment (New Hire, Rehire, Territory Transfer, etc.) in iRep.

7.1Connect Customer to Territory Assignment Load Interface: The interface extracts all Javelin source files (Customer, Customer Address, Customer XREF, Customer Identifier, Call Activity, Call Plan, Product XREF, Approved and Excluded Products, Local Adds) from SMDW and sends them to ZS. ZS computes the assignments using Javelin and publishes extracts back to AbbVie. We validate and consume the Customer-to-Territory Assignment files and load them into ODS and then into iRep.

8.DAL to iRep Account and Address Load: Reltio publishes Customer and Account data; we read the Reltio Customer, Address, Identifier, License, Communication, Specialty, and Merge data and load it into the DAL. We then identify accounts with SFA_FLAG set to True and insert/update the Account and Address in iRep to assign an iRep ID. This syncs back to Reltio through TIBCO real-time replication after the address is validated using the Address Doctor web service.

9.PSI Inbound/Outbound Interface:

The interface reads Electronic Sample Request (ESR) information raised by physicians during call activity with an AbbVie rep, or sample order requests submitted through the iPad Salesforce application, and loads the data into the ODS ESR tracking table. We then extract and create the ESR vendor header and detail files sent to AbbVie's sample order fulfillment vendor PSI (Patheon), scheduled for delivery twice daily, to initiate FedEx shipment of samples to the HCP address.

PSI sends back a sample order status file with the FedEx tracking number and the fulfillment status (Shipped/Delivered/AOC Received, etc.), which is updated back in iRep.

10.ePASS:

ePASS was created to replace the existing Program Tracking System (PTS). PTS is an application that provides users the ability to budget and track expenditures for PPD product promotion. The same functionality is now performed in the ePASS system with the integration of the Finance Management Module.

The ePASS system requires SAP data integration and interfaces in order to perform functionality and reporting similar to what is done in PTS today. The SAP data feeds currently sent to PTS are used to integrate with ePASS; no changes to the files are required.

The interface loads the SAP data (ACR invoice, JV & PO) into the ePASS staging tables and triggers the ePASS web services to load the ACR invoice, JV, and PO data into the ePASS backend tables.

ePASS: Electronic Process Accountability and Strategy System

ACR: Administrative Check Request

JV: Journal Voucher

PO: Purchase Orders
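
As a rough illustration of the update-else-insert logic described in project 1, the sketch below shows the general Oracle MERGE pattern. The table and column names (IREP_ACCOUNT, IREP_ACCOUNT_STG, SFDC_ID, etc.) are hypothetical; in the actual interface the upsert is performed by the ICDS replication tasks rather than hand-written SQL.

    -- Minimal update-else-insert sketch against the Oracle replica (illustrative names only).
    MERGE INTO irep_account tgt
    USING irep_account_stg src
       ON (tgt.sfdc_id = src.sfdc_id)
    WHEN MATCHED THEN
      UPDATE SET tgt.account_name     = src.account_name,
                 tgt.last_modified_dt = src.last_modified_dt
    WHEN NOT MATCHED THEN
      INSERT (sfdc_id, account_name, last_modified_dt)
      VALUES (src.sfdc_id, src.account_name, src.last_modified_dt);
    COMMIT;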

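The Type 2 SCD load into the SMDW history tables (project 2) follows the usual expire-then-insert pattern sketched below. The table and column names (SMDW_CALL_HIST, CALL_STG, SRC_HASH, etc.) are hypothetical; the production load is built as Informatica mappings rather than hand-coded SQL.

    -- Step 1: close the current history row when a changed version of the record arrives.
    UPDATE smdw_call_hist h
       SET h.eff_end_dt   = SYSDATE,
           h.current_flag = 'N'
     WHERE h.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM call_stg s
                    WHERE s.call_id  = h.call_id
                      AND s.src_hash <> h.src_hash);

    -- Step 2: insert new and changed records as the active version.
    INSERT INTO smdw_call_hist
        (call_id, call_status, src_hash, eff_start_dt, eff_end_dt, current_flag)
    SELECT s.call_id, s.call_status, s.src_hash, SYSDATE, DATE '9999-12-31', 'Y'
      FROM call_stg s
     WHERE NOT EXISTS (SELECT 1
                         FROM smdw_call_hist h
                        WHERE h.call_id      = s.call_id
                          AND h.current_flag = 'Y'
                          AND h.src_hash     = s.src_hash);

    -- Base tables for Qlik/MicroStrategy are then refreshed from rows with current_flag = 'Y'.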

Responsibilities:

Participated in gathering, analyzing, and documenting business requirements, functional requirements and data specifications.

Interacted with Business Analysts to gather and analyze the Business Requirements.

Analyzed the source data and coordinated with the Data Warehouse Architect in developing the relational model. Designed and developed logical and physical models to store data retrieved from other sources, including legacy systems.

Used Informatica Designer to create mappings using different transformations to move data from source to a Data Warehouse.

Designed and developed mappings in Informatica using transformations such as Salesforce Lookup, Salesforce Merge, Web Service Consumer, XML Parser, XML Generator, Lookup, Joiner, Source Qualifier, Expression, Aggregator, Router, Rank, Filter, and Sequence Generator.

Used IICS (Informatica Intelligent Cloud Services) to create new cloud tasks that replicate iRep data into Oracle.

Created Update Strategy and Stored Procedure transformations to populate targets based on business requirements.

Created and configured workflows, worklets, and sessions to move data into the target warehouse tables using Informatica Workflow Manager.

Scheduled the sessions to extract, transform and load data into warehouse database as per Business requirements. Improved the session performance by implementing session partitioning.

Used various Index techniques to improve the query performance.

Wrote UNIX shell scripts to handle exceptions, trigger alert emails to the support and business teams, and verify the count of records added each day by the incremental load to the base tables in order to check consistency.

Implemented Type 2 Slowly Changing Dimension methodology to retain the full history of account and transaction information.

Implemented performance tuning of mappings by identifying bottlenecks in sources, targets, and transformations.

Developed PL/SQL procedures and functions used in ETL data loads (see the PL/SQL sketch after this list).

Developed QlikView reporting Dashboards like KPI using different ICDS replicated objects.

Involved in preparing the High Level Design (HLD), Low Level Design (LLD), and Unit Testing Document (UTD) for the Informatica ETL process.

Developed several reusable Informatica jobs shared across multiple processes.

Designed and developed complex mappings that involved Slowly Changing Dimensions, Error handling, Business logic implementation.

Used Autosys for running and monitoring the daily/weekly scheduled loads.

Involved in Version Control of Repository Objects using Check-in and Check-out with MKS Tools.

Utilized mapping variables, mapping parameters, workflow variables, and session parameters in the delta process to extract only the data added since the previous run (see the delta-filter sketch after this list).

Used SOQL queries to retrieve data from iRep Salesforce objects via SQT (Salesforce Query Tool) and Workbench.

Worked with mappings to dynamically generate parameter files used by other mappings.

Responsible for monitoring all sessions that were running, scheduled, completed, or failed. Debugged the mappings of failed sessions during Hypercare after go-live.
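
A minimal sketch of the kind of PL/SQL procedure used in the ETL data loads referenced above; the procedure, table, and column names (LOAD_CALL_SUMMARY, ODS_CALL, ETL_AUDIT_LOG, etc.) are hypothetical.

    -- Hypothetical post-load procedure: rebuilds a daily summary and logs the row count.
    CREATE OR REPLACE PROCEDURE load_call_summary (p_load_dt IN DATE) AS
      v_rows NUMBER;
    BEGIN
      DELETE FROM call_summary WHERE load_dt = p_load_dt;

      INSERT INTO call_summary (load_dt, territory_id, call_cnt)
      SELECT p_load_dt, territory_id, COUNT(*)
        FROM ods_call
       WHERE TRUNC(call_dt) = p_load_dt
       GROUP BY territory_id;
      v_rows := SQL%ROWCOUNT;

      INSERT INTO etl_audit_log (proc_name, load_dt, row_cnt, run_ts)
      VALUES ('LOAD_CALL_SUMMARY', p_load_dt, v_rows, SYSTIMESTAMP);
      COMMIT;
    EXCEPTION
      WHEN OTHERS THEN
        ROLLBACK;
        RAISE;
    END load_call_summary;
    /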
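
The delta extraction noted above typically shows up as a filter in the Source Qualifier SQL override, with the mapping variable refreshed from the parameter file after each successful run. The object and column names below are hypothetical, and $$LAST_EXTRACT_TS is an assumed variable name.

    -- Hypothetical Source Qualifier SQL override: pull only rows changed since the last run.
    -- $$LAST_EXTRACT_TS is an Informatica mapping variable supplied through the parameter file.
    SELECT call_id,
           call_status,
           last_modified_dt
      FROM ods_call
     WHERE last_modified_dt > TO_DATE('$$LAST_EXTRACT_TS', 'YYYY-MM-DD HH24:MI:SS')

After a successful run the variable is advanced to the maximum last_modified_dt processed, so the next run picks up only newer rows.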

Environment: IICS (Informatica Intelligent Cloud Services), Informatica Power Center 10.2/9.6, Informatica IDS, Informatica Web Services, Teradata, Reltio MDM, Oracle 12c, Salesforce (SFDC), TOAD, Windows XP, UNIX, SOLR & Autosys.

PayPal Nov'08 – Jun'10

Chennai, India

Informatica Developer

Project Title: PAYPAL-EDW

CLIENT PROFILE:

PayPal is the safer, easier way to pay and get paid online. The service allows anyone to pay in any way they prefer, including through credit cards, bank accounts, buyer credit or account balances, without sharing financial information.

PayPal has quickly become a global leader in online payment solutions with more than 153 million accounts worldwide. Available in 190 markets and 24 currencies around the world, PayPal enables global ecommerce by making payments possible across different locations, currencies, and languages.

PayPal has received more than 100 awards for excellence from the internet industry and the business community - most recently the 2006 Webby Award for Best Financial Services Site and the 2006 Webby People's Voice Award for Best Financial Services Site.

PROJECT DESCRIPTION:

Building an enterprise-level data warehouse by consolidating/extracting data from the existing customer-level tables, aggregating all payments data up to the enterprise level, and generating daily/weekly/monthly dashboard reports such as Gross Total Payment Volume (GTPV) and Net Total Payment Volume (NTPV) using MicroStrategy.
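
As a rough sketch of the enterprise-level aggregation behind reports such as GTPV/NTPV: the table and column names are hypothetical, and the net-volume formula is an illustrative assumption, not PayPal's actual metric definition. In the project the loads ran as Teradata BTEQ scripts feeding MicroStrategy reports.

    -- Illustrative daily roll-up of payment volume to the enterprise level.
    SELECT cal_dt,
           SUM(payment_amt)                    AS gross_tpv,
           SUM(payment_amt) - SUM(refund_amt)  AS net_tpv
      FROM enterprise_payment_fact
     GROUP BY cal_dt;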

Responsibilities:

Involved in writing and tuning Teradata BTEQ scripts to load data from the Oracle DB into the target Teradata database, applying business rules and functionality (see the BTEQ-style sketch after this list).

Created UC4 Job plans and scheduled jobs to run BTEQ scripts sequentially.

Involved in preparing LLD & TDD documents by analyzing BRD.

Involved in the designing of mappings by translating the business requirements.

Performed data extraction and transformation using transformations such as Lookup, Update Strategy, Sequence Generator, Expression, Filter, Aggregator, Rank, and Joiner.

Created Sessions and workflows to run sequentially.
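
A minimal sketch of the kind of BTEQ-driven load described in the first bullet above; the database, table, and column names are hypothetical, the staging step is assumed, and the real scripts applied project-specific business rules.

    .LOGON tdpid/etl_user,password;

    /* Load the target from staging, keeping only the latest row per transaction. */
    INSERT INTO edw.payment_txn
    SELECT stg.txn_id,
           stg.account_id,
           stg.payment_amt,
           stg.txn_dt
      FROM stage.payment_txn_stg stg
     QUALIFY ROW_NUMBER() OVER (PARTITION BY stg.txn_id ORDER BY stg.load_ts DESC) = 1;

    /* Fail the UC4 job step with a non-zero return code on any SQL error. */
    .IF ERRORCODE <> 0 THEN .QUIT 8;
    .LOGOFF;
    .QUIT 0;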

Environment: Informatica Power Center 8.6, Oracle 9i, Teradata V2R5, Teradata SQL Assistant, UC4 & UNIX

Merrill Lynch (Bank of America) Oct’07 – Sep’08

Chennai, India

ETL Developer

Project Title: EDS-FUSION (Financial Instrument data)

CLIENT PROFILE:

Merrill Lynch is one of the world's leading wealth management, capital markets and advisory companies, with offices in 40 countries and territories and total client assets of almost $2 trillion. Merrill Lynch offers a broad range of services to private clients, small businesses, and institutions and corporations, organizing its activities into two interrelated business segments - Global Markets & Investment Banking and Global Wealth Management, which is comprised of Global Private Client and Global Investment Management.

The client required development of a data warehouse on a global scale. The project involved creating a data warehouse for Equities and Mortgage data from vendors such as Bloomberg and Reuters, aimed at giving visibility into vendor details to enable better and faster decisions, and at providing a set of standard reports for basic operational reporting and strategic development.

PROJECT DESCRIPTION:

Building an enterprise-level data warehouse by extracting data from different vendor feeds such as Bloomberg and Reuters, loading it into the warehouse tables by applying business transformation logic, and making the data ready for downstream reporting consumption.

Responsibilities:

Involved in data analysis to extract useful data, find patterns and regularities in the sources, and develop conclusions.

Interacted with functional/end users to gather requirements of the core reporting system and understand its exceptional features.

Actively involved in Data Warehouse system design discussions.

Extensively used various transformations like Lookup, Update Strategy, Joiner, Aggregator, Union and Sequence generator transformations.

Extensively used Mapping Variables, Mapping Parameters, and Parameter Files for capturing delta loads.

Involved in creating Fact and Dimension table mappings to load Fact and Dimension tables.

Developed Slowly Changing Dimension (SCD) Type1 & Type2 mappings.

Worked with Reusable Mappings (Mapplets), Reusable Workflows (Worklets) and Reusable Transformations.

Worked with Session Logs, Informatica Debugger, and Performance Logs for error handling the workflows and session failures.

As per the requirement, performed the initial full-history load and analyzed the data after the load.

Used SQL Override in the Source Qualifier to customize SQL and filter data according to requirements.

Wrote pre- and post-SQL commands in session properties to manage constraints, which improved performance (see the sketch after this list).

Performed Pushdown Optimization to increase read and write throughput.

Documented all ETL-related work per the company's methodology.

Involved in end-to-end system testing as part of the SDLC process and performed performance tuning at the mapping, session, source, and target levels.

Prepared estimates, tracked every task, and strictly adhered to the estimated deadlines.
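
The pre- and post-session SQL mentioned above typically brackets a bulk load by relaxing and then restoring constraints and indexes; the table, constraint, and index names below are hypothetical.

    -- Pre-session SQL: disable the FK and drop the index before the bulk insert.
    ALTER TABLE instrument_fact DISABLE CONSTRAINT fk_instrument_vendor;
    DROP INDEX idx_instrument_fact_dt;

    -- Post-session SQL: rebuild the index and re-enable the constraint after the load.
    CREATE INDEX idx_instrument_fact_dt ON instrument_fact (price_dt);
    ALTER TABLE instrument_fact ENABLE CONSTRAINT fk_instrument_vendor;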

Environment: Informatica Power Center 7.1, Oracle 9i, Autosys, Windows XP, UNIX, SQL Developer.


