RONAK. R. SHAH
USA: +1-971-***-****
EMAIL ID: *.*.******@*****.***
OBJECTIVE: Obtain a challenging position as a Data Warehouse/BI Developer where I can use my academic skills and corporate experience to contribute to the company's success.
EDUCATION:
Master of Science in Computer Science Jan 2009
Stevens Institute of Technology, Hoboken, NJ (GPA: 3.4 / 4.0)
Bachelor of Engineering in Computer Engineering Jun 2007
Sardar Patel University, Vadodara, India (GPA: 3.63 / 4.0)
CERTIFICATIONS:
IBM Certified Cognos 8 BI Author – Report Studio
TECHNICAL SKILLS:
ETL Tools: Informatica 9.6, 9.1, 9.0, 8.6, 8.1, DAC (Datawarehouse Administration Console), IDQ
BI Tools: OBIEE 10g, Cognos 8 (Report Studio, Framework Manager, Query Studio), Transformer, Tableau
Testing Tools: HP Quality Center, Mantis, Subversion, Bug Tracker
RDBMS: Oracle 10g, SQL Server 2003, SQL*Loader, DB2, MySQL, MS Access
Scheduling: ControlM
WORK EXPERIENCE:
Accenture Services Pvt Ltd
Client: EMC Corp
Position: Datawarehouse/ETL Developer
Environment: Informatica Power Center 9.1,9.6, Informatica Data Quality, ControlM (Scheduling tool), SQL Developer, HP Quality Center
Description: EMC Corp is a Fortune 500 corporation whose data storage products and services enable businesses to build Web-based computing systems. The MAP (Master Account Profile) project implemented a Customer Hub using Master Data Management to consolidate customer data from multiple sources into a single standardized format. The project has been deployed to Production and is currently in the Production Support and Stabilization phase.
Acted as ETL Lead for managing ETL-related work in iMAP. Served as the primary POC during the Informatica upgrade from 9.1 to 9.5, ensuring that connections for ETL jobs and paths to parameter files, scripts and stored procedures were maintained.
Analyze, design, develop, test and configure the Data Warehousing applications.
Ensure daily loads run smoothly; monitor incremental jobs in ControlM and troubleshoot any Production failures
Create resolution docs for the team members for major failures encountered
Utilized Informatica Data Quality to build mapplets that performed address cleansing
Deliver work requests, change requests and break-fixes within time and budget
Verified job performance (execution time, data quality) after the upgrade; reran and retested jobs in Informatica 9.5 across the Dev, Test, Perf and Prod environments.
Modify the existing ETL code (mappings, sessions and workflows) as per the user requirements. Monitoring the workflows/mappings in Informatica for successful execution
Develop mapping logic using various transformations like Expression, Lookups (Connected and Unconnected), Joiner, Filter, Sorter, Update strategy and Sequence Generator.
Conducted 5 Informatica trainings for classes of 25 participants, covering implementation of workflows, mappings, transformations, performance tuning and other concepts
Created Test Case documents and test scripts, identified bugs and recommended fixes
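The address-cleansing work above can be illustrated with a toy sketch of the kind of standardization rules such mapplets encode. The suffix table and function name here are hypothetical; real IDQ mapplets implement these rules graphically, not in code:

```python
import re

# Hypothetical, simplified suffix-standardization table (illustrative only)
SUFFIXES = {"STREET": "ST", "ST.": "ST", "AVENUE": "AVE", "AVE.": "AVE",
            "ROAD": "RD", "RD.": "RD"}

def cleanse_address(raw: str) -> str:
    """Uppercase, collapse whitespace, and standardize street suffixes."""
    tokens = re.sub(r"\s+", " ", raw.strip().upper()).split(" ")
    return " ".join(SUFFIXES.get(t, t) for t in tokens)

print(cleanse_address("  123  Main Street "))  # -> 123 MAIN ST
```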
Acrotrend Solutions Inc Feb 2011- April 2013
Client: Glaxo Smith Kline
Position: Datawarehouse /OBIEE Developer
Environment: Informatica Power Center 8.6.1, 9.0, OBIEE, Siebel CRM, DAC (Datawarehouse Administration Console), SQL Developer, HP Quality Center, Informatica Data Quality 8.6
Description: GSK is a UK-based pharmaceutical and healthcare company. The project supports the daily activities carried out by MRs (Medical Representatives) and FLSMs (First Line Sales Managers). The aim of the INSIGHT project is to facilitate more comprehensive and ordered retrieval of information by allowing more detailed linguistic analysis of the information and creating opportunities to identify novel sales insights. The project works as follows: the daily activities of the MRs and FLSMs are captured in the Siebel application, called IJSFA in the EMAP region and SALSA in the EU region. This data is then transferred to the Data Warehouse by the ETL tool (Informatica), which in turn loads it into the OBIEE database. The data is validated on the respective Dashboards and Answers in OBIEE (Oracle Business Intelligence Enterprise Edition) as per the functionality.
Subject Matter Expert for Informatica Power Center 8.6.1 for the GSK Sales Team; acted as liaison between the Development Team and the Testing Team
Analyze, design, develop, test and configure the Data Warehousing applications. Conduct unit and integration testing of the mappings and write Unit and System Test Plans in HP Quality Center
Create Sessions and Workflows to load data from the Siebel application (IJSFA in the EMAP region, SALSA in the EU region) into the Data Warehouse, and onward into the OBIEE Data Warehouse, which is then used in OBIEE for data analysis and sales insights for GSK
Utilized Informatica IDQ to complete initial data profiling and matching/removing duplicate data. Designed and developed ETL and Data Quality mappings to load and transform data from source to ODS using IDQ 8.6
Generate reports in OBIEE once the data is loaded for validation of ETL implementation
Responsible for working with the Client Migration team to plan OBI, Informatica and DAC migrations.
Followed Release Notes to back up, restore and migrate the Informatica repository, and monitored loads in DAC
Liaise with Development team to take corrective actions in case of ETL load failure and performance issues.
Coordinating between Development team and Release management team for timely releases.
Created Interactive Dashboards in OBIEE 10g utilizing various features such as queries, Charts, pivot tables, Column Selectors, View Selectors, Prompts, and Presentation Variables etc.
Participate in weekly end user meetings to discuss data quality, performance issues and ways to improve data accuracy and new requirements
Created defects and logs for issues encountered in Quality Center during Release 9.5 and Release 9.6
Executed SQL Queries in SQL Developer for data validation at the backend.
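Backend data validation of this kind typically compares staging and warehouse tables; a minimal sketch, using Python's sqlite3 as a stand-in for Oracle/SQL Developer, with hypothetical table and column names:

```python
import sqlite3

# In-memory database standing in for the source staging and warehouse targets
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE stg_activity (id INTEGER, rep TEXT)")
cur.execute("CREATE TABLE dw_activity (id INTEGER, rep TEXT)")
cur.executemany("INSERT INTO stg_activity VALUES (?, ?)",
                [(1, "MR-A"), (2, "MR-B"), (3, "MR-C")])
cur.executemany("INSERT INTO dw_activity VALUES (?, ?)",
                [(1, "MR-A"), (2, "MR-B")])

# Rows present in the source but missing from the target
cur.execute("""
    SELECT s.id FROM stg_activity s
    LEFT JOIN dw_activity d ON d.id = s.id
    WHERE d.id IS NULL
""")
missing = [row[0] for row in cur.fetchall()]
print(missing)  # -> [3]
```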
PNC, Pittsburgh, PA, USA May 2010- Jan 2011
Position: Datawarehouse (Informatica) Developer/Systems Analyst
Environment: Informatica 8.6.1, Oracle 10g, TOAD, Mercury, Microsoft SharePoint, Lotus Notes, CA7, UltraEdit, Attachmate, Harvest, Informatica Data Quality
Description: PNC Financial Services Group, Inc. is a U.S.-based financial services corporation and the sixth-largest bank in the United States. The project, 'Loan Purchase and Loan Valuation Warehouse', mainly involved working in the Finance and Technology group on the PNC - National City integration initiative. The project dealt with migrating NCC loans to PNC loans and transforming all the transactions and data into PNC's Financial Data Store and Loan Valuation Warehouse
Analyze, design, develop, test and configure the Data Warehousing applications. Conduct unit testing, integration testing of the mapping and writing Unit and System Test Plan
Part of the Production Execution Team: migrated Informatica objects (workflows/mappings/sessions), Oracle objects (functions, stored procedures, DDL queries), UNIX objects (shell scripts) and mainframe jobs from Test to QA to Prod. Migrated Harvest packages (UNIX shell scripts) from Test to QA to Production environments. Created PTM (Production Turnover Management) and DTG (Database Technology Group) requests for developers in order to migrate objects to the Production environment, coordinating with the Informatica Administrators for successful migration
Create Sessions and Workflows to load data from the Oracle databases hosted on UNIX servers. Ran shell scripts in the Production, QA and Test environments for developers using tools like UltraEdit and Attachmate
Write complex SQL override scripts at the Source Qualifier level to avoid Informatica Joiners and Lookups and improve performance, as the data volume was heavy. Executed jobs in the CA7 scheduler (mainframe scheduler)
Served as the main point of contact between developers and administrators for communications pertaining to successful job execution. Resolved issues that caused production jobs to fail by analyzing the ETL code and the log files created by failed jobs on the Informatica server. Used Lotus Notes to communicate with developers and managers, and uploaded documents to SharePoint
Worked with Informatica Data Quality to resolve customer address-related issues.
Provide backup support of databases, ETL applications and reporting environment
Involved in analysis of data to create test data and improve data quality by embedding logic in mappings. Created high-level technical documentation, including design documents and mapping documents.
Participate in weekly end user meetings to discuss data quality, performance issues and ways to improve data accuracy and new requirements
Created the Production Execution Calendar for the Execution Team, which involved creating strategies for executing monthly jobs and migrating objects from Test to QA to Production environments.
UPMC, Pittsburgh, PA, USA Jan 2010-April 2010
Position: Informatica/Cognos Developer
Environment: Informatica 8.6.1, Cognos 8.4 Report Studio, Query Studio, Oracle 10g, TOAD, PL/SQL
Description: UPMC is an integrated global health enterprise headquartered in Pittsburgh, Pennsylvania, with one of the major health insurance services divisions. The project mainly dealt with the financial and IT administrative data of the medical center and the treatment and diagnosis data of the Cancer Center.
Source system evaluation, standardizing received data format, understanding business/data transformation rules, business structure and hierarchy
Involved with the Business Analysts and the Project Manager team for acquiring knowledge transfer and responsibilities for the project
Created workflows and tasks in Workflow Manager and linked the database through server setup and various other connections.
Did various data load simulations to stress test the mappings
Analyzed the session logs, bad files and error tables for troubleshooting mappings and sessions
Implemented slowly changing dimension and slowly growing target methodologies for modifying, updating and accessing account information
Ran PL/SQL queries in TOAD to view the data
Worked with Type 1 and Type2 Dimensions and Data warehousing Change Data Capture (CDC)
Involved in creating test cases and validating the mapping and the workflows
Worked closely with the Cognos Architect and Database Administrator.
Created financial reports for the Cancer Center for Radiation Oncology, Medical Oncology, HOA, OHA etc.
Created Z reports containing employee details and diagnoses, using filters, prompts, queries etc.
Developed HR reports: MTD, Current YTD, Fiscal YTD, Total Fiscal YTD and QTD reports
Created interactive dashboards with bar charts, list charts, prompts, filters, conditional formatting, drill through and master detail relationships
Performed data validation and verification for the Cognos reports using TOAD
Extensively used prompts, filters, cascading prompts, calculations, conditional variables, multiple queries for data extraction in reports.
Wrote and executed the test cases for various reports that were developed in Cognos.
Made changes to reports as business requirements changed, and carried out validation and verification of the reports
Wrote PL/SQL queries in Oracle. Carried out performance tuning of reports
Carried out data validation and verification using the Oracle database and testing in TOAD
Independent Health, Buffalo, NY Sept 2009 - Jan 2010
Position: Informatica/Cognos Developer
Environment: Informatica Power Center 8.6, Oracle 10g, Erwin 4.0, Cognos 8.4 Report Studio, Query Studio, Transformer, pmcmd scripting, Shell Scripting, TOAD, PL/SQL, UNIX
Description: Independent Health is a health solutions company headquartered in Buffalo, N.Y., offering HMO, PPO, point-of-service, flexible spending accounts, Medicare and Medicaid managed care plans, and coverage for self-funded employers. The project mainly involved the Member Enrollment, Member Extract, Eyemed Claims, Lab Results and Healthcare Plans data. The project I worked on, HEDIS 2010, extracted member and claims data into a format defined by VIPS (Viable Information Processing System)
Source system evaluation, standardizing received data format, understanding business/data transformation rules, business structure and hierarchy, Creation of Reports in Cognos, data transformation through mapping development, validation and testing of mappings.
Involved with the Solutions Architect and the QA team for validation and verification of the development
Developed Technical Design Documents
Did various data load simulations to stress test the mappings
Involved in monitoring Informatica jobs/processes using Workflow Monitor
Tuned performance of Informatica sessions for large data files by increasing block size, data cache size, sequence buffer length, and target based commit interval.
Analyzed the session logs, bad files and error tables for troubleshooting mappings and sessions
Implemented slowly changing dimension and slowly growing target methodologies for modifying, updating and accessing account information
Worked with Type 1 and Type2 Dimensions and Data warehousing Change Data Capture (CDC)
Involved in creating test cases and validating the mapping and the workflows
Working with the Business Analysts and the testing team for any changes in the requirements and validating the mappings and workflows
Wrote shell scripts to backup the log files in QA and Production
Created mapplets for repeatedly used logic, such as date formatting and data type conversion.
Worked with various modules of insurance during the project like Member Enrollment, Member Extract, Eyemed Claims, Lab Results, Healthcare Plans
Created Reports in Cognos 8.4 Report Studio dealing with Member Enrollment, Member Extract data and Pharmacy Data Mart
Used Toad for verification of data in reports against Oracle as datasource and created test cases for the same
Interacted with the QA team, Analysts in the status update meetings, agile meetings for report specifications and verification of the user requirements
Followed naming conventions according to Company’s standards and practices
Created dashboards for the Pharmacy data mart project using master detail, filters, drill through and various other functionalities in Cognos Report Studio 8.4
Created Drill Through Reports & Master Detail Reports for HEDIS 2010
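The Type 2 slowly-changing-dimension handling mentioned above follows a standard expire-and-insert pattern; a minimal sketch in plain Python, with illustrative field names (not the project's actual schema):

```python
from datetime import date

# Toy Type 2 SCD: when an attribute changes, expire the current row and
# insert a new current row; unchanged records are left alone.
def apply_scd2(dim_rows, key, new_attrs, today):
    for row in dim_rows:
        if row["key"] == key and row["current"]:
            if row["attrs"] == new_attrs:
                return dim_rows          # no change detected
            row["current"] = False       # expire the old version
            row["end_date"] = today
    dim_rows.append({"key": key, "attrs": new_attrs,
                     "start_date": today, "end_date": None, "current": True})
    return dim_rows

dim = [{"key": "M001", "attrs": {"plan": "HMO"},
        "start_date": date(2009, 1, 1), "end_date": None, "current": True}]
apply_scd2(dim, "M001", {"plan": "PPO"}, date(2009, 10, 1))
print([(r["attrs"]["plan"], r["current"]) for r in dim])
# -> [('HMO', False), ('PPO', True)]
```

A Type 1 dimension would instead overwrite the attribute in place, losing history; Type 2 keeps every version with its effective date range.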
Hoffman-La Roche Inc. (Roche Pharmaceuticals) - Computer Science Intern June 08 - Aug 08
Reviewed custom database program written within the Bioanalytical section in Research and Development department using Microsoft Visual FoxPro
Monitored the database system in LCMS scheduling of the department
Worked in TOAD and Oracle, accessing the database for the Analytical section in the Research department
Ran PL/SQL queries in TOAD to view the data. Created stored procedures, functions, triggers in TOAD
Assisted in developing an automated template for report writing using Visual Basic 6.3 macros in the Bioanalytical section
Performed testing on macro documents using LoadRunner. Created test cases, test scripts and test plans, and analyzed the data. Involved in JAD sessions and RUP/Agile methodologies during meetings, and developed strategies for creating automated templates using macros
Gave a technical presentation on the working of the templates to managers, superiors and developers