
PANKAJ DUREJA

e-mail : acyv9y@r.postjobfree.com

Present Contact No: +1-512-***-****

Address for correspondence:

**** ******** *******,

Apt 2208

Houston, TX – 77077.

Profile:

Over 10 years of total IT experience and technical proficiency in building Data Warehouses, Data Integration solutions, Operational Data Stores, ETL processes, and big data solutions for clients in the Healthcare, Industrial Equipment, Travel, Telecommunications, and Retail domains.

10+ years of strong experience in working with Data Warehouse implementations using Informatica PowerCenter 9.x/8.x/7.x, Oracle, DB2, SQL Server, Netezza and Teradata on UNIX and Windows platforms.

Extensive experience in Extraction, Transformation, and Loading (ETL) data from various data sources into Data Warehouse using Informatica PowerCenter tools (Repository Manager, Designer, Workflow Manager, and Workflow Monitor).

Expertise in implementing complex business rules by creating robust Mappings, Mapplets, Sessions and Workflows.

Experience in integrating various data sources such as Netezza, Teradata, Oracle, DB2, SQL Server, flat files, and mainframes into the Data Warehouse, as well as experience in Data Analysis.

Experienced in designing and implementing star and snowflake dimensional models.

Experienced in implementing conformed dimensions and slowly changing dimensions of all types.

Proficient in Informatica administration work, including installing and configuring Informatica PowerCenter and repository servers on Windows and UNIX platforms, backup and recovery, and folder and user account maintenance.

Experienced in Unix System & Shell scripting.

Experience working in a production support environment.

Experienced in working with Teradata utilities such as BTEQ, FastExport, FastLoad, and MultiLoad to export and load data to and from different source systems (a brief BTEQ sketch appears at the end of this profile section).

Hands-on experience using query tools such as TOAD, SQL Developer, and Teradata SQL Assistant.

Expertise in writing large/complex queries using SQL.

Extensively used SQL and PL/SQL to write Stored Procedures, Functions, Packages, Cursors, Triggers, Views and Indexes.

Experience in preparing documentation such as high-level design, system requirement, and technical specification documents.

Excellent analytical and problem-solving skills, with a strong technical background and interpersonal skills.

Working knowledge of Hadoop and its stack, including HDFS, Apache Pig, Hive, Sqoop, HBase, and MapReduce concepts.

Extensive knowledge of Hadoop ecosystem technologies such as Apache Hive, Apache Pig, Apache Sqoop, and Apache HBase.

Experience in importing and exporting data between HDFS and Relational Database Management Systems using Sqoop.

Good knowledge of Hadoop architecture and its underlying framework.

Experienced in analyzing data using HiveQL.
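
For illustration, below is a minimal sketch of exporting data with BTEQ from a shell script, of the kind the Teradata utilities above are used for. The server name, credentials, database, and table names are hypothetical placeholders, not details from any project listed in this resume.

#!/bin/sh
# Minimal BTEQ export sketch; all names below are hypothetical placeholders.
TD_SERVER=tdprod
TD_USER=etl_user
TD_PASS=********          # supplied securely in practice (e.g. a logon file)

bteq <<EOF
.LOGON ${TD_SERVER}/${TD_USER},${TD_PASS}
.EXPORT REPORT FILE=/tmp/customer_extract.txt
SELECT customer_id, customer_name
FROM   edw_db.customer_dim
WHERE  load_date = CURRENT_DATE;
.EXPORT RESET
.LOGOFF
.QUIT
EOF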

Technical Skill Set:

ETL Tools : Informatica 9.1/8.6.1/7.x, IDQ.

Reporting Tools : SAP BO Crystal and Tableau.

Database Tools : TOAD, SQL Developer, Aqua Data Studio.

Databases : Teradata, Netezza, Oracle, Microsoft SQL Server, and Amazon Redshift.

Data Modeling : Dimensional modeling, star schema, physical and logical data models.

Big Data : Hadoop concepts, HDFS, Hive, Sqoop, Pig, HBase, Flume, AWS.

Languages Known : Oracle PL/SQL.

Scripting Languages : Unix shell.

Scheduling Tools : Control-M and UC4.

Others : Data Warehousing concepts.

Operating System : UNIX and Windows.

Education:

Master's in Computer Science (Software), completed through correspondence, from Kurukshetra University, Kurukshetra, Haryana.

Bachelor of Computer Applications from Hindu College Affiliated to Maharishi Dayanand University (Rohtak), Sonepat, Haryana.

Professional Experience:

Organization - Period - Designation

Infosys Technology Ltd. - December 2014 to present - BI Lead Consultant

Cognizant Technology Solutions - July 2014 to December 2014 - Senior Associate

Accenture Technology Solutions - May 2006 to July 2014 - Systems Analyst

Currently working as a BI Lead Consultant with Infosys Technology, where my role has been to provide development and application support using Informatica ETL and big data technologies. I like to work in a challenging environment that harnesses my potential and focuses on continuous improvement.

Professional Achievements:

Accenture Excellence Award – Client Champion for the first Quarter, 2012.

Achiever of the Month March 2010.

ASE Achievers Award for exceptional performance during the first year (2006-2007) in ACCENTURE.

Trainings:

Amazon AWS Training (November 2016).

Big Data Training (November, 2013).

SAP Business Planning and Consolidation training (June, 2011).

Completed the Accenture Greenfield training program (Informatica, Oracle) in July 2006.

Certifications:

ITIL V2 Foundation.

Oracle Certified Associate – Developer.

Brief Description of the Projects:

Project 9: SYSCO, Foundational Data Project, USA, Jan 15 to present

Sysco is the global leader in selling, marketing, and distributing food products to restaurants, healthcare and education facilities, lodging establishments, and other customers who prepare meals away from home. The Sysco Foundational Data project is being implemented in AWS to bring data onto the Redshift stack from various SUS and SAP operating companies. Responsibilities Include:

Involved in requirement analysis, design, development and implementation of ingesting data into Amazon Redshift from SUS & SAP broadline operating companies.

Developed a complete understanding of the legacy system to build clear and error-free mappings and extraction routines.

Implemented Informatica ETL tool to load data into Amazon Redshift and S3.

Upgraded Informatica PowerCenter from 9.1.0 HF4 to 9.6.1 HF4 on premise.

Installed the Informatica PWX for Redshift.

Supported the existing ETL framework (Informatica) to gain an understanding of the different data marts in order to bring data into Amazon.

Involved in performance tuning of SQL statements using explain plan.

Developed shell scripts to load data into Amazon Redshift using COPY commands (a minimal sketch appears at the end of this project's responsibilities).

Developed Control-M jobs to automate the loading process.

Prepared the run workbook to implement the data loading process on Amazon Redshift.

Code change migration from Dev to QA and QA to Production.

Worked closely with the technical team to ensure technical solutions are met and any gaps are identified in the early stages of development.

Involved in changing the Custom SQLs for Tableau reports.
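
For illustration, here is a minimal sketch of the kind of COPY-based Redshift load described above, assuming psql connectivity to the cluster. The cluster endpoint, database, user, schema, table, S3 path, and IAM role are hypothetical placeholders, not actual project details.

#!/bin/sh
# Minimal sketch: load an S3 extract into Amazon Redshift with a COPY command.
# Endpoint, database, user, table, bucket, and IAM role are hypothetical placeholders.
export PGHOST=example-cluster.abc123.us-east-1.redshift.amazonaws.com
export PGPORT=5439
export PGDATABASE=edw
export PGUSER=etl_user        # password supplied via ~/.pgpass in practice

psql <<'EOF'
COPY stage.sus_customer
FROM 's3://example-bucket/fdp/sus_customer/'
IAM_ROLE 'arn:aws:iam::123456789012:role/ExampleRedshiftCopyRole'
DELIMITER '|'
GZIP
TIMEFORMAT 'auto';
EOF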

Project 8: Next Generation Data Warehouse Project, Charter Communications, USA, Aug 14 to Dec 14

The Charter Next Generation Data Warehouse is being developed to replace the existing Enterprise Data Warehouse system using big data concepts. The idea behind this system is to bring data from relational and non-relational systems, such as set-top box logs, into Hadoop so that it can be used for analytic purposes. Responsibilities Include:

Interacting with clients to understand the various source system tables to be staged.

Staging the tables required for reporting before loading the core database.

Development of Informatica mappings for extracting source data into staging layer.

Development of Teradata SQL procedure to load data into persistent tables.

Development of Sqoop commands to push data onto the core database (a minimal Sqoop sketch appears at the end of this project's responsibilities).

Involved in loading data from the set-top box log system to HDFS using Flume.

Automated the task of bringing the data and logs into HDFS.

Developed Hive SQLs based on customer requirements.

Gained very good business knowledge of the different categories of products and designs within the domain.
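
For illustration, a minimal sketch of a Sqoop import of the kind mentioned above, followed by a quick HiveQL check. The JDBC URL, password file, and table names are hypothetical placeholders rather than actual project details.

#!/bin/sh
# Minimal sketch: import a relational table into Hive via Sqoop.
# JDBC URL, password file, and table names are hypothetical placeholders.
sqoop import \
  --connect jdbc:oracle:thin:@//dbhost:1521/EDW \
  --username etl_user \
  --password-file /user/etl/.db_password \
  --table STB_EVENTS \
  --target-dir /data/raw/stb_events \
  --hive-import \
  --hive-table core.stb_events \
  --num-mappers 4

# Quick HiveQL check on the imported data
hive -e "SELECT COUNT(*) FROM core.stb_events;"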

Project 7: Integrated Eligibility, State of Ohio, USA, June 13 – July 14

The Integrated Eligibility team is responsible for converting the State of Ohio legacy data into Accenture Benefits Management System. The new system (ABMS) will streamline processes and enhance technology for eligibility determination and the delivery of benefits to citizens. Beneficiaries will have new self-service online access options. It also will facilitate data-sharing among state agencies and offices, providing the state and its county partners with new capabilities to enroll people and manage human service operations. The Human Services suite of products from Accenture Software will be used, enabling the State of Ohio to meet federal guidelines (Obama Care) and to extend the system over time to support additional programs beyond the initial inclusion of Medicaid, the Supplemental Nutrition Assistance Program and Temporary Assistance for Needy Families. Responsibilities Include:

Creation of source to target data mapping design specifications.

Developing Informatica ETLs for transforming data from legacy to target tables/fields.

Developing data validation and data reconciliation SQL queries to confirm appropriate transformation of data and identify issues in source data for data cleanup (a minimal reconciliation sketch appears at the end of this project's responsibilities).

Creation of technical Design Specifications.

Creation of code references, error and reject handling, and parameterized tables.

Participating in code reviews and peer reviews.

Escalating issues to management as identified.

Fixing defects identified in testing.

Assist Functional Lead with design, build and test activities for a Project Phase or Program.

Develop content for Data Conversion Deliverables and Work
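
For illustration, a minimal sketch of the kind of reconciliation query described above, run through SQL*Plus. The connect string, schemas, and table and column names are hypothetical placeholders, not actual conversion objects.

#!/bin/sh
# Minimal sketch: row-count and key-level reconciliation between legacy and target tables.
# Connect string, schemas, and table/column names are hypothetical placeholders.
sqlplus -s conv_user/********@CONVDB <<'EOF'
SET PAGESIZE 100 LINESIZE 200

-- Row-count comparison
SELECT 'LEGACY' AS src, COUNT(*) AS row_cnt FROM legacy.case_member
UNION ALL
SELECT 'TARGET', COUNT(*) FROM target.case_member;

-- Members present in legacy but missing from target
SELECT member_id FROM legacy.case_member
MINUS
SELECT member_id FROM target.case_member;

EXIT;
EOF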

Project 6: EPM, Intertek, USA, May 11 – May 13

This system is known as EDW (Enterprise Data Warehouse, or Intertek Data Warehouse). It is designed and built for Intertek: it extracts data from various sources (Phoenix, PeopleSoft, Report Manager, Cognos, and ECS) and loads it into a single integrated Data Warehouse. Once the data is loaded, the cubes are built; PeopleSoft Financials reporting is done through the SAP Business Planning and Consolidation tool, and the various transaction reports are done through the SAP Business Objects Crystal tool, which takes its data from the EDW so that contention on the source systems is reduced. ETL steps are performed by Informatica. Responsibilities Include:

Analyze the business requirements and prepare functional specifications for the new source table.

Assist the team in development of ETL Code for new source table.

Assist the team in preparation of Unit test plans and unit testing of the ETL code for new tables.

Peer review of code and document review of necessary deliverables.

Monitoring the daily batches which load the data into the EPM data warehouse.

Providing support for the EPM process for resolving the day to day issues.

Assist and monitor the operations team in their daily loading activities.

Production Implementation of the newly developed ETL Code.

Tracking the daily loading status and updating the business about the same.

Solving incidents within the SLA (Service Level Agreement).

Executing and monitoring Data Manager packages which build the cubes for the SAP BPC applications (Finance, Payables, Sales).

Creation of new users for access to SAP BPC System.

Updating dimension members on a day-to-day basis.

Fixing SAP BPC reports when users face issues.

Project 5: TORPEDO, Thomas Cook, UK, Apr 09 – Apr 11

The TORPEDO application (Tour Operator Revenue Pricing & Expected Demand Optimizer) aims to implement a Tour Operator Yield Management System in the UK Holidays Division. A new component called DTE (Data Transformation Engine) was developed which loads, transforms, and validates the data. After performing the ETL steps, it loads the data into a third-party system called TRO, provided by JDA as the revenue management system. TRO enhances holiday booking revenue and margin through automation of price and inventory controls. The sources of the data are flat files which come from various third parties (TOPICS, TOS, Navitaire, etc.). SQL*Loader is used to load the data from the files into the database system, and the ETL steps are then performed by Informatica. A minimal SQL*Loader sketch appears after the responsibilities below. Responsibilities Include:

Analyze the business requirements and prepare technical specifications for the new tables (ETL).

Created Informatica mappings and reusable transformations.

Created sessions and scheduled them for loading data.

Created various transformations such as Lookup, Joiner, Aggregator, Filter, Expression, Router, and Update Strategy.

Involved in preparation of test cases and testing of mapping.

Prepared ETL mapping documents.

Prepared the Data dictionary and Source to Target Document.

Peer review of code and document review of necessary deliverables.
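
For illustration, a minimal SQL*Loader sketch of the kind of flat-file load described for this project. The control-file contents, file paths, connect string, and table and column names are hypothetical placeholders.

#!/bin/sh
# Minimal sketch: load a delimited flat file with SQL*Loader.
# Paths, connect string, and table/column names are hypothetical placeholders.
cat > /tmp/bookings.ctl <<'EOF'
LOAD DATA
INFILE '/data/inbound/bookings.dat'
APPEND INTO TABLE stg_bookings
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(booking_ref, departure_date DATE "YYYYMMDD", passengers, revenue)
EOF

sqlldr userid=dte_user/********@DTEDB \
       control=/tmp/bookings.ctl \
       log=/tmp/bookings.log \
       bad=/tmp/bookings.bad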

Project 4: SRS Crew CSV Extract, Thomas Cook, UK, Jan 09 – Mar 09

This application is used to extract a report of cabin crew activities as and when desired. The front end is designed in Oracle APEX (Application Express) and the back-end procedure is written as a PL/SQL package using Oracle technology. The report depends on the parameters given in the front end and is mailed to the user in CSV format automatically. A minimal CSV-extract sketch appears after the responsibilities below. Responsibilities Include:

Analyze the business requirements and prepare the functional and technical specifications for generating the new report.

Design and prepare PL/SQL Code for the new report.

Preparation of Unit Test plans and unit testing of the PL/SQL code.

Production Implementation of the newly developed code.

Solving incidents within the SLA (Service Level Agreement).

Creation of new users for the Oracle Apex workspace.
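
For illustration, a minimal sketch of producing a CSV extract from an Oracle query via SQL*Plus, in the spirit of the crew report described above. The connect string and table and column names are hypothetical placeholders.

#!/bin/sh
# Minimal sketch: write an Oracle query result to CSV via SQL*Plus.
# Connect string and table/column names are hypothetical placeholders.
sqlplus -s srs_user/********@SRSDB <<'EOF' > /tmp/crew_report.csv
SET PAGESIZE 0 FEEDBACK OFF TRIMOUT ON LINESIZE 500
SELECT crew_id || ',' || crew_name || ',' ||
       TO_CHAR(duty_date, 'YYYY-MM-DD') || ',' || activity_code
FROM   crew_activity
WHERE  duty_date >= TRUNC(SYSDATE) - 7;
EXIT;
EOF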

Project 3: Distribution MI, Thomas Cook, UK, Oct 08 – Mar 09

This application is used to load the bookings data received on a daily basis. Each transaction is classified as a new booking, an amendment, or a cancelled booking. The data comes from the terminals in the MyTravel shops into the mainframe system through COBOL scripts. SQL feeds load the data into the staging area, which acts as a source for the Distribution MI. The data is loaded in two phases: one job runs at 3 PM to load the staging areas, and another runs at 11 PM, comparing the existing data in the staging area and loading it into the LIVE database. The LIVE database is then used to build the cubes with the help of the Cognos reporting tool for management information. Responsibilities Include:

Executing the daily job which loads the data into DIS & MI database.

Providing 24x7 on-call support for the Distribution MI process to resolve day-to-day support issues.

Solving incidents within the SLA (Service Level Agreement).

Maintenance of the existing PL/SQL Code for doing the validations and processing of the data.

Preparation of Unit Test plans and unit testing of the PL/SQL code.

Project 2: ASL (Account Setup & Load), United Health Group – Optum, USA, Jul 07 – Sep 08

This application is designed to overcome the problems of the existing eligibility systems: to reduce manual work, provide accurate reports, and keep track of fallouts at each stage of the process. The major purpose of this project is to load clean data into the systems. The front end is designed in Java on the Spring/Hibernate framework. The back end is designed on an Oracle database using SQL and PL/SQL. Responsibilities include:

Data modeler for the ASL application.

Performed analysis of the requirements in order to implement the client request for ASL application.

PL/SQL Developer in the ASL application.

Developed packages, procedures, and functions for ASL back-end data processing (a minimal PL/SQL sketch appears at the end of this project's responsibilities).

Designed the technical documents for the ASL application.

Created unit test case documents for testing the back-end packages, procedures, and functions.

Worked on setting up data for the ASL application.
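
For illustration, a minimal sketch of a back-end PL/SQL procedure of the general kind described above, compiled through SQL*Plus. The schema, table, and column names are hypothetical placeholders and not actual ASL objects.

#!/bin/sh
# Minimal sketch: compile a simple PL/SQL procedure via SQL*Plus.
# Schema, table, and column names are hypothetical placeholders.
sqlplus -s asl_user/********@ASLDB <<'EOF'
CREATE OR REPLACE PROCEDURE load_member_stage (p_batch_id IN NUMBER) AS
BEGIN
  -- Move validated rows from the raw landing table into the clean staging table
  INSERT INTO member_stage (member_id, first_name, last_name, eff_date)
  SELECT member_id, first_name, last_name, eff_date
  FROM   member_raw
  WHERE  batch_id = p_batch_id
  AND    member_id IS NOT NULL;
  COMMIT;
END load_member_stage;
/
EXIT;
EOF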

Project 1: Eligibility Operations, United Health Group – Optum, USA, Jul 06 – Oct 07

This application was designed to load member and subscriber information, as received from customers, into the Eligibility system, which determined the eligibility of members for the various health care services provided by the client. The other front-end applications would access this system to determine the authenticity of members and the type of services they were entitled to. The system also kept track of all the services and provider information for all existing members. Responsibilities include:

Executing the ETL mappings/sessions/workflows using Informatica (a minimal pmcmd sketch appears at the end of this project's responsibilities).

Executing the Perl scripts for merging the files received from the business.

Executing the batch jobs for loading the data into Facets database.

Solving helpdesk tickets.

Providing the weekend support.

Tracking the daily loading status and updating the business about the same.

Generate ad hoc reports and data extracts as requested by the business.
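
For illustration, a minimal sketch of starting and checking a PowerCenter workflow from the command line with pmcmd, as referenced above. The service, domain, credentials, folder, and workflow names are hypothetical placeholders.

#!/bin/sh
# Minimal sketch: run a PowerCenter workflow with pmcmd and check its status.
# Service, domain, credentials, folder, and workflow names are hypothetical placeholders.
pmcmd startworkflow \
  -sv INT_SVC_PROD \
  -d Domain_EDW \
  -u etl_user \
  -p '********' \
  -f ELIGIBILITY \
  -wait wf_load_members

pmcmd getworkflowdetails \
  -sv INT_SVC_PROD -d Domain_EDW -u etl_user -p '********' \
  -f ELIGIBILITY wf_load_members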

Personal details:

Date of Birth : 01-11-1984.

Father’s Name : Late Sh. Manohar Lal Dureja.

Sex : Male.

Marital Status : Married.

Nationality : Indian.

Languages Known : English and Hindi.

Passport No : N1751462

Permanent Address : 16/473 Gali No 1, New Ashok Nagar, Sonepat, Haryana – 131001, India.

Visa : Valid US H1-B Visa till 06-Dec-17. Valid US Business Visa till 20-June-21.

Hobbies:

Playing and watching Cricket.

Listening to music.

Declaration:

I hereby affirm that the information furnished above is true to the best of my knowledge.

Place:

Date:

SIGNATURE

(Pankaj Dureja)


