
Data Manager

Location:
Hyderabad, Telangana, India
Posted:
January 12, 2017


Resume:

MALLIK *******.**@*****.***

Professional Experience:

Around 15 years of experience in software development and design in the areas of commercial and business applications as an Informatica Architect, Technical Lead, and Application Developer.

Successfully implemented five very large-scale end-to-end data warehouses for Fortune 500 companies.

14 years of solid experience using Informatica PowerCenter, IDQ, B2B, PowerExchange, and Analyst.

Experience in extracting and integrating various data sources such as Oracle, DB2, flat files, XML, JMS, and COBOL files.

Hands-on experience in all aspects of data quality: profiling, cleansing/standardization, normalization, de-duplication, merging/consolidation, fuzzy matching, data integration, and reconciliation.

Strong knowledge of writing complex SQL queries and SQL tuning.

Worked on data warehousing projects from inception to implementation and delivered high-quality solutions in a timely manner, meeting and exceeding business requirements. This involved working with top management, executives, business sponsors, and the IT department.

Responsible for defining the information architecture and technology infrastructure, along with the design of multidimensional data models and the business intelligence delivery solutions.

Responsible for managing data analysis, data sources, and data mapping, along with defining the metadata repository.

Provided leadership, including client mentoring.

Knowledge of OLAP tools such as Business Objects and Cognos.

Worked on Informatica administration. Set up a grid in the production environment to implement High Availability, resolve production performance issues, and meet SLAs.

Experience in writing UNIX shell scripts.

Experience with Control-M, Tivoli, AutoSys, and Espresso scheduling, and VSS (version control).

Proficient in EDW data integration from MDM systems.

Flexibility in adapting to any new technology.
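The data quality work listed above (profiling, standardization, de-duplication, fuzzy matching) was performed with Informatica Data Quality; purely as an illustration of the underlying idea, here is a minimal Python sketch of standardizing and fuzzy-matching customer names to consolidate duplicates. The records, names, and the 0.8 threshold are invented for the example.

```python
from difflib import SequenceMatcher

# Hypothetical customer records; names vary in spelling and case.
records = [
    {"id": 1, "name": "John A. Smith", "city": "Madison"},
    {"id": 2, "name": "JOHN SMITH",    "city": "Madison"},
    {"id": 3, "name": "Jane Doe",      "city": "Irving"},
]

def standardize(name: str) -> str:
    """Basic cleansing: uppercase, strip punctuation and extra spaces."""
    cleaned = "".join(ch for ch in name.upper() if ch.isalpha() or ch.isspace())
    return " ".join(cleaned.split())

def is_match(a: str, b: str, threshold: float = 0.8) -> bool:
    """Fuzzy match on the standardized names."""
    return SequenceMatcher(None, standardize(a), standardize(b)).ratio() >= threshold

# Consolidate: keep the first record of each fuzzy-matched group (the "survivor").
survivors = []
for rec in records:
    if not any(is_match(rec["name"], s["name"]) for s in survivors):
        survivors.append(rec)

print([s["id"] for s in survivors])  # → [1, 3]
```

In IDQ this pairing and survivorship is configured with Match and Consolidation transformations rather than hand-written code; the sketch only shows the shape of the logic.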

Soft Skills:

Excellent verbal and written communication skills.

Strong ability to identify problems, develop new ideas, and find the best solutions to suit the prescribed environment.

Worked effectively both independently and as part of a team.

Logical and analytical, with good interpersonal skills, a commitment to quality work, and an excellent team-player attitude.


Technologies:

ETL

Informatica PowerCenter 9.x, Data Quality 9.x, DT Studio, PowerExchange

OLAP Tool

Business Objects, Cognos

Data Modeling

ERWIN

RDBMS

Oracle 11g, Teradata, DB2, Greenplum

Programming Language

SQL, PL/SQL, Java, UNIX shell scripts

Operating System

Sun Solaris, HP-UX, Linux, Windows NT/2003/2008

Testing

Test Director, Quality Center, JIRA

Scheduling Tools

Control-M, Espresso, AutoSys

Other Tools

TOAD, Remedy, Clarity, Visio, Appfluent

WORKING EXPERIENCE:

AMERICAN FAMILY INSURANCE, Madison, WI

Project Title: CDH (Customer Data Hub)

Duration: Mar 2015 - Present

Designation: ETL Lead

Team Size: 10

Environment: Informatica PowerCenter 9.6, Data Quality 9.x, MDM 10.x, PowerExchange, UNIX, Oracle 11g, Greenplum, Harvest, AutoSys, JIRA, Java, Protegrity

The objective of the Customer Data Hub project is to create a single source of customer data. This will be accomplished by building a CDH that integrates cleansed and standardized customer data from sources across the enterprise. The CDH will also serve as a single facility for viewing and maintaining “gold” customer records, creating a single version of the truth for customer data. Creating this gold record enables significant business functionality and will reduce future integration expenses associated with acquiring and deploying new systems, thereby speeding time to market.

Responsibilities:

Responsible for analyzing data from various systems and developing an ETL solution to integrate it into CDH.

Participated in the ETL design phase and created ETL design specs for mappings.

Developed real-time PowerCenter mappings and processed message queues from the JMS application using DT Studio.

Created various PowerCenter mappings to handle complex customer data and load it into landing tables.

Carried out extensive profiling of new and existing data sources to detect anomalies and errors before developing the corresponding solutions to sanitize and remediate the data sources.

Developed and implemented data standardization, validation, enrichment, de-duplication, and consolidation rules to deliver high-quality master data.

Implemented various data-masking techniques to secure sensitive production data.

Experience using the Labeler, Association, Consolidation, and Match and Merge data quality transformations.

Participated in code reviews and created Harvest packages to deploy code into production.

Coordinated the development, testing, and business teams during integration/UAT.
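The data masking mentioned above was done with enterprise tooling (Informatica/Protegrity); as a rough sketch of one common technique, deterministic masking replaces each sensitive value with a salted hash token, hiding the real data while preserving referential integrity (the same input always yields the same token). The field names, salt, and customer record below are invented for illustration.

```python
import hashlib

def mask_value(value: str, salt: str = "demo-salt") -> str:
    """Deterministically mask a sensitive value with a salted SHA-256 token."""
    digest = hashlib.sha256((salt + value).encode()).hexdigest()
    return digest[:12]  # short token for readability

# Hypothetical customer record; only sensitive fields are masked.
customer = {"name": "Jane Doe", "ssn": "123-45-6789", "state": "WI"}
masked = {
    "name": mask_value(customer["name"]),
    "ssn": mask_value(customer["ssn"]),
    "state": customer["state"],  # non-sensitive field passes through unchanged
}
print(masked["state"])  # → WI
```

Because the token is deterministic, joins across masked tables still line up, which is why this style of masking is popular for non-production test environments.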

HEALTHCARE MANAGEMENT SYSTEMS (HMS), Irving, TX

Project Title: WCB Data warehouse

Duration: Nov 2013 - Mar 2015

Designation: ETL Lead

Team Size: 25

Environment: Informatica 9.5.1, UNIX, OBIEE, DB2, Harvest, Tivoli

HMS powers the healthcare system with integrity by providing cost containment solutions for federal and state governments, commercial insurers, and other organizations. The Irving, Texas-based company is a wholly owned subsidiary of HMS Holdings Corp. Focused exclusively on the healthcare industry since 1974, HMS helps ensure healthcare claims are paid correctly and by the responsible party, and that those enrolled to receive benefits qualify.

Built the WCB data warehouse for the WCB business. It includes regular client claims data refreshes, preparing MICIN files for submission to the WCB board, cleansing and loading MICOUT data, and processing WCB case information and carrier and financial information such as checks and transactions, including the design for the Maestro front end.

Responsibilities:

Analyze user requirements and design data integration solutions.

Provide direction to the team on the solution approach.

Prepare high-level and low-level design documents.

Peer-review deliverables and ensure on-time delivery with quality.

Communicate delivery status to all stakeholders.

Involved in ETL development and providing technical assistance to the team.

Conduct weekly project stakeholder meetings to resolve issues and manage risks.

Work with the QA team on defining the test approach.

Handle L1, L2, and L3 support and ensure SLAs are met.

Performance-tune existing long-running sessions/jobs.

Perform impact analysis and handle ETL change requests based on the business rules.

Coordinate the onsite and offshore teams.

Involved in researching and resolving production issues.

Perform multi-tasking activities such as design, development, production support, administration, and handling production tickets.

BIOGEN IDEC, Boston, MA

Project Title: GLOBAL COMMERCIAL DATA WAREHOUSE

Duration: Aug 2008 - Oct 2013

Designation: ETL Lead

Team Size: 10

Environment: UNIX, Informatica 8.6.1, Business Objects, Oracle 10g, Siebel, VSM, Appfluent

Biogen Idec is a leading global healthcare company. The objective of this project is to generate reports on commercial data to help the commercial team make strategic decisions.

Data is extracted from the Siebel system and loaded into the GCDW (Global Commercial Data Warehouse). Many data marts have been built on the GCDW data.

Responsibilities:

Involved in gathering requirements and created design documents and mapping documents.

Designed the ETL mappings and implemented High Availability features in the data warehouse.

Performed multi-tasking activities such as design, development, production support, administration, and handling production tickets.

Assigned tasks to the team, monitored development status, and sent status reports to the manager.

Mentored the team and provided technical assistance when required.

Involved in Informatica administration tasks.

Identified long-running SQL statements and fine-tuned them with the help of the Appfluent tool.

Handled L1, L2, and L3 support and ensured SLAs were met.

Involved in development activities for complex tasks.

Motivated the team through tight schedules.

Suggested many improvements to clients and implemented those suggestions in the GCDW application.

Improved batch load timings by 50%.

Played a major role in changing the project model from staff augmentation to managed services.

Performed code reviews and testing of the systems as per the standards.
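The tuning work above (identifying long-running SQL and improving batch load timings) typically starts by ranking sessions by runtime so effort goes where it pays off most. A trivial Python sketch of that first step, with invented session names and runtimes:

```python
# Hypothetical session-runtime summary, e.g. pulled from a scheduler or
# repository report; names and times are invented for illustration.
session_runs = [
    ("s_load_customers", 420),   # runtime in seconds
    ("s_load_claims",    5400),
    ("s_load_products",  130),
    ("s_load_policies",  3600),
]

# Rank sessions by runtime, longest first; tuning starts at the top.
ranked = sorted(session_runs, key=lambda r: r[1], reverse=True)
worst_name, worst_secs = ranked[0]
print(worst_name, worst_secs // 60, "minutes")  # → s_load_claims 90 minutes
```

In practice the runtimes came from tools such as Appfluent and the Informatica repository rather than a hand-built list; the sketch only shows the prioritization step.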

Oncor ED, Dallas, TX

Project Title: Oncor ED CDW

Duration: Apr 2007 - Jul 2008

Designation: ETL Lead

Team Size: 10

Environment: UNIX, Informatica 8.1, Cognos 8.0, Oracle 10g, Espresso, ClearQuest

Oncor Electric Delivery, a regulated subsidiary of TXU Corp., delivers electricity using industry-leading transmission and distribution capabilities, regardless of a consumer's retail electric provider. Oncor Electric Delivery provides power to over 3 million electric delivery points. The purpose of the enterprise data warehouse initiative is to build out the technologies, tools, and processes needed to support a scalable, flexible, and robust information platform to satisfy Oncor ED's information needs.

The primary goals of the project are to lay the foundation for a business intelligence platform for Electric Delivery encompassing tools, technology, and architecture; to replace the current Polaris Data Warehouse by building a customer-centric data warehouse (CDW) which can be expanded to include other areas of business such as field operations; and to improve operational efficiency through the robust and timely operational and analytical capabilities provided by the new business intelligence platform.

Responsibilities:

Analyzing the requirements and creating Technical design specifications and mapping documents.

Developing, Testing and Validating ETL mappings, UNIX scripts.

Scheduling jobs through Espresso

Coordinate with the testing team and resolve defects raised during SIT.

As part of Informatica administration, install and configure the Informatica server in various environments, migrate mappings between environments, and back up the repository.

Played the role of Process Quality Lead. Coordinated with the QA team and provided metrics to the QA team as well as the Project Manager.

Participated in QA and CMMI audits.

Code reviews & testing of the systems as per the standards.

Dell Computers, Austin, TX

Project Title: CUSTWATCH & SVCWATCH

Duration: Nov 2005 - Apr 2007

Designation: Team Lead

Team Size: 5

Environment: UNIX, Ab-Initio, Teradata V2R5, Oracle 9i, Control-M, EME, Test Director 8.0

The goal is to retire the Oracle mart GCPP while offering the same service (data and reporting ability/features) to the customer through D3.

The customers perform an additional layer of rollup on the data before running their reports, so in addition to replicating the current rollups owned by DDW, we will have to assess, document, and replicate the changes run by the customers as well, and develop the features they require in D3.

Responsibilities:

Analyzing the requirements and creating Technical design specifications and mapping documents.

Developing, testing, and validating ETL flows: Ab Initio graphs, Teradata BTEQ, and UNIX scripts.

Performing EME check-in/check-out through Ab Initio air commands.

Loading the transformed data files into Teradata staging tables through Teradata utilities.

Creating Teradata macros for loading data from staging to target tables.

Tie-out report generation.

Scheduling jobs through Unix-wrapper template and Control-M.

Unit testing and validation of Teradata warehouse data as per standard test cases, and bug fixing.

Creation of Staging and Target Teradata objects.

Performance tuning.

Code reviews & testing of the systems as per the standards.

Played the role of QCO (Quality Coordinator).

WILLIAMS-SONOMA, San Francisco, CA

Project Name : CDW

Duration: Aug 2004 - Oct 2005

Designation: Developer

Team Size: 5

Environment: UNIX, Oracle 9i, Informatica Power center 7.1, Business Objects 5.1.

Williams-Sonoma, headquartered in San Francisco, California, is one of the largest retailers in the U.S.A. The Marketing Management systems deal with managing marketing-related information for all of the brand concepts of Williams-Sonoma, namely:

Williams-Sonoma

Pottery Barn

PB teen

Pottery Barn Kids

Hold Everything

Williams Sonoma Home

West Elm.

The main objective of the project is to migrate the existing PL/SQL code to a tool-based approach. Some stored procedures in the RDW system, written in Oracle PL/SQL, were used to load data from the RDW system into the targets. We replaced these stored procedures with the ETL tool (Informatica) and created reports such as Sell Thru and Store Detail Inventory, which help the finance and marketing departments analyze the business.

Responsibilities:

Business Analysis and requirement gathering.

Source System Study

Gap Analysis and Data Profiling.

ETL Lead

ETL Architecture

Part of the schema design and redesign team.

Built complete ETL specifications and logic.

Impact Analysis on a change request.

Developed various kinds of mappings (simple and complex, including SCD Types 1/2/3).

Optimized mappings through SQL query overrides.

Optimized mappings to load data into slowly changing dimensions.

Performed debugging and tuning of mappings and sessions.

Scheduled and configured workflow and session tasks to load data.

Session tuning.

Tested mappings to ensure data is loaded as per the ETL specification.
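The SCD Type 2 mappings mentioned above follow a standard pattern: when a tracked attribute changes, the current dimension row is expired and a new versioned row is inserted. A minimal Python sketch of that logic, with invented table and column names (in the project this was implemented as Informatica mappings, not code):

```python
from datetime import date

# Illustrative dimension table: one current row per customer.
dimension = [
    {"cust_id": 100, "city": "Austin", "eff_date": date(2004, 1, 1),
     "end_date": None, "current": True},
]

def apply_scd2(dim, cust_id, new_city, load_date):
    """Expire the current row and insert a new version when the city changes."""
    for row in dim:
        if row["cust_id"] == cust_id and row["current"]:
            if row["city"] == new_city:
                return  # no change, nothing to do
            row["end_date"] = load_date   # expire the old version
            row["current"] = False
            break
    dim.append({"cust_id": cust_id, "city": new_city,
                "eff_date": load_date, "end_date": None, "current": True})

apply_scd2(dimension, 100, "Dallas", date(2005, 6, 1))
print(len(dimension))  # → 2 (two versions of customer 100)
```

Type 1 would simply overwrite the city in place, and Type 3 would keep the prior value in an extra column; Type 2, as sketched, preserves the full history as separate rows.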

TOSHIBA AMERICA BUSINESS SOLUTIONS, Irvine, CA

Duration: Sep 2003 - Aug 2004

Designation: Senior Developer

Team size: 6

Environment: UNIX, Oracle 9i, Informatica PowerCenter 6.1, Cognos

Description:

The system gathers all sales transactions from the Oracle ERP system on a periodic basis (every 30 minutes) and reports them the way the Sales, Marketing, Finance, and Executive organizations would like to see them, providing hierarchical views of sales by channel, sales by product line and product model, sales by representative, and ordered, back-ordered, on-hold, and shipped quantities, with up-to-the-minute accuracy. The system also provides performance measurements such as dealer quota and dealer fulfillment, product-, channel-, and rep-level budgets and actuals, and dealer ranking reports.

The source system, Oracle ERP, manages different transaction workflows for different order types (based on sales channel, order type, and order status). The ETL addresses all these workflows and loads the data into the data repository using a unified data warehouse model. The ETL also addresses the different relationships in the source systems for product, territory, channel, customer/location, and rep/role associations, and loads them into the data repository with fixed hierarchical representations of these dimensions for easier reporting and ad-hoc analysis.

The system provides pre-generated reports, which appeal to the majority of users, and also provides PowerPlay cubes for additional ad-hoc analysis using slice-and-dice techniques, advanced filtering, etc. All of these reporting mechanisms address data security by showing only the items a given user is entitled to see. The system utilizes the Access Manager/Directory Server's user classes and membership associations to achieve this.

Responsibilities:

Design and creation of catalogs. Defined User Classes/Security and Privileges.

Created Reports in Impromptu and distribution of catalog to users for ad hoc reporting and analysis.

Extraction, Transformation and Loading of the data Using Informatica Power Center.

Extensively used workflow manager to organize sessions, scripts and commands.

Extensively worked in the performance tuning of the programs and ETL Procedures.

Worked with flat file sources and relational sources.

Extensively worked on understanding the business requirements and designing applications to meet their needs.

Created dimension maps for Transformer and created multidimensional cubes in PowerPlay.

Involved in creating and deploying PowerPlay web reports.

Created standard and ad-hoc reports using Cognos. The reports were created using the slice-and-dice and drill-down features and included cross-tab, master-detail, and simple reports.

Generated decision support models for analysis using Cognos PowerPlay.

Involved in performance tuning of Impromptu reports.

Worked on publishing reports to the web server.

Assisted users in troubleshooting their reports.

NOVOPOINT Inc., USA

Duration: May 01 - Aug 03.

Designation: Developer

Team size: 4

Environment: UNIX, Oracle 8i, Informatica PowerCenter 5.1, Business Objects, PL/SQL

Novopoint is a dynamic B2B e-marketplace that brings together all buyers and sellers of food and beverage ingredients and provides them with mission critical services and information. The company will serve participants of all sizes from every facet of the food industry, including buyers and suppliers of oils, sugars, colorings, packaging, chemicals, freight and nearly everything in between.

Novopoint will offer browser-based access for Internet-hosted B2B procurement to enable companies of all sizes to buy and sell food ingredients as well as business and information services. The marketplace will offer supplier enablement, hosted and enterprise B2B procurement for goods and services, auctions, reverse auctions, bid/ask exchanges, strategic sourcing, spot buying, customer-specific pricing, electronic payment, logistics, and community forums.

Responsibilities:

Extensively used Informatica client tools: Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation Developer, Repository Manager, and Server Manager.

Developed reusable Mapplets and Transformations.

Created and Monitored Batches and Sessions using Informatica Power Center Server.

Involved in analyzing, coding, and testing stored procedures.

Responsible for tuning ETL procedures and schemas to optimize load and query performance.

Designed universes in Business Objects.

Extensively used Business Objects reports.

Education:

Master of Computer Applications


