Resume


Data Manager

Location:
Las Vegas, Nevada, United States
Salary:
100,000
Posted:
November 13, 2017


PROFESSIONAL SUMMARY

Overall ** years of IT experience in ETL, technical leadership, and data analysis, working on various data migration, data warehousing, and database-driven projects.

Proficient in integrating various data sources, including multiple relational databases (Oracle 11g/10g/9i, MS SQL Server, DB2, Teradata), VSAM files, and flat files, into staging areas, ODS, data warehouses, and data marts.

Strong ETL experience using Informatica PowerCenter 9.5/9.1/8.6.1/8.5/8.1/7.1 client tools (Mapping Designer, Repository Manager, Workflow Manager/Monitor) and server tools (Informatica Server, Repository Server Manager).

Strong experience in the analysis, design, development, testing, and implementation of Business Intelligence solutions using data warehouse/data mart design, ETL, OLAP, BI, and client/server applications.

Strong experience working with the ETL tools Informatica and OWB.

Expertise in Data Warehouse/Data Mart, ODS, OLTP, and OLAP implementations, covering project scoping, analysis, requirements gathering, data modeling, effort estimation, ETL design, development, system testing, implementation, and production support.

Strong experience in dimensional modeling using Star and Snowflake schemas, identifying facts and dimensions, and physical and logical data modeling using ERwin and ER/Studio.

Strong experience with reporting and analytical tools such as Business Objects, Siebel Analytics (OBIEE), and Cognos.

Utilized AUTOTRACE and EXPLAIN PLAN for monitoring the SQL query performance.
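The EXPLAIN PLAN usage above can be sketched as follows; this is illustrative only, and the query, table, and file names are hypothetical. The generated script would be run through sqlplus to show the optimizer's access paths and cost estimates:

```shell
#!/bin/sh
# Sketch: generate a tuning script of the kind used with EXPLAIN PLAN.
# The query and table names are hypothetical; the script would be fed
# to sqlplus against the target database.
cat > tune_query.sql <<'SQL'
-- Populate PLAN_TABLE with the optimizer's plan for the query
EXPLAIN PLAN FOR
SELECT c.customer_id, SUM(o.amount)
FROM   orders o JOIN customers c ON c.customer_id = o.customer_id
GROUP  BY c.customer_id;

-- Render the plan: joins, access paths, estimated rows and cost
SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
SQL
echo "wrote tune_query.sql"
```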

Good communication skills for working with all levels of management.

Education

B. Tech from GITAM College (Andhra University), Visakhapatnam, India

TECHNICAL SKILL SET

RDBMS/Data Access

Oracle, Teradata, MS SQL Server 2000, MS Access

Data Warehouse Tools

Informatica 5/6.1/6.2/8.0.1/8.6/9.1/9.5, OWB (Oracle Warehouse Builder)

Programming Languages

SQL, PL/SQL, BTEQ, UNIX shell scripts, batch scripts

Reporting Tools

Cognos, Business Objects, Crystal Reports, SAP BW, Spotfire

Data Modeling

ERwin, PowerDesigner, TIBCO, Toad

PROFESSIONAL EXPERIENCE

Wipro Technologies Jan 2013 - Present

Client: NV Energy, Nevada, ETL & Report Lead Apr 2017 - Present

Define the XML target files required by the Maximo system.

Create the data objects and load data into the Maximo system.

Worked with the CU blueprint.

Created the data transformations, mappings, and workflows for the migration of CU using Informatica 9.1.1.

Design and develop detailed ETL specifications based on the business requirements.

Translate business needs into technical solutions by working with business and IT stakeholders.

Assist in effort estimation and planning for SSAS & SSRS implementations.

Develop MSBI reports on large volumes of data.

Client: Wells Fargo, Baltimore, Informatica Sr. Developer Sep 2016 - Mar 2017

Led design, development, and implementation of ETL projects end to end.

Developed solutions in a highly demanding environment and provided hands-on guidance to team members.

Worked with different Informatica transformations (Aggregator, Lookup, Joiner, Filter, Router, Update Strategy, Union, Normalizer, SQL) in ETL development.

Worked with pushdown optimization to improve performance.

Created pre-session, post-session, pre-SQL, and post-SQL commands in Informatica.

Worked with Event Wait and Event Raise tasks, mapplets, and reusable transformations.

Worked with parameter files for ease of managing connections across Dev/QA/Prod environments.
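A minimal sketch of such a parameter file follows; the folder (FIN), workflow (wf_daily_load), session (s_m_load_customer), and connection names are all hypothetical. Keeping one file per environment swaps connections without touching the mapping:

```
[Global]
$PMSessionLogDir=/infa/logs

[FIN.WF:wf_daily_load.ST:s_m_load_customer]
$DBConnection_SRC=ORA_SRC_DEV
$DBConnection_TGT=ORA_TGT_DEV
$$LOAD_DATE=2017-11-13
```

The $DBConnection_* entries point sessions at environment-specific relational connection objects, while $$LOAD_DATE is a mapping parameter referenced inside the mapping itself.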

Implemented an Informatica-based ETL solution fulfilling stringent performance requirements.

Conducted impact assessments and determined the size of effort based on requirements.

Developed stored procedures, functions, views, triggers, and complex SQL queries using SQL Server and T-SQL.

Created transaction data objects to produce the source-to-target mapping and data conversion design.

Worked on all design/development artefacts and gave the client lead detailed visibility into data delivery for the blueprint program.

Client: Sydney Water, Hyderabad (India), Informatica Sr. Developer Jul 2014 - Sep 2016

Led design, development, and implementation of ETL projects end to end.

Developed Informatica workflows, worklets, and sessions associated with the mappings across various source systems such as Maximo, FRM, FMIS, CMS, MS Access, and Customer Billing.

Worked with data cleansing, parsing, standardization, and validation.

Worked closely with DBAs, application, database, and ETL developers, and change control management to migrate developed mappings across Dev, QA, and Prod environments.

Developed system modification specifications; mapped data; established interfaces; developed and modified functions, programs, routines, and stored procedures to export, transform, and load data; met performance parameters; and resolved and escalated integration issues.

Validated data integration by developing and executing test plans and scenarios, including data design, tool design, and data extract/transform.

Client: Novartis, Hyderabad (India), Informatica Lead Feb 2013 - Jun 2014

Worked on the integration of 16 registries.

Worked with the functional team to understand Clinadmin and design its integration with the iSearch data model.

Responsible for ETL technical design discussions and prepared the ETL high-level technical design document.

Created the architecture of the iSearch Integrated Data Repository by bringing the normalised data into a single format.

Defined the ETL logic, data cleansing, and harmonization rules for the 16 different sources.

Led a team of ETL (PL/SQL) and reporting developers and coordinated onsite/offshore activities.

Created a batch script that runs the stored procedure from the Windows scheduler.

Created data mappings between source and target systems.

Worked intensively in the new reporting tool Spotfire to bring over the functionality of NGIS, NGSS, and NGCA.

Defined the report architecture for NGIS, NGSS, and NGCA.

Documented the ETL process for each flow.

Accenture Mar 2010 - Dec 2012

Client: Amazon, Hyderabad (India), Techno-Functional Lead Feb 2012 - Dec 2012

Led a team of ETL and reporting developers and coordinated onsite/offshore activities.

Gained an initial understanding of the existing system and developed a new system to enhance Fate CN/JP into the DWH.

Performance tuning of mappings and Oracle queries.

SME for Informatica and DAC on the USL project.

Created a generic UNIX script that runs the workflows.
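A generic workflow-runner script of this kind can be sketched as follows; the service, domain, folder, and workflow names are hypothetical. With DRY_RUN=1 the pmcmd command line is printed instead of executed, so the script can be sanity-checked without an Informatica installation:

```shell
#!/bin/sh
# Sketch: run any PowerCenter workflow by folder/name (names illustrative).
run_workflow() {
    folder=$1
    workflow=$2
    # -uv/-pv make pmcmd read credentials from environment variables
    cmd="pmcmd startworkflow -sv INT_SVC -d DOM_DEV -uv INFA_USER -pv INFA_PWD -f $folder -wait $workflow"
    if [ "${DRY_RUN:-0}" = "1" ]; then
        echo "$cmd"    # dry run: show the command instead of running it
    else
        $cmd
    fi
}

DRY_RUN=1
run_workflow FIN wf_daily_load
```

Passing -wait makes the caller block until the workflow finishes, so the script's exit code can drive downstream scheduling.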

Worked on the Data Net (client-specific) ETL tool within the stipulated time and delivered the USL (four-way recon module) project.

Client: Bank of America, Hyderabad (India), Tech Lead Feb 2011 - Jan 2012

Worked extensively on forward and reverse engineering processes; created DDL scripts for implementing data modelling changes.

Involved in Data Profiling and Analysis.

Implemented the standard naming conventions for the fact and dimension entities and attributes of logical and physical data model.

Performed Data Gap Analysis in current data warehouse.

Created Data Mapping between Source and Target Systems.

Worked closely with the source system (ECOMM, Infinity, BASIS, Cash vault, Cost Center) to integrate into the BIDW.

Worked with the functional people to map the source system data.

Involved in repository preparation for OBIEE reporting.

DataCore Service Ltd Jun 2009 - Mar 2010

Client: Telia Sonera, Chennai (India), Consultant Jun 2009 - Mar 2010

Interacted with business analysts to understand the business requirements.

Designed the ETL mappings with the data flow from various sources to targets, and implemented SCD logic.

Developed test plans, test cases, and test scripts with a complete understanding of each domain's functionality.

Developed transformation logic and designed various complex mappings and mapplets using the Designer.

Created the mappings to build the logic for suspend handling and error handling.

Involved in the successful development, testing, deployment, and maintenance of EDW 3 & 6.

Created and executed UNIX scripts as pre/post-session commands to FTP files from the landing area to the Informatica work area and to archive the files after loading the data.
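A post-session archive step of this kind can be sketched as follows; the directory and file names are illustrative only. After a load completes, processed flat files are compressed and moved out of the landing area into a date-stamped archive:

```shell
#!/bin/sh
# Sketch: archive processed flat files after a load (paths illustrative).
set -eu
archive_files() {
    landing=$1
    archive=$2/$(date +%Y%m%d)     # one archive directory per day
    mkdir -p "$archive"
    for f in "$landing"/*.dat; do
        [ -e "$f" ] || continue    # glob matched nothing: nothing to archive
        gzip -f "$f"               # compress in place, producing $f.gz
        mv "$f.gz" "$archive"/
    done
}

# demo with throwaway local directories
mkdir -p landing
touch landing/customers.dat landing/billing.dat
archive_files landing archive
ls archive/"$(date +%Y%m%d)"
```

Clearing the landing area after each run keeps the next session from reprocessing stale files.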

Created and executed PL/SQL packages to handle delta processing, exchange partitions, and metadata information.

Worked in Business Objects to build the universe.

TCS Jul 2004 - Apr 2009

Client: TCL (Tata Capital Ltd), IT Analyst May 2008 - Apr 2009

Worked on the BW architecture for creating the different dimensions and facts.

Worked closely with the source systems (Orbit, SAP LAN, and Bancs) to integrate them into the enterprise reporting system.

Worked with the functional team to map the source system data.

Developed transformation logic and designed various complex mappings and mapplets using the Designer.

Used mapplets, parameters, and variables to apply object-orientation techniques and facilitate code reusability.

Developed and executed UNIX scripts as pre/post-session commands to schedule loads.

Created the universe and business views in Business Objects and Crystal Reports respectively.

Developed test plans, test cases, and test scripts with a complete understanding of each domain's functionality.

Provided the architectural solution to users for the MIS and BW reports.

Client: Swiss Re, Kansas (US), Sr. Developer Apr 2006 - Apr 2008

Worked closely with DBAs on enhancing the Global Database in Oracle.

Worked closely with the source systems (PS Asia and PS Zurich) to bring them into SAP.

Worked closely with the SAP team to test the PA and OM data in SAP.

Worked with the functional team to map the source system data to SAP.

Developed the mappings and transformations to bring the five big source systems into SAP.

Designed the gap analysis for SAP BW reporting.

Developed test plans, test cases, and test scripts for hires, rehires, and leavers.

Involved in manual testing of the application.

Prepared the ETL transformation specifications for the business logic.

Used the Teradata utilities FastLoad, MultiLoad, and TPump to load data from various source systems.

Developed scripts in BTEQ to import and export the data.
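A BTEQ export script of this kind can be sketched as follows; the logon string, table, and file names are all illustrative. The generated script would be run with `bteq < export_policy.bteq`:

```shell
#!/bin/sh
# Sketch: generate a BTEQ export script (names illustrative).
cat > export_policy.bteq <<'EOF'
.LOGON tdprod/etl_user,etl_pwd;

.EXPORT REPORT FILE = policy_extract.txt;
SELECT policy_id, holder_name, premium_amt
FROM   edw.policy
WHERE  load_dt = CURRENT_DATE;
.EXPORT RESET;

.LOGOFF;
.QUIT;
EOF
echo "wrote export_policy.bteq"
```

REPORT mode writes formatted text; DATA mode would instead produce a binary extract suitable for reloading with FastLoad or MultiLoad.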

Prepared MD120 and UTP (Unit Test Plan) documents.

Worked on Informatica Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformations.

Client: GEHC, Mumbai, Developer Jul 2004 - Mar 2006

Designed ETL mappings using Informatica.

Prepared the ETL transformation specifications for the business logic.

Prepared MD120 and UTP (Unit Test Plan) documents.

Tested the reports in Cognos against the source.

Worked on Informatica Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformations.

Analyzed the existing mappings and prepared business logic documents for them.

Performance tuning of different mappings.

Used the Teradata utilities FastLoad, MultiLoad, and TPump to load data from various source systems.

Developed scripts in BTEQ to import and export the data.


