
Manager Data

Location:
India
Posted:
June 25, 2015


KRISHNA

acqe5h@r.postjobfree.com

+91-944*******

PROFESSIONAL SUMMARY

*+ years of IT experience in ETL development and implementation of Data Warehouses, Data Migration, Data Cleansing, Decision Support Systems (DSS), Enterprise Business Intelligence, and all phases of the Data Warehouse Development Life Cycle.

Worked extensively with all Informatica PowerCenter Designer tools, including Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer.

Extensively worked on Sessions/Tasks, Worklets, and Workflows using Workflow Manager and Workflow Monitor.

Worked on Informatica PowerCenter Repository Manager to deploy deployment groups.

Good knowledge of Star Schemas, Snowflake Schemas, Slowly Changing Dimensions (SCD), Surrogate Keys, and Dimensional Modeling.

Implemented various performance tuning techniques on mappings.

Implemented Slowly Changing Dimension Type 1, Type 2, and Type 3 methodologies to preserve the full history of account and transaction information (an illustrative Type 2 SQL sketch appears at the end of this summary).

Good working knowledge of Oracle and UNIX.

Involved in unit testing and data validation for the developed Informatica mappings.

Proven ability to quickly learn and apply new technologies; creative, innovative, and able to work in a fast-paced environment.

Team player with excellent communication and problem-solving skills.
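
As a rough illustration of the SCD Type 2 approach referenced above, the expire-and-insert pattern can be sketched in Oracle SQL as follows. The customer_stg and customer_dim tables, their columns, and the customer_dim_seq sequence are hypothetical names used only for this sketch, not objects from the projects below.

  -- Expire the current version of any dimension row whose tracked attributes changed
  UPDATE customer_dim d
     SET d.eff_end_date = SYSDATE,
         d.current_flag = 'N'
   WHERE d.current_flag = 'Y'
     AND EXISTS (SELECT 1
                   FROM customer_stg s
                  WHERE s.customer_id = d.customer_id
                    AND (s.city <> d.city OR s.status <> d.status));

  -- Insert a new current version with a fresh surrogate key for new or changed customers
  INSERT INTO customer_dim
         (customer_sk, customer_id, city, status, eff_start_date, eff_end_date, current_flag)
  SELECT customer_dim_seq.NEXTVAL, s.customer_id, s.city, s.status, SYSDATE, NULL, 'Y'
    FROM customer_stg s
   WHERE NOT EXISTS (SELECT 1
                       FROM customer_dim d
                      WHERE d.customer_id = s.customer_id
                        AND d.current_flag = 'Y');

In Informatica terms, the same comparison is usually done with a Lookup on the dimension and an Update Strategy (DD_UPDATE/DD_INSERT) or Router deciding each row's fate.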

EDUCATION

Master of Computer Applications (MCA) from JNTU.

WORK EXPERIENCE

Working as an SSE at HCL Technologies from July 2014 till date.

Worked as an Associate Consultant at Polaris FT Pvt. Ltd. from January 2014 to July 2014.

Worked as an ETL System Analyst at Tekplant from June 2011 to January 2014.

Worked as an Associate Software Engineer at TCS from August 2009 to April 2011.

TECHNICAL SKILLS

ETL : Informatica PowerCenter 7.1/8.6/9.0.1/9.6

Databases : Oracle 10g/11g, SQL Server, MySQL

Reporting Tool : OBIEE

Programming : SQL, PL/SQL

Environments : UNIX and Windows

Tools : SQL Developer, SQL Server

ASSIGNMENTS:

Project #1

Title : DPL Migration

Client : Microsoft

Environment : Informatica PowerCenter 9.6, Windows 8, SQL Server.

Description:

The Data Platform Layer (DPL) is an enhancement to the existing Velocity platform that will eliminate or fix its current deficiencies. The architecture aims to align with Microsoft's overall data strategy. The DPL is being created as part of a pilot and will be used for the overall migration of the EDW, Feed Store, and portions of the Velocity platform.

Intended to replace multiple acquisition servers, which are to be integrated into a single acquisition layer for Microsoft IT.

Provides ODS (Operational Data Store) capabilities, which are currently lacking, to support data-intensive businesses.

Designed for massive data volumes and high data growth rates.

Maintains fail-safe, highly available data for downstream subscribers.

Resolves data latency issues faced by the Feed Store, EDW, and Velocity platforms.

Targeted at creating a scalable, reliable, high-volume, low-latency platform.

Provides full and delta feeds to subscribers with data sourced from the DPL, helping to onboard publishers as well as subscribers onto a single platform.

Roles & Responsibilities:

Understood the technical specifications and developed an ETL process to load data from source to target.

Involved in designing the mappings between sources and targets and also tuned them for better performance.

Involved in handling different datasets, including Samiplanning and Nokiout.

Created Informatica mappings to implement business rules for loading data, using transformations such as Source Qualifier, Aggregator, Expression, Joiner, Lookup, Filter, Router, and Stored Procedure.

Performed various update strategies using Lookup and Update Strategy transformations, and carried out unit testing (see the SQL sketch after this list).

Involved in the deployment for QA and UAT.
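
A minimal T-SQL sketch of the lookup-and-update-strategy pattern referenced above, assuming the project's SQL Server environment; dbo.stg_account, dbo.dim_account, and their columns are illustrative placeholders rather than actual project objects.

  MERGE dbo.dim_account AS tgt
  USING dbo.stg_account AS src
     ON tgt.account_id = src.account_id
  -- Rows found by the lookup with changed attributes are updated (DD_UPDATE)
  WHEN MATCHED AND (tgt.account_name <> src.account_name OR tgt.region <> src.region)
    THEN UPDATE SET tgt.account_name = src.account_name,
                    tgt.region       = src.region,
                    tgt.updated_on   = GETDATE()
  -- Rows not found by the lookup are inserted (DD_INSERT)
  WHEN NOT MATCHED BY TARGET
    THEN INSERT (account_id, account_name, region, updated_on)
         VALUES (src.account_id, src.account_name, src.region, GETDATE());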

Project #2

Title : Link Share-FEEDDB, DATA FEEDS

Client : LinkShare Corporation, USA

Environment : Informatica PowerCenter 9.5.1, Windows 2003, Oracle 10g, MySQL, SQL Developer, Flat files, UNIX.

Role : ETL Developer

Description:

Link Share is a leading provider of full-service online marketing solutions specializing in the areas of Search, Lead Generation and Affiliate Marketing. As the online marketing industry continues to evolve at a fast pace, advertisers and publishers are turning to Link Share for technology innovation, experience and a passion for driving results.

This Feed DB is created to reduce the database load on the main DB. It contributes to system performance and increases the uptime of the main DB, since traffic that does not need real-time transactional data can be separated from the main DB.

Responsibilities:-

Understood the business requirements and developed an ETL process to load data from source to target.

Worked with Source Analyzer, Warehouse Designer, Transformation Developer, Mapping Designer, and Workflow Manager to develop new mappings and implement the data warehouse.

Coordinated with the DBA team on tuning sources and targets and on calculating table space and database growth (an example sizing query follows this list).

Involved in ETL production deployment.

Worked on Informatica PowerCenter Repository Manager to deploy deployment groups.

Involved in Unit Testing and prepared Test Cases.
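
The table space and growth estimates mentioned above are typically taken from the Oracle data dictionary; a minimal sketch, assuming an illustrative schema name of FEEDDB, might look like this, with snapshots compared over time to estimate growth.

  -- Approximate current size of each table in the schema, largest first
  SELECT segment_name,
         ROUND(SUM(bytes) / 1024 / 1024) AS size_mb
    FROM dba_segments
   WHERE owner = 'FEEDDB'
     AND segment_type = 'TABLE'
   GROUP BY segment_name
   ORDER BY size_mb DESC;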

Project #3

Title : Link Share-EDW-DM (DLF)

Client : LinkShare Corporation, USA

Environment : Informatica PowerCenter 8.6.1, Windows 2003, Oracle 10g, MySQL.

Role : ETL Developer

Description:

LinkShare is a leading provider of full-service online marketing solutions specializing in the areas of Search, Lead Generation and Affiliate Marketing. As the online marketing industry continues to evolve at a fast pace, advertisers and publishers are turning to LinkShare for technology innovation, experience and a passion for driving results. In this data migration project, the data is migrated from IBM DB2 to an Oracle database. At the initial level, data is loaded from the OLTP systems into the ODS, and finally from the ODS into the data warehouse database.

Responsibilities:

Understood the technical specifications and developed an ETL process to load data from source to target.

Involved in designing the mappings between sources and targets and also tuned them for better performance.

Created Informatica mappings to implement business rules for loading data, using transformations such as Source Qualifier, Aggregator, Expression, Joiner, Lookup, Filter, Router, and Update Strategy.

Performed various update strategies using Lookup and Update Strategy transformations.

Performed unit testing and prepared the parameter file for creating the connections on UNIX (a sample parameter file layout is sketched after this list).

Involved in production deployments, data purging, etc.

Worked on the Admin Console to shut down and start up the Integration Service.
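
The session parameter file mentioned above follows the standard PowerCenter layout of a scoped header followed by connection and mapping parameters; the folder, workflow, session, and connection names here are illustrative only.

  [ETL_FOLDER.WF:wf_feed_load.ST:s_m_load_feed]
  $DBConnection_Src=Oracle_FeedDB_Src
  $DBConnection_Tgt=Oracle_DM_Tgt
  $PMSessionLogDir=/infa/logs/feed
  $$LOAD_DATE=2013-12-31

The file is kept on the UNIX server and referenced from the workflow or session properties (or passed with pmcmd's -paramfile option).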

Project #4

Title : JCPenney Retail

Client : J.C. Penney Company, Inc., USA

Environment : Informatica PowerCenter 8.6, Windows 2003, Oracle 10g, MySQL, SQL Developer, Flat files, UNIX, OBIEE.

Role : ETL Developer

Description:

JCPenney's Sales Order Management Application (SOMA) is responsible for creating, validating, and processing all orders, whether customer-driven or batch, created through various channels. The SOMA Reporting Application is mainly for generating the reports. The feeds for these reports are the files generated from the DODS Stage 1 database and the four flat files from COES, delivered with the help of a Java listener.

Responsibilities:

Understood the various business requirements.

Developed Informatica mappings to load data into staging tables from multiple data sources such as SQL Server and Oracle.

Developed mappings using various transformations, depending on the requirement, to implement business logic, and tuned them for better performance.

Interacted with business users and the SOMA teams to sort out queries.

Involved in developing procedures and functions for extracting data from staging into the CDW database (a simplified PL/SQL sketch follows this list).

Prepared ETL technical documentation.
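
A much-simplified PL/SQL sketch of the staging-to-CDW load procedures mentioned above; stg_orders, cdw_orders, and their columns are hypothetical names used only for illustration.

  CREATE OR REPLACE PROCEDURE load_cdw_orders AS
  BEGIN
    -- Copy only the rows that are not already present in the CDW table
    INSERT INTO cdw_orders (order_id, customer_id, order_amt, load_date)
    SELECT s.order_id, s.customer_id, s.order_amt, SYSDATE
      FROM stg_orders s
     WHERE NOT EXISTS (SELECT 1
                         FROM cdw_orders c
                        WHERE c.order_id = s.order_id);
    COMMIT;
  END load_cdw_orders;
  /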


