
BI Consultant

Location:
Ontario, CA
Salary:
60K per year
Posted:
June 15, 2016


Devi Palani

BI Consultant

fd40sp@r.postjobfree.com

SUMMARY

7+ years of experience in the analysis, development, and implementation of business applications, including design and development of ETL methodologies using Informatica PowerCenter, across domains such as Banking, HR, Retail, and Entertainment.

5+ years of experience with Business Intelligence tools, primarily SAP Business Objects (BOBJ) 4.0/XI R3/XI R2, including Universe Designer, Web Intelligence, Rich Client, InfoView, Crystal Reports, Xcelsius Dashboards, and the Information Design Tool (IDT) introduced in 4.0.

Excellent data analysis skills and the ability to translate business logic into mappings using complex transformation logic for ETL processes.

Experience in performance tuning of sources, mappings, targets, and sessions. Able to understand available technology, tools, and existing designs, and to work on complex problems requiring situational analysis and data gathering.

Designed dimensional data models by identifying the business process and grain, and identifying sources and targets for the project.

Imported source and target definitions using Source Analyzer and Warehouse Designer, and created mappings in Designer using various transformations according to business logic.

Created and scheduled workflows and worklets, configured e-mail notifications using workflow manager.

Identified bottlenecks and tuned targets, sources, sessions, mappings for better performance of mappings and sessions.

Strong Logical and Physical Data Modeling and Dimensional modeling skills in Physical database design, Relational Models, Star Schemas, and Snowflake Schemas.

Involved in the complete lifecycle of projects, including writing test scripts, performing unit testing by writing SQL against the database, and validating end-user reports.

Documented the ETL architecture, high-level and low-level (HLD & LLD) designs, and the ordered dependencies among tables, mappings, and jobs.

Experience in building universes and retrieving data through universes, personal data files (MS Excel), stored procedures, and multiple fact tables using Universe Designer and the Information Design Tool; resolving loops and traps; creating measure and dimension objects, join conditions, filters, and LOVs; and designing complex objects using the @Variable, @Prompt, and @Aggregate_Aware functions per universe best practices.

Created Xcelsius visualizations by fetching data from different source systems into Xcelsius dashboards using Query as a Web Service (QAAWS) and Live Office.

Developed Reports using cascading prompts, grouping, sorting, hierarchy structures, charts, custom functions and variables.

Experience in End-to-End Database Development, Performance Tuning, Optimization of SQL scripts and Production Support.

Excellent problem-solving and interpersonal skills; a quick learner and an excellent team player.

TECHNICAL SKILLS

ETL Tools : Informatica PowerCenter 8.x, 9.5.1 & 9.6.1

Reporting Tools : SAP Business Objects 4.0/XI R3.1/R3.0, Web Intelligence, Crystal Reports 2008, Xcelsius 2008/2010 & Dashboards.

Data Modeling : Star-Schema Modeling, Snowflake Modeling, Fact and Dimension Tables.

Databases : Oracle 8i/9i/10g, MS SQL Server, Netezza.

Applications : MS Word, Excel, Outlook, MS Project, MS Visio & PowerPoint.

PROFESSIONAL EXPERIENCE

Time Warner Corporate, New York, NY May’15 to Aug’15

BI Consultant

(TWX Employee Agreement)

Description:

Time Warner is an American multinational media corporation headquartered in the Time Warner Center in New York City. The project gives HR Ops an enterprise-wide contracts database that enables them to better examine and manage contractual agreements. The schema is populated with multi-divisional employee agreements on a weekly frequency. The SAP BO report provides confidential contractual-agreement data for HBO, TBS, WB and CORP.

Responsibilities:

Interacted with functional and technical representatives of project teams to understand business functionality, technical modules, integration mechanisms, and data sources.

Prepared design documents, built proofs of concept, validated design solutions, and performed data modeling.

Prepared detailed technical specifications and mapping documents for both ETL code and BO reports.

Led a team of 2 SAP BO and ETL members through development and deployment activities.

Designed & Developed ETL code to bring the employee data from multiple divisions like HBO, WB & TBS.

Implemented slowly changing dimensions according to the requirements, Partitioned sessions, modified cache/buffers and tuned transformations for better performance.

Created utility programs for file manipulation and pre-/post-session tasks using UNIX scripts, and used crontab for scheduling workflows.

Business Object Universe, Business Objects Reports and Business Objects repository were used extensively.

Created Derived tables in the Designer to enhance performance of the universe.

Created Aliases and Context to resolve loops, defined the cardinalities as defined in data model.

Created Universe level objects and measures using the business rules provided.

Resolved join-path issues such as fan traps and chasm traps using BO's multiple-SQL-statement (multi-parse) functionality.

Worked closely with database administrators to validate data processes for reporting needs.

Created Hierarchies according to the business rules for viewing data at selected level of detail.

Developed multiple canned/ad-hoc Equity & Agreement SAP BO Webi reports for the HR Ops team, and scheduled several reports based on business requirements.

Published reports to different repositories based on their origin of development.

Prepared test plans and conducted Unit, Integration, Regression, Performance and Functional Testing.

Prepared and delivered knowledge transfer (KT), with detailed project documentation, to the production support team.
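The pre-/post-session UNIX utility scripting and workflow scheduling used in this role can be sketched as follows. This is a minimal illustration, not code from the project: the trigger-file convention, paths, and workflow name are hypothetical placeholders.

```shell
#!/bin/sh
# Hypothetical pre-session check: confirm the upstream feed's trigger file has
# arrived before letting the weekly load run. All names below are illustrative.

check_trigger() {
    trigger="$1"
    if [ -f "$trigger" ]; then
        echo "trigger found: $trigger"
        rm -f "$trigger"   # consume the trigger so a rerun does not double-load
        return 0
    else
        echo "trigger missing: $trigger" >&2
        return 1
    fi
}

# A weekly crontab entry invoking the check before the workflow might look like:
# 0 2 * * 0 /opt/etl/pre_session.sh && pmcmd startworkflow ... wf_employee_agreements
```

Consuming the trigger file inside the check keeps the weekly schedule idempotent: a second cron firing in the same window finds no trigger and exits without reloading.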

Time Warner Corporate, New York, NY Oct’14 to April’15

BI Consultant

(Global Real Estate Space Management – ARCHIBUS & HARBOR FLEX Tool)

Description:

The project replaces the system integrations between Aperture and ancillary solutions at TWC and HBO with Archibus: employee populations are exported from TWC, HBO, and WB to Archibus, space is assigned in the Archibus web-based tool, and people-space assignments are imported from Archibus and sent to the respective divisions. The project also generates Archibus reports for the business.

Responsibilities:

Interacted with business users and business analysts to understand reporting requirements, and prepared the system requirement specification document.

Led a team of 3 SAP BO and ETL team members through development and deployment activities.

Gathered employee information from the 3 divisions and loaded it into Archibus through ETL and .NET jobs.

Developed code to integrate multiple divisions (HBO, WB & TWC) with the Archibus Web Central cloud-based web service.

Read people assignments from the Archibus web service as XML source files.

Processed the XML files and populated the schema using Lookup transformations, Router transformations, the Web Service Source Qualifier, etc.

Developed ETL code to send employee locations to the respective divisions.

Developed universes using the Business Objects IDT universe methodology, involving creating the underlying database structure, resolving loops, creating classes & objects, conditions & joins, and finally testing the integrity of the universe.

Created different types of Space metric reports with Master/Detail, Cross Tab and Chart (for trend analysis). Used filters, conditions, list of values (LOV), calculations, etc.

Developed a number of reports, validated reports developed by the team, and assisted in fixing issues. Wrote complex queries at the backend.

Generated the Space & people details BO Webi reports with Occupancy & Vacancy information from Archibus.

Developed an SAP BO report to generate space metrics for NKF Visions.

Developed canned/ad-hoc reports with rental data from the Harbor Flex Webservice based tool & also scheduled multiple reports based on business requirements.

Exported the universes to the repository to make resources available to users. Tested reports for data consistency, tested universe table joins and contexts, and fixed defects in reports. Involved in resolving loops, cardinalities, and contexts, and checking the integrity of the universes.

Created Test cases for Unit Test, System Integration Test and UAT to check the data quality.

Provided training for end users on generating their own Webi reports.

Time Warner Corporate, New York, NY April’14 to Oct’14

Informatica Consultant

(HR Payroll Transition WORKDAY - PEOPLESOFT)

Description

The project focuses on transforming corporate data from Workday to PeopleSoft by migrating historical data (conversion component) and leveraging periodic ETL routines (integration component) for ongoing transactions. The project has multiple inbound and outbound modules: HR Portal (consolidates information from the legacy eC (Employee Connection) and other sources such as HRDC and Data Hub to create user and application roles in Keystone), PSOFT PROD (Time Warner Corporate's data store for PeopleSoft; it holds all PeopleSoft-related employee details provided by Data Hub and is refreshed daily from Data Hub), and Active Directory (used by the IT Operations and Engineering team to create security profiles for employees).

Responsibilities:

Prepared system requirement & ETL specification documents (transformation rules, mapping and workflow specifications) by interacting with business users and business analysts.

Created extensive mapping/mapplet, reusable transformations using all the transformations like Normalizer, lookup, filter, expression, stored procedure, aggregator, update strategy, worklets etc.

Developed mappings with Web service consumer transformation and Web service providers for HR portal.

Generated XML files in the pipeline using XML generator transformation and also used XML parser to extract the data from the XML source.

Used the Transaction Control transformation to generate multiple target files dynamically.

Used Workflow Manager for Creating, Validating, Testing and running the sequential and concurrent Batches and Sessions.

Optimized the existing applications at the mapping level, session level and database level for a better performance.

Configured the sessions using workflow manager to have multiple partitions on Source data and to improve performance.

Used the pmcmd command to start, stop, and ping the server from UNIX, and created UNIX shell scripts to automate these activities.

Performed ETL testing using DVO (the Data Validation Option tool); also created a unit test plan to test the mappings and created the test data.

Developed SQL and Lookup views, functions to facilitate specific requirement in DVO.
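The pmcmd automation mentioned above can be sketched with a small wrapper. This is an illustrative sketch, not project code: the domain, service, and user names are hypothetical placeholders, and the function only builds and echoes the command line (a dry run) so it can be logged or reviewed before execution.

```shell
#!/bin/sh
# Illustrative wrapper for pmcmd, PowerCenter's command-line client.
# Connection details are placeholders, not values from the project.
INFA_DOMAIN="Domain_ETL"
INFA_SERVICE="IS_ETL"
INFA_USER="etl_ops"

# Build (and echo) the pmcmd command for a given action; callers can pipe the
# output to sh to execute it, or capture it for logging.
pmcmd_cmd() {
    action="$1"    # pingservice | startworkflow | stopworkflow
    folder="$2"    # repository folder (unused for pingservice)
    workflow="$3"  # workflow name (unused for pingservice)
    case "$action" in
        pingservice)
            echo "pmcmd pingservice -d $INFA_DOMAIN -sv $INFA_SERVICE" ;;
        startworkflow|stopworkflow)
            echo "pmcmd $action -d $INFA_DOMAIN -sv $INFA_SERVICE -u $INFA_USER -f $folder $workflow" ;;
        *)
            echo "unknown action: $action" >&2
            return 1 ;;
    esac
}
```

For example, `pmcmd_cmd startworkflow HR_FOLDER wf_payroll` produces the full start command for the (hypothetical) payroll workflow, which a cron-driven script could then execute.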

Environment: Informatica PowerCenter 9.5.1, Oracle 10g/11g, SQL Server (Aqua Data Studio), Netezza, Informatica DVO, SAP BO 4.0, MS Project, MS Visio, UNIX Shell Scripting.

Comcast NBC Universal, NY Mar’12 – April’14

Sr. Informatica Developer

Description

Comcast-NBC Universal is one of the world's leading media and entertainment companies in the development, production, and marketing of entertainment, news, and information to a global audience. This project focused on building a commercial data warehouse platform for Ad Sales, to be used in business analysis and decision making across the entire NBC portfolio. Previously, portfolio-wide reporting was not possible because each property resided on a different platform, involving many manual processes and challenges that were time-consuming and delayed decision making. The new data warehouse makes this possible with a single report from the commercial data warehouse.

Responsibilities:

Interacted with business users and business analyst to understand reporting requirements and prepared system requirement specification document.

Worked with Reporting team to get to an agreement on what level of complexity to be handled by ETL and reporting side.

Experience in developing canned/ad-hoc reports, scheduling the processes using Broadcast Agent and administering the BO activities.

Designed and developed complex mappings, from varied transformation logic like Unconnected and Connected lookups, Router, Filter, Expression, Aggregator, Joiner, Update Strategy and more.

Implemented slowly changing dimensions according to the requirements, Partitioned sessions, modified cache/buffers and tuned transformations for better performance.

Used Workflow Manager for Creating, Validating, Testing and running the sequential and concurrent Batches and Sessions.

Developed shell scripts for Informatica pre-session, post-session scripts.

Optimized the existing applications at the mapping level, session level and database level for a better performance.

Developed PL/SQL procedures, functions to facilitate specific requirement.

Configured the sessions using workflow manager to have multiple partitions on Source data and to improve performance.

Carried out unit testing, systems testing and post-production verification.

Investigated and fixed problems encountered in the production environment on a day to day basis.

Environment: Informatica PowerCenter 9.1/8.6.1, PowerExchange, Erwin, SAP BI 4.0, Business Objects XI R3, Crystal Reports XI R3, Web Intelligence, Oracle 9i/10g, MS SQL Server 2008, Flat Files, UNIX Server.

24X7 Real Media, New York, NY Sep’11 to Mar’12

ETL Developer

NGDR (New Global Daily Revenue System)

The ETL of NGDR was responsible for updating two databases: one used for calculations and capturing user inputs on revenue, the other for reporting and reconciliation. The purpose of the NGDR data warehouse is to capture the daily revenue data of 24X7 Real Media and load it properly in the warehouse for further reporting. The project had two main units: ETL via Informatica and reporting via Business Objects. The scope of this project was to deliver the following reporting capabilities:

Daily revenue data with drill-down/roll-up capabilities (YTD, MTD, and daily reporting), sector-wise reporting, trading-unit-wise reporting, performance at the legal-entity level, etc.

Responsibilities:

Designed the Data Warehouse (logical and physical models), designed and developed the dimensional data models (star schema), and supported the UAT and production phases.

Created the Transformation Rules, Mapping Specifications and Workflow Specifications.

Involved in doing the Impact Analysis of the changes done to the existing mappings and providing the estimations.

Created the design specs for loading the data from OLTP to STG Schema, STG schema to DWH Schema.

Developed and maintained ETL mappings using Informatica Designer 8.1 to extract data from multiple source systems (Oracle 9i/10g, SQL Server, and flat files) to multiple targets such as Oracle and flat files.

Migrated mappings from Development Repository to Test Repository and also from Test Repository to Production Repository.

Created extensive mapping/mapplets, reusable transformations using all the transformations like Normalizer, lookup, filter, expression, stored procedure, aggregator, update strategy, worklets etc.

Extensively worked on tuning and improving the Performance of Mappings.

Performed ETL testing, created Unit test plan and Integration test plan to test the mappings and created the test data.

Environment: Informatica PowerCenter 8.5.1, Oracle 9i/10g, SQL Server 2008, UNIX Server.

Citigroup, Florida Sep’08 – Jun’11

Software Engineer

Description

Citi Cards, headquartered in Jacksonville, Florida, is part of Citigroup, the largest bank in the world. Citi Cards is the largest issuer of Visa, MasterCard, and American Express cards, with 300,000 employees managing more than 200 million customers in 105 countries. Citigroup, the first US bank with more than $1 trillion in assets, and its myriad subsidiaries offer deposits and loans (mainly through Citibank), credit cards, investment banking, brokerage, and a host of other retail and corporate financial services.

Responsibilities:

Managed ETL development using Informatica PowerCenter 8.6.1.

Worked with Source Analyzer, Data Mappings, Transformations, Informatica Repository Manager, Workflow Manager and Monitor.

Used most of the transformations such as the Connected, unconnected lookup, update strategy, Router, Filter, Sequence Generator, Stored Procedure and Expression as per the business requirement.

Created Email task notifications in Workflow Manager, and used Command tasks to run batch scripts that execute different applications.

Setup Workflow & Tasks to schedule the loads at required frequency using Workflow Manager.

Performed extensive testing and wrote SQL queries to verify that data loaded correctly.

Used Microsoft SQL Server Integration Services (SSIS) to perform several transformations and to test Informatica workflows at the system level.

Tested data and data integrity among various sources and targets. Assisted the production support team with various performance-related issues.

Extensively used Xcelsius components such as list boxes, bar charts, tab sets, and dynamic visibility.

Published the Dashboard onto the corporate share drive, which was eventually accessed seamlessly by the end users.

Created Complex Reports using Drill down, Cross tab and Charts (time series graphs, column, line, pie). Used the drill functionality, ranking, sorting, complex alerter and complex variables in full client reports.

Created different types of reports, like Master/Detail, Cross Tab and Chart (for trend analysis). Used filters, conditions, list of values (LOV), calculations, etc.

Created universes with default & custom hierarchies for multidimensional analysis, and created reports with multi-level scope of analysis using drill down, drill across, drill up, and slice and dice.

Worked on InfoView to develop various reports with linking features and data at various levels.

Developed a number of reports, validated reports developed by the team, and assisted in fixing issues. Wrote complex queries at the backend.

Exported the Universes to the Repository to make resources available to the users. Tested reports for data consistency, tested Universe table joins, contexts, defects in reports/fixed the defects in reports. Involved in resolving the loops, cardinalities, contexts and checked the Integrity of the Universes.

Provided training for end users for generating their own webi reports.

Performed unit testing.
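The load verification described in this role can be sketched as a simple row-count reconciliation between a source flat file and an extract of the loaded target. This is a minimal illustration with hypothetical file names; in practice the target-side count would typically come from a SQL COUNT(*) spooled to a file via SQL*Loader's companion tools or sqlplus.

```shell
#!/bin/sh
# Minimal load-verification sketch: compare record counts between a source
# flat file and a target-side extract. Inputs are illustrative placeholders.
reconcile_counts() {
    src="$1"
    tgt="$2"
    src_n=$(wc -l < "$src")
    tgt_n=$(wc -l < "$tgt")
    if [ "$src_n" -eq "$tgt_n" ]; then
        echo "OK: $src_n rows in both source and target"
    else
        echo "MISMATCH: source=$src_n target=$tgt_n" >&2
        return 1
    fi
}
```

A count match is only a first-pass check; column-level validation (as done here with SQL queries against sources and targets) still follows it.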

Environment: Informatica PowerCenter 8.5/8.1, Business Objects XI R3, Crystal Reports XI R3, Web Intelligence, Oracle 9i, MS SQL Server 2005, Flat Files, Windows XP, SQL*Loader, TOAD.

EDUCATION

Master’s in Engineering from Anna University, Chennai, INDIA.
