
Data Manager

Location:
Los Angeles, CA
Posted:
November 04, 2017


Resume:

Pavan Kumar Reddy Etikela

+1-818-***-****

ac24x0@r.postjobfree.com

Summary:

• Around eight years of IT experience in development, maintenance and enhancement projects delivering Data Integration, Data Migration and Data Warehouse technology solutions.

• Experience in Architecting, Designing and Building Enterprise Data Warehouses, Operational Data Stores and Data Marts.

• Involved in end-to-end Data Warehouse life cycle implementations; worked on both Data Management and Reporting & Analytics initiatives.

• Extensively worked on Data Warehouse environments with domain knowledge in Insurance, Banking and Telecommunications.

• Informatica (Data Integration): 8+ years of experience in Data Warehousing and Extraction, Transformation and Loading (ETL) of data from various sources into Data Warehouses and Data Marts using Informatica PowerCenter 9.1/8.6/8.1/7.1.1.

• Data Modeling: 3 years of Dimensional Data Modeling experience with Erwin 4.5/4.0, Ralph Kimball & Bill Inmon approaches, Star/Snowflake schemas, Data Marts, OLAP, Fact & Dimension tables, and Physical & Logical data modeling.

• Experienced in performance tuning of Informatica (sources, mappings, targets and sessions).

• Database (Oracle, DB2): Seven plus years of database experience using Oracle 11g/10g/9i; experience in working with databases like MS SQL Server 2000/2005 using SQL, PL/SQL, SQL*Plus and SQL*Loader. Well versed in the TOAD and PL/SQL Developer 3.0 interfaces.

• Expert in Tuning Oracle SQL, PL/SQL Database Objects.

• Experienced in Repository Configuration/using Transformations, creating Informatica Mappings, Mapplets, Sessions, Worklets, Workflows, Processing tasks using Informatica Designer / Workflow Manager to move data from multiple source systems into targets.

• Experienced in Installation, Configuration and Administration of Informatica PowerCenter 9.x/8.x.

• Worked end-to-end (E2E) in DWH, from installation of Informatica to reporting using OBIEE 11g.

• Expert in the development of stored Procedures, Functions, Packages and Triggers.

• Experience in development of database objects like tables, indexes, Materialized Views, Procedures and synonyms.

• Experience in query optimization, debugging, tuning and improving the performance of the database using Oracle tools such as Hints and Explain Plan.

• Strong PL/SQL programming skills and expertise with SQL tuning, performance tuning and Bulk binding techniques.

• Programming: Well experienced in UNIX shell scripting, PL/SQL.

• Expertise in preparing Functional & Technical Specifications besides User Guides.

• Experienced in Coordinating and leading off shore Development team.

• Expertise in Off-shore/On-site work culture, leading and mentoring Data Warehouse teams.

• Worked extensively on agile projects – multiple projects with release timelines defined.

• Provided 24x7 support for the projects during go-live and post-production support activities.

• Excellent interpersonal and communication skills; technically competent and result-oriented, with problem-solving skills and the ability to work independently and use sound judgement.

• Experience in documenting High Level Design, Low Level Design, STMs, unit test plans, unit test cases and deployment documents.

• Successfully completed a pilot project on Big Data using the Hadoop framework.

• Worked on HDFS with Hive, Sqoop, Flume, Spark (RDD) and Spark SQL.

• Working knowledge of AutoSys and IMM as well.
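The bulk-binding technique mentioned above can be sketched as follows. This is a minimal illustration only; the table and column names (stg_policy, policy_id, load_flag) are hypothetical, not from any of the projects described here.

```sql
-- Sketch of PL/SQL bulk binding (hypothetical staging table).
-- BULK COLLECT fetches rows in batches and FORALL issues a single
-- bulk DML, reducing context switches between PL/SQL and SQL.
DECLARE
  TYPE t_ids IS TABLE OF stg_policy.policy_id%TYPE;
  l_ids t_ids;
  CURSOR c_src IS
    SELECT policy_id FROM stg_policy WHERE load_flag = 'N';
BEGIN
  OPEN c_src;
  LOOP
    FETCH c_src BULK COLLECT INTO l_ids LIMIT 1000;  -- batch of 1000 rows
    EXIT WHEN l_ids.COUNT = 0;
    FORALL i IN 1 .. l_ids.COUNT                     -- one bulk UPDATE per batch
      UPDATE stg_policy
         SET load_flag = 'Y'
       WHERE policy_id = l_ids(i);
    COMMIT;
  END LOOP;
  CLOSE c_src;
END;
/
```

The LIMIT clause keeps PGA memory bounded; without it, BULK COLLECT would try to fetch the entire result set into the collection at once.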

Technical Expertise:

Tools: Informatica PowerCenter, Hive, Spark, Flume, IMM, AutoSys
Reporting Tools: OBIEE 11g
Databases/File Systems: Oracle 11g/10g/9i, DB2, MySQL, SQL Server 2000/2005, HDFS, HBase
Languages: PL/SQL, Unix Shell Scripting
Utilities: Putty, HP Quality Center, TOAD, SQL Developer, WinSCP, Cisco Remedy, Dollar U
Operating Systems: Unix, Linux, Windows 7, XP, NT, 2000 Server/Professional

Academic background:

• Bachelor of Technology, Sri Krishnadevaraya University, 2009

Experience Summary:

July 2014 – Till Date

Accenture Services Pvt Ltd, India / Accenture LLP, USA
Enterprise Financial Business Intelligence (EFBI)
DWH/Informatica Architect

Description: Farmers Insurance – Enterprise Finance Business Intelligence (EFBI) deals with Auto, Home and Umbrella policies. The source systems for Farmers that create transactions for Home and Auto are the FPPS and APPS applications, respectively.

From these applications, data is loaded into DB2 tables, and through ETL mappings this data is processed and loaded into the Farmers Data Warehouse, built in 2006. EFBI comes into play once data is loaded into the Data Warehouse.

In the legacy system, multiple downstream systems were producing similar information in different ways, leading to:

• Difficulty in Data Reconciliation.

• Poor quality of data and a costlier IT environment.

EFBI came into the picture to introduce:

• Single source for Financial Data.

• ABC (Audit Balance Control) Mechanism.

• Quality of data up to 99.99%, with consistent and simplified data.

Responsibilities:

• Prepared technical designs/specifications for data extraction, transformation and loading.

• Worked on the Informatica utilities Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformation Developer.

• Analyzed the sources, transformed and mapped the data, and loaded it into targets using Informatica PowerCenter Designer.

• Data modeling of three business events/processes in the data warehouse, data mart and staging layers (with delta detection); Oracle E-Business Suite module involved.

• Integration of the data sources Salesforce, Oracle and SQL Server through ODBC (SQL Server) and flat files (external tables), OMB scripting, dimension modeling.

• Migration of the current environment, release management, repository data modeling against the databases (Oracle, SQL Server) and Excel files using data federation, level-based measures and AGO functions, usage-tracking installation and dashboard design.

• Created reusable transformations to load data from operational data source to Data Warehouse and involved in capacity planning and storage of data.

• Developed complex mappings such as Slowly Changing Dimensions Type II-Time stamping in the Mapping Designer.

• Used various transformations like Stored Procedure, Connected and Unconnected lookups, Update Strategy, Filter transformation, Joiner transformations to implement complex business logic.

• Used Informatica Workflow Manager to create workflows, database connections, sessions and batches to run the mappings.

• Used Variables and Parameters in the mappings to pass the values between mappings and sessions.

• Created Stored Procedures, Functions, Packages and Triggers using PL/SQL.

• Implemented restart strategy and error handling techniques to recover failed sessions.

• Used Unix Shell Scripts to automate pre-session and post-session processes.

• Did performance tuning to improve Data Extraction, Data process and Load time.

• Wrote complex SQL Queries involving multiple tables with joins.

• Implemented best practices as per the standards while designing technical documents and developing Informatica ETL process
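The Slowly Changing Dimension Type II time-stamping approach described above can be sketched in plain SQL. This is an illustrative pattern only; the tables and columns (dim_customer, stg_customer, the address attribute, the sequence) are hypothetical, not the actual Farmers schema.

```sql
-- Sketch of SCD Type II with time stamping (hypothetical schema).
-- Step 1: expire the current version of any row whose tracked
-- attribute changed, by closing its effective-date window.
UPDATE dim_customer d
   SET d.eff_end_date = SYSDATE,
       d.current_flag = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_customer s
                WHERE s.customer_id = d.customer_id
                  AND s.address <> d.address);      -- change detected

-- Step 2: insert new current versions for changed and brand-new
-- customers (neither has an open current row after step 1).
INSERT INTO dim_customer
  (customer_key, customer_id, address,
   eff_start_date, eff_end_date, current_flag)
SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address,
       SYSDATE, DATE '9999-12-31', 'Y'
  FROM stg_customer s
 WHERE NOT EXISTS (SELECT 1
                     FROM dim_customer d
                    WHERE d.customer_id = s.customer_id
                      AND d.current_flag = 'Y');
```

In Informatica the same logic is typically implemented with a Lookup on the dimension plus an Update Strategy transformation routing rows to DD_UPDATE or DD_INSERT.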

Tools: Informatica PowerCenter 9.1, PL/SQL, DB2, UNIX, Shell Scripting, HP Quality Center, TOAD, HDFS, Hive, Spark, Spark SQL

Dec 2013 – July 2014

Accenture Services, India

Wells Fargo

IBCM (Investment Banking and Capital Markets)

Informatica Technical Lead

Responsibilities:

• Salesforce and iCatch are the sources; data is pulled from multiple sources and loaded into staging tables for reporting.

• Developed all the objects (entities) existing in Salesforce in a MySQL database.

• Performed thorough testing and reviews.

• Acted as lead for the Production Support team.

• Successfully made production releases and deployments every month

• Batch job monitoring (24x7) including Month End, Quarter End and Year-End Support.

• Integrated and imported data from various sources, then transformed and loaded it into Data Warehouse targets using Informatica.

• Participated in performance tuning of jobs that showed degradation in performance or started to impact SLAs.

• Involved in the UAT (User Acceptance Testing) phase.

• Ensuring the quality of the deliverables

• Prepared and reviewed various project documents and shared them with the customers.

• Worked as POC for development activities like documentation to describe program development, logic, coding, testing, changes and corrections. Used Informatica Workflow Manager to create workflows, database connections, sessions and batches to run the mappings.

• Used Variables and Parameters in the mappings to pass the values between mappings and sessions.

• Created Stored Procedures, Functions, Packages and Triggers using PL/SQL; implemented restart strategy and error handling techniques to recover failed sessions.

• Requirement Analysis, Impact analysis, Design and Development of Modules.

• Created procedures, functions, packages, ETL mappings, sessions, workflows and XMLs.

• Performance tuning of queries that were taking a long time to execute.

• Assigned tasks to the offshore team and reviewed the code.

• Creation of Test data and Testing.

• Coordinating with On-site counterparts.

• Responsible for all deliverables from offshore and onsite.

• DataMarts, OLAP, FACT & Dimensions tables, Physical & Logical data modeling.

• Extensively used database objects like tables, indexes, views, Materialized Views, triggers, packages, functions and procedures.

• Expertise in preparing Functional & Technical Specifications besides User Guides.

• Experienced in Coordinating and leading off shore Development team.

• Expertise in Off-shore/On-site work culture, leading and mentoring Data Warehouse teams.

Environment: Informatica PowerCenter 9.1/8.6, Windows 7, MySQL

June 2012 – Dec 2013
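The query-tuning workflow referred to above (long-running queries impacting SLAs) usually starts from the optimizer's plan. A minimal sketch, with a hypothetical trade table and index name:

```sql
-- Sketch of Oracle query tuning with EXPLAIN PLAN and a hint
-- (table and index names are hypothetical).
-- Capture the optimizer's plan for the statement, here forcing a
-- range scan on a date index via an INDEX hint.
EXPLAIN PLAN FOR
  SELECT /*+ INDEX(t idx_trade_date) */ t.trade_id, t.amount
    FROM trade t
   WHERE t.trade_date >= DATE '2014-01-01';

-- Render the captured plan from the plan table.
SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
```

Comparing the hinted plan's cost and access path against the unhinted one shows whether the optimizer's default choice is the bottleneck, before any hint is committed to production code.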

Tata Consultancy Services, India

BI Reporting, BellTel Communications, Philippines

ETL/BI Reporting Developer

Description: BellTel Communications is one of the leading telecommunication service providers, which implemented 4G/LTE for the Philippines. The BI Reporting team is responsible for everything from installation of ETL tools like Informatica and reporting tools like OBIEE 11g to the inbuilt reporting application OBIA. I got the opportunity to work end-to-end in DWH: requirement gathering, installation of the desired applications on HP AIX, and working closely with business users to design a database that supports report generation in OBIEE 11g. We worked with multiple teams such as CRM (Customer Relationship Management), OBRM (Oracle Billing and Revenue Management) and DMS (Domain Management System) to understand the complete system, so as to build and design a new database with facts and dimensions, which in turn was used to generate reports in OBIEE 11g. We built around 40 reports from scratch in 6 months, including knowledge sessions for business users on generating ad-hoc reports.

Responsibilities:

• Installation and Configuration of OBIEE 11g, Informatica 9.0.1 HF2, OBIA 7.9.6.3

• Taking regular rpd backups in OBIEE and repository backups in Informatica.

• Understanding the specification and analyze data according to client requirement.

• Gathering the requirements from several source components like CRM, BRM, DMS, Portal, etc., and designing the data model to meet the customer-required reports.

• Developing SCD type 2 mappings for Dimensions

• Incremental load mappings for fact tables and summary tables

• Developing reports in analytics according to report requirements.

• Presenting reports to the client and giving knowledge-sharing sessions to clients on ad-hoc reporting.

• Provided the secure reporting to clients.

• Created and scheduled Sessions and Batches through the Informatica Server Manager.

• Worked with sessions and batches using Server Manager to load data into the target database.

• Testing for Data Integrity and Consistency

• Developed complex mappings in Informatica Power Center to load the data from various sources using different transformations like Source Qualifier, Look up (connected and unconnected), Expression, Aggregate, Update Strategy, Joiner, Filter and Router.

• Designed and documented validation rules, error handling and the test strategy of the ETL process.
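The incremental fact-load pattern described above (delta loads for fact and summary tables) can be sketched as follows. All names here (fact_usage, stg_usage, etl_control, the sequence) are hypothetical illustrations, not the BellTel schema.

```sql
-- Sketch of an incremental fact load (hypothetical schema).
-- Only staging rows changed since the last successful run are
-- picked up, using a last-extract timestamp as the delta marker.
INSERT INTO fact_usage (usage_key, customer_key, usage_date, minutes)
SELECT fact_usage_seq.NEXTVAL, d.customer_key, s.usage_date, s.minutes
  FROM stg_usage s
  JOIN dim_customer d
    ON d.customer_id = s.customer_id
   AND d.current_flag = 'Y'                      -- surrogate-key lookup
 WHERE s.last_update_ts > (SELECT last_extract_ts
                             FROM etl_control
                            WHERE job_name = 'FACT_USAGE_LOAD');

-- Advance the delta marker only after the load succeeds.
UPDATE etl_control
   SET last_extract_ts = SYSTIMESTAMP
 WHERE job_name = 'FACT_USAGE_LOAD';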

Environment: Informatica PowerCenter 9.1, OBIEE 11g, Oracle 11g, TOAD, HP AIX UNIX, OBIA 7.9.6.3

June 2009 – June 2012

Tata Consultancy Services, India

CISCO

Enterprise Data Business Intelligence(EDBI)

ETL Developer/Admin

Description: EDBI is a vast application for CISCO which deals with reporting on one end and a parallel migration of the database from Oracle to Teradata on the other. This involves multiple teams: Deployment Group, Teradata support, Informatica Administration, Oracle Configuration and Oracle support. I worked in the Informatica Admin and Oracle Configuration teams. Informatica admins are responsible for all admin activities such as installation of Informatica, support to fix Informatica bugs, repository backups, and load balancing across various nodes, whereas Oracle Configuration is the team responsible for making the environment ready for production with all necessary releases and pre-production runs.

Responsibilities:

• Scheduled ETL jobs through cron jobs in LINUX.

• Automating all the possible manual procedures related to job monitoring

• Worked in Production Support

• Worked on Pre-Prod Environment Configurations (like Unix, Informatica)

• Monthly Releases

• Batch job Monitoring

• Extensively involved in writing ETL Specifications for Development and conversion projects.

• Created shortcuts for reusable source/target definitions, Reusable Transformations, mapplets in Shared folder.

• Involved in requirement definition and analysis in support of Data Warehouse.

• Worked extensively on different types of transformations like Source Qualifier, Expression, Aggregator, Router, Filter, Update Strategy, Lookup, Sorter, Normalizer, Sequence Generator, etc.

• Worked with XSD and XML files generation through ETL process.

• Defined and worked with mapping parameters and variables.

• Designed and developed transformation rules (business rules) to generate consolidated (fact/summary) data using Informatica ETL tool.

• Performed the performance evaluation of the ETL for full load cycle.

• Checked session and error logs to troubleshoot problems, and also used the Debugger for complex mappings.

• Parameterized all variables and connections at all levels in UNIX.

• Created test cases for unit testing and functional testing.

• Coordinated with testing team to make testing team understand Business and transformation rules being used throughout ETL process

Environment: Informatica PowerCenter 8.6.1, Oracle 10g, Unix, SQL, PL/SQL, flat files, Cisco Remedy tool, Dollar U, Hadoop Framework (HDFS, Hive, Spark)
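The consolidated (fact/summary) transformation rules mentioned above amount to aggregating detail facts into a summary table. A minimal sketch with hypothetical tables (fact_orders, sum_orders_monthly):

```sql
-- Sketch of a fact-to-summary rollup (hypothetical schema).
-- Daily order facts are aggregated into a monthly summary table
-- that reporting queries can hit instead of the detail fact.
INSERT INTO sum_orders_monthly
  (customer_key, order_month, total_amount, order_count)
SELECT f.customer_key,
       TRUNC(f.order_date, 'MM') AS order_month,   -- first day of month
       SUM(f.amount),
       COUNT(*)
  FROM fact_orders f
 GROUP BY f.customer_key, TRUNC(f.order_date, 'MM');
```

In Informatica this corresponds to an Aggregator transformation grouped on the customer key and the truncated date.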

Contact No : +1-818-***-****

Email Id : ac24x0@r.postjobfree.com

Visa Type : H1B


