
Data Warehouse

Location:
California
Posted:
June 26, 2015


SURESH GURRAPU

Email: acqf64@r.postjobfree.com Contact number: 770-***-****

PROFESSIONAL SUMMARY:

8+ years of IT experience, including 7 years in the analysis, design, development, maintenance and testing of BI and data warehousing projects.

Extensive experience in data warehousing and in implementing ETL (Extraction, Transformation, Loading) using Informatica Power Center 9.1/9.0/8.6/8.1/7.2/7.1.

Strong knowledge of all phases of the Software Development Life Cycle (SDLC), including development, testing and maintenance, on platforms such as UNIX and Windows NT/2000/XP/98.

Functional and technical expertise in decision support systems: data warehousing and ETL (Extract, Transform and Load) using Informatica.

Developed complex mappings in Informatica using transformations such as Unconnected/Connected Lookup, Router, Filter, Expression, Aggregator, Joiner and Update Strategy.

Expertise in ETL development and administration using Informatica.

Well versed in the architecture of Informatica Power Center 8.x and 9.x.

Experience with Informatica and Teradata utilities for extraction, transformation and loading (ETL).

Exposure to Erwin as a data modeling tool.

Knowledge of data warehousing concepts and dimensional modeling (star schema, snowflake schema).

Expertise in OLTP/OLAP system study, analysis and E-R modeling, and in developing database schemas such as star and snowflake schemas used in relational, dimensional and multidimensional modeling.

Good understanding of dimensional modeling and relational database management systems.

Experience in Teradata with TPT, FastLoad, MultiLoad, BTEQ and macros.

Experience in troubleshooting and performance tuning at the source, target, mapping, session and system levels of the ETL process.

Experience with TOAD to analyze tables, create indexes and import data from different schemas.

Conducted functionality, integration, system and UAT testing and investigated software bugs.

Developed UNIX shell scripts to schedule Informatica pre- and post-session operations.

Scheduled jobs and automated the workload using the Tidal Job Scheduler.

Proficient with a wide variety of sources, including Oracle, Teradata and flat files.

Proficient in coding Oracle and SQL Server stored procedures, triggers, cursors and indexes.

Strong interpersonal and excellent communication skills.
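As a sketch of the pre-/post-session shell scripting mentioned above, a typical post-session housekeeping script archives the flat files a session has consumed and drops an indicator file for the scheduler. All directory and file names below are illustrative, not taken from any of the projects described in this resume:

```shell
#!/bin/sh
# Hypothetical post-session housekeeping sketch: archive processed flat
# files and write an indicator file for the next scheduled job.
# Paths are demo values; a real script would take them from the job config.
SRC_DIR="${SRC_DIR:-/tmp/etl_demo/src}"
ARCH_DIR="${ARCH_DIR:-/tmp/etl_demo/archive}"

mkdir -p "$SRC_DIR" "$ARCH_DIR"
: > "$SRC_DIR/sales_20150626.dat"          # stand-in for a loaded flat file

# Move each processed file into the archive with a timestamp suffix
for f in "$SRC_DIR"/*.dat; do
    [ -f "$f" ] || continue
    mv "$f" "$ARCH_DIR/$(basename "$f").$(date +%Y%m%d%H%M%S)"
done

# Indicator file the scheduler (e.g. Tidal) can poll before starting the next job
touch "$ARCH_DIR/load_complete.ind"
echo "archived $(ls "$ARCH_DIR" | wc -l) file(s)"
```

In practice such a script would be wired in as the session's post-session command in the Workflow Manager, so it runs only after a successful load.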

EDUCATION/CERTIFICATIONS:

B.Tech, Nagarjuna University, Vijayawada India, 2003

TECHNICAL SKILLS:

ETL Tools

Informatica 9.x/8.x/7.x (Power Center), MDM 9.1, IDQ 9.6, B2B Data Transformation

Reporting tools

OBIEE 10.x

Languages

C, SQL, PL/SQL, C++, Shell Scripting.

Databases

Oracle 11g/10g/9i, Teradata, Netezza 7.x

Operating System

Windows 9x/NT/2000/XP, UNIX

Data Modeling Tools

Erwin 4.x

Scheduling tools

TIDAL

Office Applications

MS-Office 97/2000/XP

Other Tools

TOAD 8.5 and SQL*Plus (database tools)

PROFESSIONAL EXPERIENCE:

Client: CBS, Los Angeles, CA Aug 2014 – Present

Role: ETL/ Informatica Technical Lead

Description: CBS Corporation is an American mass media corporation focused on commercial broadcasting, publishing, billboards and television production, with most of its operations in the United States. CBS was established in 1928, when founder William Paley purchased 16 independent radio stations and christened them the Columbia Broadcasting System. Today, with more than 200 television stations and affiliates reaching virtually every home in the United States, CBS's total network lineup was watched by more than 130 million people a week during the 2012/2013 season. As a typical data warehouse project, it is divided into two streams: ETL (Extraction, Transformation and Loading) using Informatica and analytical reporting using Cognos.

Responsibilities:

Involved in analyzing requirements and in the design and development of the staging and target layers.

Prepared the HLD and ETL specs based on mapping documents.

Developed code based on the specs provided by the data architects.

Conducted walkthroughs of the TSD with client tech leads, team leads, data architects and DBAs.

Developed code and executed UTCs.

Developed BTEQ and FastLoad scripts, macros and partitions.

Conducted weekly status meetings to discuss that week's progress.

Created the job setup document, with all required properties, for the job scheduling tool.

Implemented performance tuning techniques using partitions, macros, etc.

Implemented performance tuning of SQL queries, sources, targets and sessions by identifying and rectifying bottlenecks.

Provided technical clarifications, assigned work to the offshore team and monitored their status daily.

Tested all mappings and sessions in the Development and UAT environments and migrated them to Production after successful validation.

Loaded bulk data from various sources into the Teradata database using BTEQ scripts.

Environment: Informatica Power Center 9.1, Teradata 13.X, Flat Files, UNIX, Windows XP.
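The BTEQ loading described above might follow a pattern like the sketch below. This is a minimal, hedged example: the logon string, database, table and column names are all hypothetical, and it assumes a Teradata environment:

```sql
.LOGON tdprod/etl_user,password;           -- illustrative credentials

/* Load staged rows into the target table */
INSERT INTO edw.sales_fact
SELECT * FROM stg.sales_stage;

.IF ERRORCODE <> 0 THEN .GOTO LOADFAIL;

/* Refresh optimizer statistics after the load (hypothetical column) */
COLLECT STATISTICS ON edw.sales_fact COLUMN (sale_id);

.LOGOFF;
.QUIT 0;

.LABEL LOADFAIL
.LOGOFF;
.QUIT 1;                                   -- non-zero exit so the scheduler flags the job
```

The `.IF ERRORCODE` check and non-zero `.QUIT` are what let a scheduler such as Tidal distinguish a failed load from a clean one.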

L&T INFOTECH LTD. Feb 2013 – Jul 2014

Client: HBO, New York,

Role: ETL/ Informatica Technical Lead

Description: HBO (Home Box Office) is an American premium cable and satellite television network owned by Home Box Office Inc., an operating subsidiary of Time Warner. HBO's programming consists primarily of theatrically released motion pictures and original television series, along with made-for-cable movies and documentaries, boxing matches and occasional stand-up comedy and concert specials. HBO is the oldest and longest continuously operating pay television service (basic or premium) in the United States, having been in operation since November 8, 1972. As a typical data warehouse project, it is divided into two streams: ETL (Extraction, Transformation and Loading) using Informatica and analytical reporting using BO. It has various modules such as ADI, MRI-CDB, Job Posting, RDS ASSET, EST Media Morph and DMO NonLinear.

Responsibilities:

Involved in analyzing requirements and in the design and development of the data warehouse environment.

Prepared the TSD and ETL specs based on the BSD and mapping documents.

Conducted walkthroughs of the TSD with client tech leads, team leads, data architects and DBAs.

Developed code and executed UTCs.

Conducted weekly status meetings to discuss that week's progress.

Created the job setup document for the job scheduling tool.

Provided technical clarifications, assigned work to the offshore team and monitored their status daily.

Worked with Informatica tools including Source Analyzer, Target Designer, Mapping Designer, Workflow Manager and Workflow Monitor.

Used transformations such as Joiner, Aggregator, Expression, Lookup, Filter, Union, Update Strategy, Stored Procedure and Router to implement complex logic in mappings.

Extracted data from flat files and Oracle and loaded it into Oracle and Netezza.

Implemented workflow tasks including Session, Command, E-mail and Event-Wait.

Worked with complex queries for data validation and reporting using SQL and PL/SQL.

Loaded bulk data from flat files into the Netezza database.

Wrote the BTEQ scripts required by the business specifications.

Involved in performance tuning using partitions for sessions and relational database tables.

Involved in performance tuning of SQL queries, sources, targets and sessions by identifying and rectifying bottlenecks.

Environment: Informatica Power Center 9.1, Netezza 7.x, Oracle 11g/9i, flat files, UNIX, Tidal

Accenture Services Pvt Ltd. Apr 2011 – Feb 2013

Client: GSK, UK

Role: ETL/ Technical Lead

Description: GSK is one of the major pharmaceutical companies, with operations across the US, Europe and other regions. GSK develops, produces and markets drugs licensed for use as medications, and also deals in generic and brand medications. In general, GSK's pharmaceutical products can be described as medicines and vaccines for human and animal use, whether prescription or over-the-counter. The company is subject to a variety of laws and regulations regarding the patenting, testing and marketing of drugs, and its business spans the research, development, manufacturing, distribution and regulation of products and services related to drugs and their components. As a typical data warehouse project, BATCH ONE is divided into two streams: ETL (Extraction, Transformation and Loading) using Informatica and analytical reporting using SAS.

Responsibilities:

Involved in analyzing requirements and in the design and development of the data warehouse environment.

Prepared the TSD and ETL specs based on the BSD and mapping documents.

Conducted walkthroughs of the TSD with client tech leads, team leads, data architects and DBAs.

Developed code and executed UTCs.

Conducted walkthroughs and created deployment documents to be run before moving to SIT, UAT and Production.

Conducted weekly status meetings to discuss that week's progress.

Created the job setup document for the job scheduling tool.

Provided technical clarifications, assigned work to the offshore team and monitored their status daily.

Loaded bulk data from various sources into the Teradata database using BTEQ scripts.

Wrote the BTEQ scripts required by the business specifications.

Transferred data using Teradata utilities such as SQL Assistant, FastLoad and MultiLoad.

Involved in performance tuning of SQL queries, sources, targets and sessions by identifying and rectifying bottlenecks.

Worked on Tidal to automate the workflows.

Tested all mappings and sessions in the Development and UAT environments and migrated them to Production after successful validation.

Monitored the performance of workflows and sessions and suggested performance tuning improvements after consulting with the WellPoint team.

Provided support for development, testing and implementation of technology upgrade projects (hardware and software).

Involved in knowledge transitions from the Development and Lights On teams regarding changes to existing applications in Production.

Environment: Informatica Power Center 9.0.1, Teradata 13.x, MDM, Oracle 11g/9i, flat files, UNIX, Windows XP
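A FastLoad script of the kind referred to above typically looks like the following sketch. The server, user, tables and input file are hypothetical, and it assumes a Teradata environment with an empty (or freshly dropped) target staging table, since FastLoad only loads empty tables:

```sql
SESSIONS 4;
LOGON tdprod/etl_user,password;            -- illustrative credentials

DROP TABLE stg.sales_err1;                 -- FastLoad requires fresh error tables
DROP TABLE stg.sales_err2;

BEGIN LOADING stg.sales_stage
    ERRORFILES stg.sales_err1, stg.sales_err2;

SET RECORD VARTEXT "|";                    -- pipe-delimited flat file

DEFINE sale_id   (VARCHAR(18)),
       sale_date (VARCHAR(10)),
       amount    (VARCHAR(18))
    FILE = /data/in/sales.dat;

INSERT INTO stg.sales_stage
VALUES (:sale_id, :sale_date, :amount);

END LOADING;
LOGOFF;
```

Loading raw VARTEXT into a staging table and converting types later in BTEQ or Informatica is a common pattern, since FastLoad itself does minimal transformation.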

Hexaware Technologies Ltd Oct 2006 – Mar 2011

Client: IMS, UK

Role: ETL/Informatica Developer

Description: IMS (Intercontinental Medical Statistics) Health is an international company that supplies the pharmaceutical industry with sales data and consulting services. IMS today is recognized as the leader in its field, the world's leading provider of market intelligence to the pharmaceutical and healthcare industries. The IMS Enterprise Warehouse (IMS EW) is a Pharmaceutical Technology Services-wide initiative to track performance metrics across the globe. IMS EW tracks data from various source systems, such as SFDC (Sales Force Automation) and SAP, to derive the complex metrics used to measure the performance of each site. As a typical data warehouse project, IMS EW is divided into two streams: ETL (Extraction, Transformation and Loading) using Informatica and analytical reporting using OBIEE. IMS EW helps business users track the performance of every site by ranking the sites among themselves.

Responsibilities:

Involved in requirements gathering, analysis and implementation of data warehousing efforts.

Prepared transformation specifications for every mapping.

Contributed to the ETL design and its implementation.

Extensively used Informatica to load data from different data sources into the Oracle data warehouse.

Imported sources and targets to create mappings and developed transformations using the Designer.

Created mappings using Joiner, Expression, Aggregator, Filter, Router, Sorter, Sequence Generator, Update Strategy and Lookup transformations.

Tuned Oracle queries using a variety of techniques.

Tuned Oracle performance using indexes such as bitmap and B-tree indexes.

Prepared unit test cases.

Raised requests to migrate code between environments and provided the required documentation and DB scripts to the concerned teams.

Coordinated with developers, the testing team and the DBA for system testing and performance testing.

Remained available to team members for questions about the warehouse and for technical issues.

Environment: Informatica 8.6.1, Oracle 9i, OBIEE 10.1.3.4 and UNIX

Client: Dexia, Belgium

Role: ETL/Informatica Consultant

Description: Dexia is a European bank which concentrates its activity in public sector banking, offering complete banking and financial solutions to local public sector operators, and in retail and commercial banking. This project involved developing a data warehouse for Dexia based on four data marts: Accounts, Loans, Credit Cards and Insurance. Each data mart represents a collection of data pertaining to a single business process. The loan data mart covers Dexia's disbursal of loans for various purposes, such as personal, educational, vehicle, housing and consumer durable loans. The company requires different levels of analysis regarding loan amounts, loan types, customer types, payment schedule types, interest rates (variable or fixed), defaulter lists, penal interest calculations, etc. Dexia needed a data warehouse to maintain historical data in a central location for integration and to analyze the business in different locations by profit area.

Responsibilities:

Involved in gathering and analyzing requirements and preparing business rules.

Developed and maintained ETL (data extract, transformation and loading) mappings to extract data from multiple source systems, such as SAP and flat files, and load it into Oracle.

Implemented workflow tasks including Session, Command and E-mail.

Wrote SQL queries to check the consistency of data in the tables and to update them as required.

Used pre-session and post-session scripts to drop and recreate indexes on target tables.

Tuned sources, targets, transformations and mappings to remove bottlenecks for better performance.

Involved in fixing invalid mappings, testing stored procedures and functions, and performance and unit testing of Informatica sessions, batches and target data.

Worked with developers to troubleshoot and resolve mapping/workflow/session logic as well as performance issues in the Development, Test and Production repositories.

Environment: Informatica 7.1.2, Oracle 8i, Flat Files and UNIX.
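The pre-/post-session index handling mentioned above usually amounts to a pair of small SQL scripts along these lines. The table and index names are hypothetical, used only to illustrate the pattern:

```sql
-- Pre-session script: drop the index so the bulk load avoids
-- per-row index maintenance during the insert.
DROP INDEX idx_loan_fact_cust;

-- Post-session script: rebuild the index once the load has finished.
CREATE INDEX idx_loan_fact_cust ON loan_fact (customer_id);
```

Dropping before and rebuilding after is typically faster than maintaining the index row by row for large loads, at the cost of a window in which queries cannot use the index.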

Client: Alliance Bernstein, USA

Role: ETL/Informatica Consultant

Description: Alliance Bernstein is a U.S.-based investment management firm. It provides diversified, global investment management services that include growth, thematic and value equities, blend strategies and fixed income services to institutional, high net worth and retail clients worldwide. It provides separately managed accounts, hedge funds, mutual funds and other investment vehicles for private clients, including high net worth individuals, trusts and estates, charitable foundations, partnerships, private and family corporations, and other entities. The ETL process loads funds information, accounts information and other financial transactional data from source files into a target Oracle database. The ETL is broken into three areas based on the target area: Extract, Staging and Target. The project comprises three modules and involves running the facts every four hours to meet the requirements.

Responsibilities:

Involved in gathering and analyzing requirements and preparing business rules.

Developed and maintained ETL (data extract, transformation and loading) mappings to extract data from multiple source systems, such as SAP and flat files, and load it into Oracle.

Implemented workflow tasks including Session, Command and E-mail.

Wrote SQL queries to check the consistency of data in the tables and to update them as required.

Used pre-session and post-session scripts to drop and recreate indexes on target tables.

Tuned sources, targets, transformations and mappings to remove bottlenecks for better performance.

Involved in fixing invalid mappings, testing stored procedures and functions, and performance and unit testing of Informatica sessions, batches and target data.

Worked with developers to troubleshoot and resolve mapping/workflow/session logic as well as performance issues in the Development, Test and Production repositories.

Environment: Informatica 7.1.1, Oracle 8i, Flat Files and UNIX.
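The SQL consistency checks described above can be as simple as reconciling staging against target after a load. A minimal Oracle-flavored sketch, with all table and column names hypothetical:

```sql
-- Rows that were staged but never reached the target table
SELECT s.account_id
FROM   stg_accounts s
LEFT JOIN accounts_dim t ON t.account_id = s.account_id
WHERE  t.account_id IS NULL;

-- Row-count reconciliation; the two counts should match after a clean load
SELECT (SELECT COUNT(*) FROM stg_accounts) AS staged_rows,
       (SELECT COUNT(*) FROM accounts_dim) AS loaded_rows
FROM   dual;
```

Checks like these are often run as post-load validation queries, with any mismatch failing the job before downstream reporting picks up the data.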

Client: AGS (Al Gurg Stationery), UAE

Role: Manual Testing

Description: Al Gurg Stationery (AGS) is one of the largest stationery suppliers in the U.A.E. market. Al Gurg Stationery deals in some of the finest stationery products, ranging from paper clips, pens and box files to sophisticated drawing office equipment and accessories. The system consists of four interfaces: Customer Interface, Business Data Management Interface, SAP Interface and Payment Gateway Interface. The Customer Interface is the entry point for customers and the sales force. The role involved regression testing of the Customer Interface.

Responsibilities:

Understood client requirements by studying the functional document.

Delivered KT sessions to team members on regression testing using SilkTest and SCTM.

Played a crucial role in evolving and supporting the team on SilkTest at Hexaware, Mumbai.

Implemented the RBT (Requirements Based Testing) approach.

Identified test requirements and verified that the test cases fully covered the business functionality.

Executed scripts and reported bugs through the defect tracking tool.

Worked with developers to troubleshoot and resolve mapping/workflow/session logic as well as performance issues in the Development, Test and Production repositories.

Environment: Manual Testing, SilkTest, SCTM (Test Management), StarTeam, Java, JSF, AJAX, J2EE, MS SQL Server 2005, Windows XP.


