
Data Warehouse Manager

Location: Plano, TX 75093
Salary: $96,000 - $110,000
Posted: July 01, 2012


Resume:

Sampath Rudravaram

Data Warehouse/Business Intelligence Solutions Architect

Email: kxu3g0@r.postjobfree.com

Phone: 719-***-****

Address: **** ************** ****, *** ***

Plano, TX 75093

Summary

I am a highly motivated, solution-oriented data warehouse specialist with notable success in implementing a broad range of Data Warehouse and Business Intelligence projects, and with expertise in process improvement, systems integration, requirements analysis, workflow design, project management, leadership, process modeling, testing/quality assurance, software development, customer service, and estimating.

Education

New Mexico State University

MS, Computer Science, 2004 – 2006, 3.8 GPA

Jawaharlal Nehru Technological University

B.Tech, Electrical and Electronics Engineering, 2000 – 2004

Highlights:

Over 9 years of IT experience, primarily in Data Warehouse ETL design and development, data acquisition and analysis, Business Intelligence reporting, data modeling, project management, requirements gathering, and testing.

Experience on large, complex projects doing requirements elicitation, working directly with business users.

Experience in implementing a BI/DW Infrastructure on a large scale: ETL, Data Warehouse, Data Marts, and Analytics.

Experience in complete life-cycle implementation of an Enterprise Data Warehouse; good understanding of the Ralph Kimball and Bill Inmon data warehouse methodologies and knowledge of data design models such as Star schema and Snowflake.

Extensive experience and in-depth understanding of Data Warehousing concepts and ETL methodologies.

Experience with various development methods like "waterfall" and "iterative".

Experience in providing technical and process direction to other developers, including mentoring others in BI, ETL, and Informatica.

Experience in Dimensional and Relational Data Modeling, Star schema/Snowflake modeling, fact and dimension tables, and Physical and Logical Data Modeling using Sybase PowerDesigner, Case Studio, and Erwin.

Strong problem-solving skills, including direct experience with production systems and production support.

Strong data analysis skills, with data profiling experience.

7+ years of strong experience in Analysis, Design, Development, Testing, performance tuning and deployment of ETL jobs.

7+ years of experience in designing complex Extract, Transform & Load (ETL) strategies using ETL tools such as Informatica PowerCenter 8.x and SSIS.

Extensive experience with Oracle database versions 11g, 10g, 9i, and 8i, Teradata (BTEQ, FastLoad, MultiLoad, TPump, FastExport, and Queryman), DB2, and MS SQL Server, including SQL and PL/SQL programming, packages, triggers, constraints, materialized views, external tables, T-SQL, and dynamic SQL.

Extensive experience using UNIX (Korn shell) shell scripting to create data migration programs and schedule jobs.

Experience in design and development of reports and cubes using BI products: Business Objects 6.0 (Universe and Crystal Reports), Cognos Impromptu and Cognos ReportNet, and SSRS and SSAS on Windows.

Other tools: SQL Expert, SSMS, SQL Navigator, Explain Plan, TOAD, Export/Import, SQL*Loader, Visual SourceSafe, QMF for Windows, and the Microsoft Office suite.

Results-oriented, with experience applying agile development practices for rapid delivery in an enterprise environment.

Experience with project management and workflow software, applying critical path and risk management along with innovative tools and metrics.

Energetic, responsive, and detail-oriented, with a proven ability to find solutions to clients' needs using innovative methods for getting close to customers and achieving customer focus, loyalty, and satisfaction.

Able to manage multiple projects and produce quality work under tight deadlines.

Good communication and people skills, including the ability to work with both technical and non-technical people and to build consensus across multiple teams and departments.

Outgoing, friendly team player who takes the job seriously, ensuring 24/7 availability.

Experience

Architect - Data Warehouse/Business Intelligence at RealPage, Inc.

August 2011 - Present

Tools and Environment

SSIS, Cognos 10 (Framework Manager and Transformer), MS SQL Server 2008 R2, MS SQL Server 2005, Java, Erwin, Case Studio, T-SQL

Project Summary:

BI as a Service: Created a top-down (Bill Inmon approach) data warehouse for several business areas, such as Marketing Services, Operational Services, Residents, Facilities, and Candidate Screening, for the multi-family housing industry. The data warehouse will serve thousands of customers through data marts that utilize Cognos packages and cubes. The solution consolidates data hosted across multiple products into a single source for a real estate development company that focuses on multi-family residential projects, including real-estate market analysis. Multiple data marts will be used for analysis reports, and the master warehouse will be used for benchmarking and trending. Expected warehouse growth is over 100 TB per year.
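
For illustration only, the fragment below sketches the kind of star-schema structures such a warehouse feeds into its data marts, in T-SQL for SQL Server 2008. All object names (DimProperty, FactOccupancy, and so on) are hypothetical and are not taken from the actual RealPage schema.

-- Hypothetical star-schema fragment for a multi-family data mart (illustrative only).
CREATE TABLE dbo.DimProperty (
    PropertyKey    INT IDENTITY(1,1) PRIMARY KEY,  -- surrogate key
    PropertyID     VARCHAR(20)  NOT NULL,          -- natural/business key from the source system
    PropertyName   VARCHAR(100) NOT NULL,
    City           VARCHAR(50)  NULL,
    State          CHAR(2)      NULL,
    EffectiveDate  DATE         NOT NULL,          -- SCD Type 2 row validity start
    ExpiryDate     DATE         NULL               -- NULL for the current row
);

CREATE TABLE dbo.FactOccupancy (
    DateKey        INT NOT NULL,                   -- reference to a date dimension
    PropertyKey    INT NOT NULL REFERENCES dbo.DimProperty (PropertyKey),
    UnitsTotal     INT NOT NULL,
    UnitsOccupied  INT NOT NULL,
    RentCollected  DECIMAL(18,2) NOT NULL
);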

Responsibilities:

Designed and developed scalable and reusable ETL components that reflect complex business logic using MS SQL Server 2008 SSIS.

Designed and developed complex stored procedures and T-SQL scripts to support ETL processing and performance (a minimal sketch of a typical incremental-load procedure appears after this list of responsibilities).

Decoded and converted complex Java and T-SQL code into SSIS ETL routines.

Developed ETL specifications, source-to-target mappings, and other documentation required for ETL development; ensured that documentation was complete and stored in the appropriate repository.

Performed unit tests during development to verify compliance with specifications.

Fine-tuned complex queries against very large databases for performance.

Designed indexing strategies that provide optimum query performance while balancing ETL load performance.

Created logical and physical data models that serve as blueprints for project engagements of all complexities, using Erwin and Case Studio.

Designed the implementation of data transformation and change-processing procedures to ensure that data from multiple sources is aggregated, normalized, updated, and then published to multiple target environments.

Developed proof-of-concept designs and technical guidelines for the architectural blueprints and facilitated the development of best practices and standards.

Performed analysis of business processes in order to determine and develop technical solutions to meet business objectives.

Translated business requirements and models into feasible and acceptable data warehouse designs.

Wrote technical specifications and determined time estimates.

Designed and built appropriate data repositories and dimensional databases to ensure that business needs are met.

Interacted with clients while designing internal and external data interfaces to ensure that database development meets client specifications.

Interacted with Business Analysts and staff within business units to determine options for addressing business requirements.

Reviewed and analyzed new software tools and packages.

Provided technical leadership for the company's commercial reporting solution strategy, including the definition and implementation of security requirements, metadata standards, and a data dictionary.
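
As noted in the stored-procedure and change-processing items above, much of the ETL relied on incremental ("delta") loads. The procedure below is only a generic T-SQL sketch of that pattern for SQL Server 2008; the schema, table, and column names (stg.LeaseActivity, dw.FactLeaseActivity) are hypothetical, not the production objects.

-- Generic incremental-load pattern: update changed rows, insert new rows (illustrative only).
CREATE PROCEDURE dw.LoadFactLeaseActivity
    @LoadDate DATE
AS
BEGIN
    SET NOCOUNT ON;

    BEGIN TRY
        BEGIN TRANSACTION;

        -- Apply corrections for rows that already exist in the fact table.
        UPDATE f
        SET    f.RentAmount  = s.RentAmount,
               f.LeaseStatus = s.LeaseStatus
        FROM   dw.FactLeaseActivity AS f
        JOIN   stg.LeaseActivity    AS s
               ON s.LeaseID = f.LeaseID
        WHERE  s.ExtractDate = @LoadDate;

        -- Insert rows not yet present in the fact table.
        INSERT INTO dw.FactLeaseActivity (LeaseID, PropertyKey, RentAmount, LeaseStatus, LoadDate)
        SELECT s.LeaseID, s.PropertyKey, s.RentAmount, s.LeaseStatus, @LoadDate
        FROM   stg.LeaseActivity AS s
        WHERE  s.ExtractDate = @LoadDate
          AND  NOT EXISTS (SELECT 1
                           FROM   dw.FactLeaseActivity AS f
                           WHERE  f.LeaseID = s.LeaseID);

        COMMIT TRANSACTION;
    END TRY
    BEGIN CATCH
        IF @@TRANCOUNT > 0 ROLLBACK TRANSACTION;
        DECLARE @msg NVARCHAR(2048) = ERROR_MESSAGE();
        RAISERROR(@msg, 16, 1);  -- surface the error to the calling package or agent job
    END CATCH
END;

In practice a procedure like this would typically be invoked from an SSIS Execute SQL Task after the staging load completes.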

Data Warehouse Developer - ETL at FedEx Services

June 2006 - August 2011

Tools and Environment:

Oracle 9.2, Oracle 10g, Teradata V2R4, Informatica 8.x, Business Objects, Crystal Reports, Cognos Impromptu, UNIX

Project Summary:

Service and operations measurement: Created a solution for measuring the service quality of FedEx deliveries for the FedEx Freight segment. FedEx is a 40 billion dollar company that boasts 99.8% accuracy in its deliveries, and in order to maintain this high level of service, it is critical for the company to monitor the service it provides to its customers on a daily basis. The application, Service Quality Indicators (SQI), is among the top 3 most important applications specified by the CEO. The application measures over 25 metrics driven by complex business logic. The data warehouse solution follows a bottom-up (Kimball approach) design. I was the subject matter expert (SME) for this application. The reporting solution was implemented in SAP Business Objects with a Java-based GUI.

Operations Performance Excellence is an application that measures the performance of drivers, dock workers, and part-time laborers in FedEx Freight service centers. A dashboard application was built using SAP Business Objects, driven by real-time updates. Implemented an Active Data Warehouse (ADW) through Change Data Capture (CDC) using Informatica, achieving near real-time data with an update cycle of every few minutes (a generic sketch of the change-apply step follows this project summary). The data warehouse consisted of more than 50 tables measuring almost 30 different metrics.

The data warehouse databases were developed in Oracle 10g, with sources from DB2 (OLTP) and Teradata (Enterprise Data Warehouse, EDW). I developed many complex ETL solutions for extracting, transforming, and loading data to and from all these environments using Informatica.
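
The CDC-driven loads described above were implemented in Informatica rather than hand-written SQL; purely as an illustration of the apply step, the following generic Oracle-style MERGE upserts captured changes into a target table. All object names (stg_driver_perf_cdc, dw_driver_perf, dw_load_control) are hypothetical.

-- Illustrative only: apply captured changes (CDC) newer than the last processed timestamp.
MERGE INTO dw_driver_perf t
USING (
        SELECT driver_id,
               svc_center_id,
               shipments_handled,
               hours_worked,
               capture_ts
        FROM   stg_driver_perf_cdc
        WHERE  capture_ts > (SELECT NVL(MAX(last_capture_ts), DATE '1900-01-01')
                             FROM   dw_load_control
                             WHERE  table_name = 'DW_DRIVER_PERF')
      ) s
ON (t.driver_id = s.driver_id AND t.svc_center_id = s.svc_center_id)
WHEN MATCHED THEN
  UPDATE SET t.shipments_handled = s.shipments_handled,
             t.hours_worked      = s.hours_worked,
             t.last_update_ts    = s.capture_ts
WHEN NOT MATCHED THEN
  INSERT (driver_id, svc_center_id, shipments_handled, hours_worked, last_update_ts)
  VALUES (s.driver_id, s.svc_center_id, s.shipments_handled, s.hours_worked, s.capture_ts);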

Responsibilities:

Designed and developed mappings/sessions/workflows to move data from source to reporting data marts for complex Business Objects Business Intelligence projects.

Developed ETL mappings and scripts with Informatica PowerCenter 8.x using Designer (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation Developer), Repository Manager, Workflow Manager, and Workflow Monitor.

Created transformations such as Lookup, Joiner, Rank, Source Qualifier, Sorter, Aggregator, dynamic Lookup, and Router using Informatica Designer, and processing tasks using Workflow Manager, to move data from multiple sources into targets.

Tuned ETL performance of targets, sources, mappings, and sessions using components such as parameter files, variables, and dynamic cache; worked extensively on performance tuning using round-robin, hash auto-key, and key-range partitioning.

Tuned SQL queries for databases such as Oracle, DB2, and Teradata.

Unit and Integration Testing of ETL components

Created various UNIX shell scripts for scheduling data cleansing scripts and loading processes.

Production on-call support for FXF Data Warehouse applications.

Business requirement gathering and analysis.

System and technical requirement analysis.

Data Acquisition research and analysis.

Converted business requirements into high-level and detailed designs (transformation specifications), test cases, and test plans.

Interacted with business users to identify key dimensions and measures for different Key Performance Indicators (KPIs).

Developed reports using Crystal Reports.

Involved in defining reporting standards for the report development team.

Developed and modified catalogs using Cognos Impromptu.

Technical lead for the migration of reports from Cognos 7 to Business Objects 6; responsibilities included business requirement gathering, design documentation, migration plans, implementation plans, end-user training, and testing.

Designed the exception and error-handling methodology.

Participated in design reviews.

Subject Matter Expert (SME) on various applications in FedEx Freight Data Warehouse environment.

Internal project management for small projects.

Created test plans and test cases.

Worked closely with Data Architects for data modeling of various data warehouse tables.

Developed PL/SQL scripts for data validation, data integration, data cleansing, error reporting, and performance enhancements (a minimal sketch of the validation pattern follows).
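
As an illustration of the validation and error-reporting pattern mentioned in the last item, the following minimal PL/SQL block routes rows that fail basic checks to a reject table and loads only the clean rows. The table names (stg_shipment, shipment_reject, dw_shipment) and rules are invented for the example and do not reflect the actual FedEx schema.

-- Illustrative only: simple validate-then-load pattern with a reject table.
DECLARE
  v_rejects PLS_INTEGER := 0;
BEGIN
  -- Record rows with missing keys or impossible dates.
  INSERT INTO shipment_reject (shipment_id, reject_reason, rejected_at)
  SELECT s.shipment_id,
         CASE
           WHEN s.customer_id IS NULL THEN 'Missing customer_id'
           WHEN s.ship_date > SYSDATE THEN 'Ship date in the future'
           ELSE 'Unknown'
         END,
         SYSDATE
  FROM   stg_shipment s
  WHERE  s.customer_id IS NULL
     OR  s.ship_date > SYSDATE;

  v_rejects := SQL%ROWCOUNT;

  -- Load only the rows that passed validation.
  INSERT INTO dw_shipment (shipment_id, customer_id, ship_date, weight_lbs)
  SELECT s.shipment_id, s.customer_id, s.ship_date, s.weight_lbs
  FROM   stg_shipment s
  WHERE  s.customer_id IS NOT NULL
    AND  s.ship_date <= SYSDATE;

  COMMIT;
  DBMS_OUTPUT.PUT_LINE('Rejected rows: ' || v_rejects);
END;
/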

Programmer/Analyst - Data Warehouse/Business Intelligence (Cognos) at New Mexico State University (IRPOA)

August 2004 - June 2006

Tools and Environment:

Oracle 9i, Cognos ReportNet, SAS, C, HTML, MySQL/Dreamweaver, PL/SQL

Project Summary:

University Banner Initiative: A prestigious, multi-million dollar Banner project aimed at converting the old Finance and HR modules to new software and creating a data warehouse for the university, led by the Institutional Research and Planning organization (IRPOA). The data warehouse implementation was the first of its kind at the university and changed the way the university accesses reports. The BI platform was Cognos ReportNet, and ETL was performed through complex PL/SQL routines.

Developed PL/SQL ETL, error checking/data cleansing scripts, UNIX shell scripts and Oracle DDL Scripts.

Developed and tested Cognos ReportNet and Cognos PowerPlay reports.

Developed and created packages using Cognos Framework Manager.

Created Ad-hoc reports using Query Studio.

Created complex reports using Report Studio.

Created Multi-Dimensional reports using PowerPlay.

Created reports with unions, joins, master-detail reports, drill-through, prompts, complex calculations, formatting, conditional formatting, bursting reports, crosstab, multi-query reports & charts.

Involved in the design, development and implementation of PowerPlay Cubes.

Involved in scheduling and categorization of Cognos ReportNet and Cognos PowerPlay reports.

Gathered and documented the business rules from the business partners.

Created test plans.

Created various employee and student surveys using Dreamweaver and MySQL database.

Developed SAS programs.


