
Data Warehouse Engineer

Location:
Thousand Oaks, CA
Posted:
September 28, 2015


Resume:

Suma Kurnala Santa Clara, CA *****

View LinkedIn Profile Cell: 484-***-**** acrvtf@r.postjobfree.com

Over 10 Years of IT Experience in

DATA WAREHOUSE ANALYSIS / DESIGN / DEVELOPMENT / TESTING / IMPLEMENTATION

Overview

Experience applying ETL methodology for data extraction, transformation, and loading as part of corporate-wide ETL solutions using Informatica Power Center, Pentaho, and Oracle Warehouse Builder.

Extensive involvement in relational database / data warehouse environment including experience with slowly changing dimensions (SCD), operational data stores, and data marts.

Proficient in all phases of the software development lifecycle (SDLC) including requirements definition, system implementation, system testing, acceptance testing, production deployment, and production support.

Data warehousing project experience with ERP and CRM sources, extracting data from Oracle EBS modules such as Order Management, TCA, and Financials (accounts payable/receivable).

Implemented Informatica Data Quality (IDQ) as a web service that can be triggered from one or more applications. Exposed an Informatica mapplet as a web service and generated the WSDL URL using IDQ.

Utilized HiveQL for data review, querying, and analysis. Used Hive queries to extract Omniture data residing in Hadoop.

Experience loading data into the Teradata database. Implemented Teradata Parallel Transporter (TPT) connections and created stored procedures as part of performance tuning.

Well-honed SQL and PL/SQL script-writing skills on Oracle 9i, 10g, and 11g.

Exposure to dimensional data modeling, including star schema and snowflake schema design.

Performed data analysis and built reports and dashboards from high-level business requirements using OBIEE.

Developed Oracle BI solutions in OBIEE across all three layers (physical, business model, and presentation), including time-series objects; interactive dashboards with drill-down capabilities using global and local filters; security setup, groups, and access/query privileges; web catalog objects (dashboards, pages, folders, reports); and scheduling of iBots.

Followed Agile/Scrum methodologies for Software Development Life Cycle.

Experience working in an onsite-offshore model, coordinating with offshore developers and teams.

Areas of Expertise: Business Intelligence, Data Warehousing, Data Integration, Data Extraction, Data Analysis, Data Modeling, Production Support, Performance Tuning, Documentation, Reporting, Testing.

Technology Skills

Operating Systems: Win 98/2000/NT; Win XP; UNIX

Programming Languages: PL/SQL; SQL; HTML; XML; Shell Scripting

Databases: Oracle; Hive; SQL Server; Teradata 13.0; MySQL

ETL Tools: Informatica Power Center 9.6/9.5/9.1/8.6/7.1; IDQ; Pentaho 4.3; OWB (Oracle Warehouse Builder) 10g

Reporting: OBIEE 10.1.3.4.1; OBIEE 11.1.1.7; Tableau

Big Data: Hadoop; Hive; Map Reduce

Configuration Management: Perforce; GIT; AccuRev

Scheduling: Appworx; IBM Tivoli; Informatica Scheduler

Modeling: Microsoft Visio; Erwin 4.0

Other Tools: Agile PLM 9.3.1.1; JIRA

Utilities: TOAD; SQL Developer; Remedy; WINSCP

Experience

MCAFEE / INTEL SECURITY Santa Clara, CA

Sr. Data Warehouse Engineer – Omniture CDW Integration Aug’14 – Present

Track meta offer parameters from the landing page through home and cart, persisting them in ebizstats so the consumer data warehouse (CDW) can generate reports from the data.

Play an integral role in client interaction, design, coding, testing, release, and support.

Develop reliable, efficient, and maintainable McAfee CMSB data warehouse and ETL solutions.

Responsible for the entire BI database, ensuring timely data availability for end users.

Conduct detailed source-system analysis, source-to-target data analysis, data cleansing, data quality validation, and data testing and transformation analysis.

Develop batch and real-time ETL applications using Informatica to populate the data warehouse and perform unit testing / system integration testing.

Prepare technical specification documents and migration documents.

Accountable for analyzing and completing change requests within the specified time limits to adhere to SLA.

Develop ETL solutions utilizing Informatica to fetch data from external sources to populate the data warehouse.

Performance tuning of ETL jobs as well as database tuning.

Experience importing and exporting data between RDBMSs such as MySQL and SQL Server and HDFS/Hive using Sqoop. Write HQL queries for data integration.

Write ETL code on the Hadoop/Hive big data platform and create UDFs.

Experience designing both time-driven and data-driven automated workflows using Oozie.

Provide production support while monitoring jobs and working on enhancements and change requests.

Used Hive queries to extract Omniture data residing in Hadoop and implemented ETL solutions using Informatica to populate the data warehouse.
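
The actual extraction ran as HiveQL over Omniture clickstream data in Hadoop; the resume does not include the queries. As an illustration only, here is a minimal sketch of that kind of aggregation, using SQLite as a stand-in for Hive. The table and column names (hits, visit_date, page_url) are hypothetical.

```python
import sqlite3

# Hypothetical clickstream table standing in for Omniture data in Hive.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE hits (visit_date TEXT, page_url TEXT, visitor_id TEXT);
INSERT INTO hits VALUES
  ('2015-01-01', '/home', 'v1'),
  ('2015-01-01', '/cart', 'v1'),
  ('2015-01-01', '/home', 'v2');
""")

# Daily page-view counts per URL -- the shape of summary a downstream
# warehouse load would consume.
rows = conn.execute("""
SELECT visit_date, page_url, COUNT(*) AS page_views
FROM hits
GROUP BY visit_date, page_url
ORDER BY page_views DESC
""").fetchall()
print(rows)  # [('2015-01-01', '/home', 2), ('2015-01-01', '/cart', 1)]
```

In Hive the same GROUP BY would run as a MapReduce job over partitioned external tables; the SQL itself is nearly identical.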

Informatica Power Center 9.6/9.5.1, Pentaho 4.3, SQL Server, UNIX, Hive, MapReduce.

LOGITECH INC. Newark, CA

Sr. ETL Informatica Developer – SFDC CRM Data Integration Nov’13 – Jul’14

Integrated product-related data from in-house systems (Oracle ERP, RightNow, and Renga databases in MySQL) into the cloud (SFDC). The new SFDC system will replace the existing in-house CRM solution.

Developed and improved Informatica mappings for enhanced performance; utilized Informatica partitioning at session level.

Loaded data to the interface tables from multiple data sources such as text files and Excel spreadsheets as part of ETL jobs.

Experience in Informatica cloud integration with Salesforce as inbound as well as outbound.

Implemented Informatica Data Quality (IDQ) as a web service that can be triggered by one or more applications.

Exposed Informatica mapplet as a web-service and generated WSDL URL using IDQ.

Used Informatica transformations such as Java, Normalizer, XML Parser, Active/Passive Lookup, Transaction Control, SQL, Update strategy, Router, Aggregate, Expression, and Rank.

Improved workflow performance by identifying bottlenecks in targets, sources, mappings, and sessions.

Used materialized views to extract data from Oracle EBS modules such as Order Management and TCA, as well as financial data (accounts receivable and accounts payable).
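
In Oracle this is done natively with CREATE MATERIALIZED VIEW over the EBS tables. As a hedged sketch of the underlying idea — a precomputed summary refreshed from base tables — here is a SQLite/Python illustration; the table names (ar_invoices, mv_ar_by_customer) are hypothetical, not actual EBS objects.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE ar_invoices (customer_id INTEGER, amount REAL);
INSERT INTO ar_invoices VALUES (1, 100.0), (1, 50.0), (2, 75.0);
CREATE TABLE mv_ar_by_customer (customer_id INTEGER, total_due REAL);
""")

def refresh_mv(conn):
    """Full refresh: rebuild the summary from the base table, the way an
    Oracle materialized view does on a COMPLETE refresh."""
    conn.execute("DELETE FROM mv_ar_by_customer")
    conn.execute("""
        INSERT INTO mv_ar_by_customer
        SELECT customer_id, SUM(amount) FROM ar_invoices GROUP BY customer_id
    """)

refresh_mv(conn)
print(conn.execute(
    "SELECT * FROM mv_ar_by_customer ORDER BY customer_id").fetchall())
# [(1, 150.0), (2, 75.0)]
```

Oracle additionally supports FAST (incremental) refresh via materialized view logs, which avoids the full rebuild shown here.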

Analyzed data flow requirements and developed a scalable architecture for staging and loading data and translated business rules and functionality requirements into ETL procedures.

Informatica Power Center 9.5, 9.1.0, Toad, Oracle 11.2.0.3, PL/SQL, RDBMS, OBIEE 11.1.1.7, SQL Developer, UNIX.

AMAZON LAB126 Cupertino, CA

BI Consultant – MBR & WBR Automation Jul’12 – Oct’13

Amazon firmly established Monthly Business Review (MBR) and the Weekly Business Review (WBR) as two important Business Management System mechanisms. Provided data extracts that were used to drive various production planning metrics in the Kindle WBR document packages.

Participated in all phases of the software development life cycle (SDLC): Client Interaction, requirements gathering, design, coding, testing, release, support, and documentation.

Interacted with management to identify key dimensions and measures for business performance.

Helped define mapping rules from data sources and fields.

Extensively used RDBMS and Oracle concepts and involved in performance tuning.

Strong understanding of business intelligence and data warehousing concepts, with emphasis on ETL, the software development lifecycle (SDLC), and quality analysis.

Investigated processes to determine the optimal method for implementing changes while minimizing impact to existing systems.

Interacted with functional experts at all levels to understand business issues and challenges and to identify new opportunities.

Accountable for data analysis activities used for developing dashboards and other end user solutions.

Collaborated with business analysts and acted as liaison between technical departments and business units.

Developed data access queries and used OBIEE tools like Answers; built dashboards and assisted in development of complex ad hoc queries.

Informatica Power Center 9.1, Pentaho 4.3, Toad, Oracle 10g, PL/SQL, OBIEE 11g, Perforce, SQL Developer, RDBMS, Shell Programming.

SYMANTEC Mountain View, CA

ETL Developer – El Dorado May’10–Jun’12

Assisted Symantec with building its own platform for renewal customers to increase business and reduce costs.

Project encompassed the auto-billing system (ABS), which involved opt-in and opt-out options for customers and auto-renewal after successful transactions.

Developed a solid understanding of the functional specifications and architecture documents.

Wrote technical specifications and migration documents.

Created Informatica mappings, sessions, and workflows.

Worked on extraction, transformation, and loading of data using Informatica.

Developed a conceptual model using Erwin based on requirements analysis.

Wrote BTEQ scripts for complex extraction-logic mappings.
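
BTEQ is Teradata's batch SQL tool; scripts like these wrap extraction SQL with session control and error handling. As an illustration of the extraction-logic shape only, here is a join-based extract sketched in SQLite; the tables (orders, customers) and the renewal scenario are hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customers (cust_id INTEGER, region TEXT);
CREATE TABLE orders (order_id INTEGER, cust_id INTEGER, status TEXT);
INSERT INTO customers VALUES (1, 'EMEA'), (2, 'AMER');
INSERT INTO orders VALUES
  (10, 1, 'RENEWED'), (11, 2, 'LAPSED'), (12, 1, 'LAPSED');
""")

# Extract renewal candidates: lapsed orders joined to customer attributes,
# the kind of SELECT a BTEQ script would export for downstream loading.
rows = conn.execute("""
SELECT o.order_id, c.region
FROM orders o
JOIN customers c ON c.cust_id = o.cust_id
WHERE o.status = 'LAPSED'
ORDER BY o.order_id
""").fetchall()
print(rows)  # [(11, 'AMER'), (12, 'EMEA')]
```

In a real BTEQ script the SELECT would be bracketed by .LOGON/.LOGOFF and .EXPORT commands, which have no SQLite equivalent.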

Utilized TPT (Teradata Parallel Transporter) to improve performance while loading data into targets, and created stored procedures.

Performed complex defect fixes in various environments, such as UAT and SIT, to ensure proper delivery of the developed jobs into the production environment.

Identified and removed bottlenecks to improve the performance of mappings and workflows.

Generated Informatica user-defined functions and reused them for specific use cases.

Optimized the mappings using various optimization techniques and debugged, tested and fixed the mappings.

Informatica Power Center 9.1, Toad, Oracle 10g, PL/SQL, Teradata, Appworx, Erwin, Perforce, Shell Programming.

CISCO Milpitas, CA

ETL Consultant – Performance Metrics Central 2.0 Apr’09 – Apr’10

Performance Metrics Central (PMC) was a “one-stop-shop” location for Cisco and partners to review and manage performance on critical Cisco channel and CA partner programs. Identified target areas for profitable growth and operational efficiency by providing up-to-date, accurate, and consolidated information on performance metrics and operational indicators.

Gained a solid understanding of legacy system data and designed and built the target database schema.

Collaborated with management to identify key business performance metrics.

Defined mapping rules by identifying required data sources and fields.

Created entity relationship (ER) diagrams.

Performed dimensional modeling using star schemas (facts and dimensions).

Generated weekly and monthly reports on incidents handled by the support team.

Worked on data conversions and data loads using PL/SQL.

Designed and fully implemented a mapplet that updates a slowly changing dimension table while keeping full history.
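
The actual implementation was an Informatica mapplet; no code appears in the resume. To make the type-2 slowly-changing-dimension pattern concrete, here is a minimal sketch in Python/SQLite: expire the current row, then insert a new current row, so full history is preserved. All table and column names are hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
CREATE TABLE dim_customer (
    cust_id INTEGER, city TEXT,
    eff_date TEXT, end_date TEXT, is_current INTEGER)
""")
conn.execute(
    "INSERT INTO dim_customer VALUES (1, 'San Jose', '2010-01-01', '9999-12-31', 1)")

def scd2_update(conn, cust_id, new_city, change_date):
    """Type-2 update: close out the current version, add a new one."""
    conn.execute(
        "UPDATE dim_customer SET end_date = ?, is_current = 0 "
        "WHERE cust_id = ? AND is_current = 1",
        (change_date, cust_id))
    conn.execute(
        "INSERT INTO dim_customer VALUES (?, ?, ?, '9999-12-31', 1)",
        (cust_id, new_city, change_date))

scd2_update(conn, 1, 'Santa Clara', '2015-06-01')
print(conn.execute(
    "SELECT city, is_current FROM dim_customer ORDER BY eff_date").fetchall())
# [('San Jose', 0), ('Santa Clara', 1)]
```

In Informatica the same logic is typically built from a Lookup on the dimension, an Update Strategy transformation, and two target instances (update vs. insert).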

Contributed to support activities in 24x7-production support environment. Monitored jobs and worked on enhancements and change requests.

Gained familiarity with data analysis and building reports and dashboards using OBIEE.

Informatica Power Center 8.6, Oracle 10g/11g, PL/SQL, Toad, UNIX Scripting, OBIEE 10.1.3.4.1, Remedy.

PROJECT: MIC Migration

Programmer Analyst (Offshore) - COGNIZANT TECHNOLOGY SOLUTIONS LTD. Chennai, India Aug’05 – Mar’07

ETL Developer (Onsite) - PFIZER Parsippany, NJ Apr’07 – Mar’09

Combined the financial processing activities of 19 European countries into a single platform-focused suite.

Targeted accounting transactions such as invoice payment, VAT, tax returns, and expense reports in one dedicated site, enabling each Pfizer location to focus on its core business activities.

Involved in the analysis, design, and creation of the enterprise data warehouse.

Designed and built the physical data model for the data warehouse ODS using Erwin.

Developed Informatica mappings, mapplets, and transformations to load data from relational and flat file sources into the data warehouse.

Used SQL queries and PL/SQL database programming (writing packages, stored procedures/functions, and database triggers).

Created Informatica mappings with PL/SQL procedures/functions to build business rules to load data.

Worked extensively on the database triggers, stored procedures, functions, and database constraints.
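
The trigger work was done in Oracle PL/SQL, whose syntax (:NEW references, BEGIN/END blocks) differs in detail. As a hedged, minimal illustration of the database-trigger pattern, here is an AFTER INSERT audit trigger in SQLite via Python; the invoices/invoice_audit tables are hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE invoices (inv_id INTEGER, amount REAL);
CREATE TABLE invoice_audit (inv_id INTEGER, action TEXT);

-- Fires automatically after each insert, writing an audit row.
CREATE TRIGGER trg_invoice_insert AFTER INSERT ON invoices
BEGIN
    INSERT INTO invoice_audit VALUES (NEW.inv_id, 'INSERT');
END;
""")

conn.execute("INSERT INTO invoices VALUES (42, 199.99)")
print(conn.execute("SELECT * FROM invoice_audit").fetchall())
# [(42, 'INSERT')]
```

The Oracle equivalent would declare the trigger FOR EACH ROW and reference :NEW.inv_id inside a PL/SQL block, but the firing semantics are the same.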

Optimized the mappings using various optimization techniques.

Performance tuning of ETL jobs as well as database tuning.

Provided production support, which involved working on support tickets and enhancements.

Generated weekly and monthly status reports on the number of incidents handled by the support team.

Informatica Power Center 8.1, Oracle Warehouse Builder (OWB) 10g, Erwin 4.5, RDBMS, Oracle 10g/9i, PL/SQL, Toad, UNIX Scripting, IBM Tivoli.

Education

JAWAHARLAL NEHRU TECHNOLOGICAL UNIVERSITY Master of Technology: Mechanical Engineering


