
Data Customer Service

Location:
Bengaluru, KA, India
Posted:
July 12, 2017


Radhika Koppula

Mobile: 732-***-**** E-Mail: ac09wn@r.postjobfree.com

PROFESSIONAL SUMMARY:

Over 9 years of experience in Information Technology as an ETL Developer, with experience in all phases of the life cycle: requirement analysis, functional analysis, design, development, implementation, testing, debugging, production support, and maintenance of various data warehousing applications.

8 years of experience in the design and development of ETL solutions using Informatica PowerCenter 9.6.1/9.0.1/8.x, Informatica PowerExchange 9.6.1/9.0.1, and Informatica Data Quality 9.6.1.

Experience in creating transformations and mappings using Informatica Designer and processing tasks using Workflow Manager to move data from multiple sources into targets, data marts, and the data warehouse.

Expert in performance tuning of Informatica mappings, identifying source and target bottlenecks.

Experience in Bill Inmon and Kimball data warehouse design and implementation methodologies

Expertise in OLTP/OLAP System Study, E-R modeling, developing Database Schemas (Star schema and Snowflake schema) used in relational and dimensional modeling.

Extensive experience in integration of various data sources such as Oracle 12c/11g/10g, Teradata 14/13, Netezza, UDB DB2, Mainframes, SQL Server 2012, SAP, Sybase, Informix, MySQL, flat files, MQ Series, and XML.

Experience with Informatica Cloud integration for Amazon Redshift and S3.

Implemented Slowly Changing Dimensions (SCD) Type 1 and Type 2 for initial and history load using Informatica.
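For illustration, a minimal SQL sketch of the SCD Type 2 pattern described above (table and column names such as dim_customer and stg_customer are hypothetical; in an Informatica mapping this logic is normally built with Lookup and Update Strategy transformations rather than hand-written SQL):

-- Hypothetical SCD Type 2 maintenance for a customer dimension.
-- Step 1: expire the current row when a tracked attribute has changed.
UPDATE dim_customer d
SET    d.eff_end_dt  = TRUNC(SYSDATE) - 1,
       d.current_flg = 'N'
WHERE  d.current_flg = 'Y'
AND EXISTS (SELECT 1
            FROM   stg_customer s
            WHERE  s.customer_id = d.customer_id
            AND   (s.city <> d.city OR s.segment <> d.segment));

-- Step 2: insert a new current row for changed or brand-new customers.
INSERT INTO dim_customer (customer_id, city, segment, eff_start_dt, eff_end_dt, current_flg)
SELECT s.customer_id, s.city, s.segment, TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
FROM   stg_customer s
WHERE  NOT EXISTS (SELECT 1
                   FROM   dim_customer d
                   WHERE  d.customer_id = s.customer_id
                   AND    d.current_flg = 'Y'
                   AND    d.city = s.city
                   AND    d.segment = s.segment);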

Expert in Performance tuning, troubleshooting, Indexing and partitioning techniques on Sources, Targets, Mappings and Workflows in Informatica.

Expertise in Teradata utilities such as MultiLoad, FastLoad, FastExport, BTEQ, TPump, and TPT, and tools such as SQL Assistant and Viewpoint.

Well-versed in tuning Teradata ETL queries, remediating statistics issues, resolving spool space issues, and applying compression for space reclamation.
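As a hedged illustration of the Teradata tuning mentioned above (database, table, and column names are hypothetical), two common remediations are refreshing optimizer statistics and adding multi-value compression to reclaim space:

-- Hypothetical: refresh the statistics the optimizer relies on for join planning
COLLECT STATISTICS
  COLUMN (order_dt),
  COLUMN (customer_id)
ON sales_db.fact_orders;

-- Hypothetical: reclaim space with multi-value compression on a low-cardinality
-- column (not permitted on primary index or partitioning columns)
ALTER TABLE sales_db.fact_orders
  ADD order_status COMPRESS ('OPEN', 'CLOSED', 'CANCELLED');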

Proficient in data analysis, data validation, data lineage, data cleansing, data verification, and identifying data mismatches.

Extensive experience in data analysis, ETL techniques, and MD5 hash logic for change data capture (CDC) loads.
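A minimal sketch of the MD5-based CDC check, assuming Oracle 12c's STANDARD_HASH function and hypothetical staging and dimension tables (inside an Informatica mapping the equivalent comparison is typically built with the MD5() expression function):

-- Hypothetical change detection: a staged row is treated as changed when the MD5
-- of its concatenated attributes differs from the hash stored on the target row.
SELECT s.customer_id
FROM   stg_customer s
JOIN   dim_customer d
  ON   d.customer_id = s.customer_id
WHERE  RAWTOHEX(STANDARD_HASH(s.first_name || '|' || s.last_name || '|' || s.city, 'MD5'))
       <> d.row_md5;   -- row_md5 is the hex hash persisted during the previous load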

Experienced in writing UNIX shell scripts, SQL*Loader control files, and PL/SQL procedures, functions, triggers, and packages.

Superior SQL skills, with the ability to write and interpret complex SQL statements and to mentor developers on SQL optimization, ETL debugging, and performance tuning.

Exposure to end-to-end SDLC and Agile methodology.

Skilled in creating and maintaining ETL specification documents, use cases, source-to-target mappings, requirement traceability matrices, and deployment artifacts, as well as performing impact assessments and providing effort estimates.

Worked for many years in an onsite/offshore model, providing technical design leadership to ensure efficient use of offshore resources and selection of appropriate design and ETL/CDC logic.

Extensive experience providing IT services in the healthcare and banking industries.

Strong knowledge of the Hadoop ecosystem (HDFS, HBase, MapReduce, Hive, Pig, NoSQL, etc.).

Possess excellent communication, problem-solving, and analytical skills, with the ability to work independently and as part of a team.

EDUCATION:

Bachelor's degree in Computer Science, Anjalai Ammal Mahalingam Engineering College, Tamil Nadu, India, 2000.

TECHNICAL SKILLS:

ETL Tools

Informatica PowerCenter 9.6.1/9.0.1/8.x/7.x, PowerExchange 9.6.1/9.0.1, Informatica Data Quality 9.6.1, PowerConnect for SAP BW, PowerConnect for JMS, PowerConnect for IBM MQ Series, PowerConnect for Mainframes, DataStage 8.0, DTS

BI Tools

Business Objects 5.1, Cognos Impromptu 6.0, Powerplay 6.6, and Oracle 9i Discoverer.

Data Modeling

ERwin 9.5.2/7.3/4.1, MS Visio 2013, UML, Oracle Designer.

Databases

Teradata 14/V2R6/V2R5, Oracle 12c/11g/10g/9i, SQL Server 2005/2008, DB2, MySQL, Sybase IQ, Informix, Netezza, Amazon Redshift, MS Access

Languages

XML, Java, HTML, JavaScript, C++, C, UNIX Shell Scripting, SQL, PL/SQL.

WEB Services

SOAP, WSDL

Big Data

Hadoop Ecosystem (HDFS, Hive, Pig)

OS

MS-DOS, HP UNIX, Windows and Sun OS.

Methodologies

Ralph Kimball’s Star Schema and Snowflake Schema.

Others

MS Word, MS Access, T-SQL, TOAD, SQL Developer, Microsoft Office, Teradata Viewpoint, Teradata SQL Assistant, Icescrum, JIRA, Control-M, Autosys, GitHub

PROFESSIONAL EXPERIENCE:

Client: StubHub, an eBay company, San Francisco, CA

Role: Sr. Informatica Developer

Duration: May 2014 to Current

Project Description:

StubHub is an online ticket exchange owned by eBay, which provides services for buyers and sellers of tickets for sports, concerts, theater and other live entertainment events. It has grown from the largest secondary-market ticket marketplace in the United States into the world's largest ticket marketplace.

StubHub offers a wide range of ticketing services; sellers can post available tickets at any price they choose. Listings cover mainly sporting, concert, theater, and other live entertainment events. The site had over 16 million unique visitors and nearly 10 million live events per month. The objective of this project was to build an integrated data mart for long-term decision-making, strategic planning, and executive support, and to distribute customized reports to vendors.

Responsibilities:

Coordinated with business users, stakeholders, and SMEs to obtain functional expertise, review designs and business test scenarios, participate in UAT, and validate data from multiple sources.

Defined and developed new standard design patterns, ETL frameworks, data model standards and guidelines, and ETL best practices.

Provided technical design leadership to this project to ensure the efficient use of offshore resources and the selection of appropriate ETL/CDC logic.

Performed detailed data investigation and analysis of known data quality issues in related databases through SQL

Actively involved in Analysis phase of the business requirement and design of the Informatica mappings.

Performed data validation, data profiling, data auditing and data cleansing activities to ensure high quality Cognos report deliveries

Implemented Slowly Changing Dimensions (SCD) Type 1 and Type 2 for initial and history load using Datastage.

Used various transformations of Informatica, such as Source Qualifier Transformation, Expression Transformation, Look-up transformation, Update Strategy transformation, Filter transformation, Router transformation, Joiner transformation etc. for developing Informatica mappings.

Developed Informatica mappings for TYPE 2 Slowly Changing Dimensions.

Used Teradata utilities such as BTEQ, FastLoad, MultiLoad, and FastExport.

Created sessions and work-flows for the Informatica mappings.

Heavily used Informatica Cloud integration with the Amazon Redshift connector and integrated data from various sources.

Configured sessions for different situations including incremental aggregation, pipe-line partitioning etc.

Created mappings with different lookups (connected, unconnected, and dynamic) and different caches, such as the persistent cache.

Created various Mapplets as part of mapping design.

Wrote Oracle stored procedures and functions invoked during Informatica mapping execution or as pre- or post-session tasks.
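A hedged sketch of the kind of pre-session procedure referred to above (procedure, table, and workflow names are illustrative only, not taken from the project):

-- Hypothetical pre-session step: clear the staging table and record the run.
CREATE OR REPLACE PROCEDURE pre_load_stage_orders AS
BEGIN
  EXECUTE IMMEDIATE 'TRUNCATE TABLE stg_orders';
  INSERT INTO etl_audit (job_name, step, run_ts)
  VALUES ('wf_load_orders', 'PRE_SESSION_TRUNCATE', SYSDATE);
  COMMIT;
END pre_load_stage_orders;
/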

Created effective Test Cases and performed Unit and Integration Testing to ensure the successful execution of data loading process.

Documented Mappings, Transformations and Informatica sessions.

Analyzed Session Log files in case the session failed to resolve errors in mapping or session configurations.

Involved in designing the ETL testing strategies for functional, integration and system testing for Data warehouse implementation.

Extensively involved in testing, writing QA procedures to validate target data against source data.

Worked on Data Profiling, Data standardization, error handling and exception handling.

Worked on merge, match, and reference tables to find duplicates and generate reports.

Wrote UNIX shell scripts for file manipulation, FTP, and workflow scheduling.

Coordinated with the offshore team on a daily basis to accelerate development.

Environment: Informatica PowerCenter 9.6.1, Informatica PowerExchange 9.6.1, Informatica Data Quality 9.6.1, DataStage 8.0, Amazon Redshift, Cognos 9.0, Sun Solaris, SQL, PL/SQL, Oracle 11g, TOAD, SQL Server 2012, Autosys, Shell Scripting, Icescrum, JIRA, Teradata 14, Control-M, GitHub, Hadoop, Hive

Client: Maimonides Medical, Brooklyn, NY

Role: Informatica Developer/ Data Analyst

Duration: Sept 2012 to May 2014

Project description: Maimonides Medical Center is one of the largest hospitals on the East Coast. The project gave business users a mechanism to track material usage in the OR, charges to patients, and payments to vendors. It involved the integration of PeopleSoft, ORMIS (Centricity Perioperative system), AHS (American Healthware System), and TSI, in which I played a key role as an ETL developer.

Responsibilities:

Participated in the Design Team and user requirement gathering meetings.

Extensively used Informatica Designer to create and manipulate source and target definitions, mappings, Mapplets, transformations, re-usable transformations.

Created different source definitions to extract data from flat files and relational tables for Informatica Power Center.

Used a star schema approach for designing the data warehouse database.

Developed a standard ETL framework to enable the reusability of similar logic across the board.

Created different target definitions using warehouse designer of Informatica Power center.

Created different transformations such as Joiner Transformations, Look-up Transformations, Rank Transformations, Expressions, Aggregators and Sequence Generator.

Created stored procedures, packages, triggers, tables, views, synonyms, and test data in Oracle.

Extracted source data from Oracle, SQL Server, Flat files, XML files using Informatica, and loaded into Netezza target Database.

Extensively transformed the existing PL/SQL scripts into stored procedures to be used by Informatica Mappings with the help of Stored Procedure Transformations.

Used PL/SQL whenever necessary inside and outside the mappings.

Created Models based on the dimensions, levels and measures required for the analysis.

Validated data in the warehouse and data marts after the loading process, balancing it against source data.

Created, launched & scheduled sessions.

Fixed performance issues in Informatica mappings.

Used Teradata utilities such as BTEQ, FastLoad, MultiLoad, and FastExport.

Customized UNIX shell scripts for file manipulation, FTP, and workflow scheduling.

Created design specifications, including the BI dependency plan, job scheduling, and cycle management documents.

Worked closely with business analysts and production support to resolve JIRA issues.

Coordinated with the offshore team on a daily basis to accelerate development.

Environment: Informatica PowerCenter 9.0.1, Informatica PowerExchange 9.0.1, Informatica Data Quality 9.0.1, Cognos 9.0, Linux, SQL, PL/SQL, Oracle 11g, TOAD, Teradata, Aginity Workbench for Netezza, SQL Server 2012, Control-M, Shell Scripting, XML, SQL*Loader

Client: Walmart, Bentonville, AR, US

Role: Informatica Developer

Duration: June 2010 to Sept 2012

Project Description: This project involves building a data warehouse by consolidating data from a variety of systems into a central data warehouse and storing historical information. This helped the management to quickly identify trends and respond to changing market and economic conditions by achieving its continued goals of excellent customer service and solid financial performance.

Responsibilities:

Involved in the design and development of the data warehouse environment; acted as liaison between business users and technical teams, gathering requirement specification documents and identifying data sources, targets, and report generation needs.

Designed mappings in Informatica Designer that populated data into the target star schema on an Oracle instance.

Optimized Query Performance, Session Performance and Reliability.

Extensively used Router, Lookup, Aggregator, Expression and Update Strategy Transformations.

Tuned mappings for optimum performance, dependencies, and batch design.

Scheduled and ran extraction and load processes and monitored sessions using Informatica Workflow Manager.

Scheduled the batches to be run using the Workflow Manager.

Involved in identifying bugs in existing mappings by analyzing the data flow, evaluating transformations and fixing the bugs so that they conform to the business needs.

Performed Unit testing and Integration testing on the mappings in various schemas.

Optimized the mappings that had shown poor performance

Monitored sessions that were scheduled, running, completed or failed. Debugged mappings for failed sessions.

Involved in writing UNIX shell scripts for Informatica ETL tool to run the sessions.

Coordinated between the development and testing teams for robust and timely delivery of a fully integrated application.

Constantly monitored application attributes to ensure conformance to functional specifications.

Mentored the development members on ETL logic and performed code and document reviews

Environment: Informatica PowerCenter 8.6, SQL Server 2008, Oracle 10g, Shell Scripts, ERwin, TOAD, UNIX, Cognos 9, SQL, PL/SQL, SQL Developer, HP Quality Center

Client: RCI, Carmel, IN, US

Role: Informatica Developer/ Data Analyst

Duration: Sept 2008 to May 2010

Project Description: RCI, a division of Wyndham Worldwide, is the world's largest holiday timeshare company. The main aim of the project was to develop and maintain an application to load data from various data sources into the enterprise data warehouse, PMS (Product Management System). The project aims to support decisions on new product improvements, analysis of existing products, and improved customer service.

Responsibilities:

Understood the problem description stated in the BRD and designed and wrote ETL specifications.

Imported source/target tables from the respective databases and created reusable transformations and mappings using the Informatica Designer tool set.

Prepared Test Cases and performed system and integration testing.

Created Informatica mappings to load the data from staging to dimensions and fact tables.

Configured the mappings to handle the updates to preserve the existing records using update strategy transformation.

Scheduled Sequential and Concurrent sessions and batches for loading from source to target database through server manager.

Developed Informatica mappings, mapplets and transformations.

Used most of the transformations such as the Aggregators, Filters, Routers, Sequence Generator, Update Strategy, Rank, Expression and lookups (connected and unconnected) while transforming the Data according to the business logic.

Created sessions, reusable worklets and workflows in Workflow Manager

Used Event Wait tasks to wait for trigger files and run the process.

Improved performance by identifying the bottlenecks in Source, Target, Mapping and Session levels.

Conducted unit testing and prepared unit test specification requirements.

Used PowerConnect to extract data from mainframe databases.

Prepared database request forms and production release documents.

Created, updated, ran, and scheduled batches and sessions.

Created PL/SQL Scripts and Stored Procedures for data transformation on the data warehouse.

Environment: Informatica PowerCenter 8.1, Business Objects 6.0, Oracle 10g, SQL*Loader, PL/SQL, SQL Server 2004, UNIX Shell Programming, Linux and Windows NT

My work visa status is USC.


