

Krishnaveena Ponugoti

HSBC Certified ETL Consultant

************.********@*****.***

www.linkedin.com/in/krishnaveena-ponugoti-250910a0

507-***-****

Around 9 years of IT experience in software design, development, and implementation of high-quality business solutions, with a major focus on data warehousing and ETL processing.

Involved in all stages of the Software Development Life Cycle (SDLC) and the production support life cycle.

Strong knowledge of OLAP and OLTP systems and of dimensional modeling using Star and Snowflake schemas.

Strong enterprise data warehousing experience using IBM DataStage components such as DataStage Designer, Director, Manager, and Administrator.

Used DataStage Designer to develop processes for extracting, cleansing, transforming, integrating, and loading data into the data warehouse.

Extracted data from various sources such as Oracle, DB2, Teradata, and flat files and loaded it into the staging area.

Expert in data warehousing techniques for data cleansing, Slowly Changing Dimensions (SCD), surrogate key assignment, and Change Data Capture (CDC).

Experienced in using various processing stages (Aggregator, Change Data Capture, Lookup, Join, Filter, Surrogate Key Generator, Funnel, Remove Duplicates, and Transformer) and development/debug stages (Peek, Row Generator, and Column Generator).

Experience across various areas of the banking and healthcare domains.

Involved in designing and preparing functional specification documents, technical specification documents, source-to-target mapping documents with ETL transformations, and test plans.

Strong experience in Extraction, Transformation and Loading (ETL) of data from various sources into data warehouses and data marts using Informatica PowerCenter Designer, Workflow Manager, and Workflow Monitor.

Developed Informatica mappings that perform extraction, transformation, and loading of source data into the target database, using various PowerCenter transformations such as Source Qualifier, Aggregator, Filter, Router, Sequence Generator, Lookup, Rank, Joiner, Expression, Stored Procedure, and Update Strategy to implement the business logic.

Built reusable Informatica transformations and Mapplets wherever the same logic was needed in multiple mappings.

Involved in Unit testing, SIT and UAT phases of the project.

Experienced in developing UNIX shell scripts for file validation, running batch processes, and producing summarized data-load reports.
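
As a minimal, hypothetical sketch of such a script (the landing directory, file pattern, and report path are illustrative placeholders, not taken from any specific project):

#!/bin/ksh
# Validate landing files and produce a summarized data-load report.
# LANDING_DIR and the CUST_*.dat pattern are illustrative placeholders.
LANDING_DIR=/data/landing
REPORT=/tmp/load_summary_$(date +%Y%m%d).txt

: > "$REPORT"                          # start a fresh report
for f in "$LANDING_DIR"/CUST_*.dat; do
    if [ ! -s "$f" ]; then             # file missing or zero bytes
        echo "FAIL : $f missing or empty" >> "$REPORT"
    else
        rows=$(wc -l < "$f")
        echo "OK   : $f ($rows rows)" >> "$REPORT"
    fi
done
cat "$REPORT"                          # summarized report of the data load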

Exposure to vi editor commands in the UNIX environment.

Worked with tools such as PuTTY and WinSCP to access UNIX/Linux systems.

Good experience with Oracle, Teradata and DB2.

Extensively used Rapid SQL, TOAD, and SQL Developer to access databases, and Teradata SQL Assistant to access the Teradata database.

Worked on DataStage migration projects from Oracle to Teradata.

Worked on scheduling tools such as Control-M to automate job runs and handle both event-based and time-based scheduling.

Used Quality Center (QC) to track defects and updated their status as changes were made.

Used Team Foundation Server (TFS) for daily task assignments and completion.

Experience in performance tuning at both the application code level and the database level.

A team player involved in many enhancements and developments, ensuring successful completion of all projects as per Service Level Agreements (SLAs).

Experienced in conducting design review and code review sessions with the team.

Experienced in onshore-offshore coordination between development and production support teams.

Extensive experience in Agile and Waterfall software development methodologies.

Traveled onsite to the client location in Turkey for business requirements gathering.

TECHNICAL SKILLS

Environments: Windows, UNIX & Linux

Conceptual Knowledge: Data warehousing and data modeling

ETL Tools: IBM InfoSphere DataStage & Informatica PowerCenter

Languages: SQL, UNIX and Linux shell scripting

Databases: Oracle, DB2, SQL Server & Teradata

Methodologies: Agile & Waterfall

Management Tools: Jira, TFS, HP Quality Center (QC)

Scheduling Tools: Control-M & IBM Tivoli Workload Scheduler

Other Tools: PuTTY, WinSCP, TOAD, SQL Developer, Teradata SQL Assistant, Rapid SQL

Domains: Healthcare & Banking (Internet Banking, Regulatory & Compliance, Incentive Management)

EDUCATION

Bachelor of Technology in Information Technology from Jawaharlal Nehru Technological University, Hyderabad, India.

CERTIFICATIONS

HSBC DataStage Proficient certification.

IBM InfoSphere DataStage 9.1 Professional Certification.

WORK EXPERIENCE

Senior ETL Developer

Mayo Clinic, Rochester, MN, USA

Sep 2018 – Present

The Mayo Clinic Laboratories Data Analytics UDP project (MCL DA UDP) provides a data analytics solution for the MCL division on the Mayo Clinic UDP platform. The project migrates data aggregation from various source applications to the ADS (Atomic Data Store) and transforms the data to build the data mart in UDP, which the Business Intelligence team uses to build reports. It delivers data analytics solutions for a) replacement of the profitability solution (Revenue and Cost Detail Report) and b) test volumes and the associated billing and revenue.

Responsibilities

Assess requirements for completeness and accuracy

Conduct impact assessment and determine size of effort based on requirements

Design and develop the Extraction, Transformation and Load (ETL) process to load data from various data sources such as DB2, SQL Server, and sequential files into the staging database and from staging into the target data warehouse using DataStage Designer

Develop parallel jobs using various processing stages (Aggregator, Change Data Capture, Lookup, Join, Filter, Surrogate Key Generator, Funnel, Remove Duplicate Stage and Transformer) and Development/debug stages (Peek stage, Row generator stage, and Column generator stage)

Develop DataStage jobs to implement Slowly Changing Dimension (SCD) Type 2 using the Change Data Capture stage
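
The CDC stage implements this graphically inside DataStage; purely as an illustration of the underlying Type 2 logic (expire the current row, then insert the new version), here is a rough sketch run through the DB2 command line processor, where the database, table, and column names (UDPDB, DIM_CUSTOMER, STG_CUSTOMER) are hypothetical:

#!/bin/ksh
# Illustrative SCD Type 2 logic via the DB2 CLP; all object names are placeholders.
TMP=/tmp/scd2_$$.sql
cat > "$TMP" <<'SQL'
CONNECT TO UDPDB;

-- Expire the current dimension row when the staged row differs
UPDATE DIM_CUSTOMER d
   SET EFF_END_DT = CURRENT DATE, CURR_FLAG = 'N'
 WHERE CURR_FLAG = 'Y'
   AND EXISTS (SELECT 1 FROM STG_CUSTOMER s
               WHERE s.CUST_ID = d.CUST_ID AND s.ADDRESS <> d.ADDRESS);

-- Insert a new current version for changed or brand-new business keys
INSERT INTO DIM_CUSTOMER (CUST_ID, ADDRESS, EFF_START_DT, EFF_END_DT, CURR_FLAG)
SELECT s.CUST_ID, s.ADDRESS, CURRENT DATE, DATE('9999-12-31'), 'Y'
  FROM STG_CUSTOMER s
 WHERE NOT EXISTS (SELECT 1 FROM DIM_CUSTOMER d
                   WHERE d.CUST_ID = s.CUST_ID AND d.CURR_FLAG = 'Y');

CONNECT RESET;
SQL
db2 -tvf "$TMP" && rm -f "$TMP"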

Create required parameter sets, parameters, and environment variables to be passed to the DataStage jobs run through job sequencers

Create complex SQL queries to fetch source data from DB2 and SQL Server databases using database connector stages in DataStage Designer

Review all developed ETL jobs in detail and identify gaps in the code

Schedule jobs in DataStage Director to automate daily data loads

Provide training and knowledge-sharing sessions to help team members understand the application and the ETL approach used to meet the business requirements

Participate in Agile scrum calls, iteration planning, and retrospective meetings

Optimize and tune DataStage jobs for better performance and efficiency, choosing the right partitioning and sorting techniques to improve job performance

Review DataStage jobs against the coding standards and provide review comments.

Environment: IBM InfoSphere DataStage 11.5, DB2, SQL Server, Oracle, UNIX, IBM Tivoli Workload Scheduler.

Senior ETL Consultant

HSBC Bank, Hyderabad, India

Apr 2017 – Oct 2017

CODS (Compliance Operational Data Store) is a centralized database that provides a 360-degree view of customer, account, and compliance-related data, along with transaction data, to support compliance and regulatory needs. The project mainly deals with tracking suspicious activity in banking transactions, delivering accurate and sustainable management information reporting that addresses regulatory concerns raised against HBUS by bringing compliance-related data from disparate source systems into CODS.

Responsibilities

Worked with business users and ETL leads from different teams to implement an ETL framework using InfoSphere DataStage.

Implemented a DataStage based ETL solution fulfilling stringent performance requirements.

Collaborated with the reporting teams to understand the reporting requirement and to load the data as per the business requirement to be populated on the end user reports.

Worked on creating Technical design specification, Source to target mappings (STM) and Test plan.

Assessed requirements for completeness and accuracy.

Determined whether requirements were actionable for the ETL team.

Conducted impact assessments and determined the size of effort based on requirements.

Developed full SDLC project plans to implement ETL solution and identify resource requirements.

Created various standard/reusable jobs in DataStage using active and passive stages such as Sort, Lookup, Filter, Join, Transformer, Aggregator, Change Data Capture, Sequential File, and Data Set.

Wrote complex SQL queries to fetch source data from the Oracle database using the Oracle Connector in DataStage Designer.
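
Outside of the Oracle Connector, the same kind of extract can be sketched with SQL*Plus from a shell script; the credentials, table, and output path below are placeholders:

#!/bin/ksh
# Hypothetical SQL*Plus extract to a delimited flat file; all names are placeholders.
sqlplus -s "$ORA_USER/$ORA_PASS@$ORA_SID" <<'SQL' > /data/extract/txn_daily.dat
SET PAGESIZE 0 FEEDBACK OFF HEADING OFF COLSEP '|'
SELECT txn_id, acct_no, txn_amt, TO_CHAR(txn_dt, 'YYYY-MM-DD')
  FROM cods.transactions
 WHERE txn_dt >= TRUNC(SYSDATE) - 1;
EXIT;
SQL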

Developed DataStage jobs to implement Slowly Changing Dimension (SCD) Type 2 using the Change Capture stage.

Performed debugging using Peek stage during the Unit testing of the DataStage jobs.

Automated DataStage job runs using the Control-M scheduling tool.

Developed Linux shell scripts to run DataStage jobs through Control-M.
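
A common pattern is a thin wrapper script that Control-M invokes, with the script's exit code driving the job's OK/NOTOK status. A sketch, assuming a standard engine install path and a placeholder project name:

#!/bin/ksh
# Control-M wrapper around the DataStage CLI; project name and paths are placeholders.
PROJECT=CODS_PRJ
JOB=$1

. /opt/IBM/InformationServer/Server/DSEngine/dsenv   # load the DataStage environment

dsjob -run -jobstatus "$PROJECT" "$JOB"
rc=$?
# With -jobstatus, dsjob exits with the job status:
# 1 = finished OK, 2 = finished with warnings, anything else = failure.
if [ $rc -eq 1 -o $rc -eq 2 ]; then
    echo "Job $JOB completed (status $rc)"
    exit 0
fi
echo "Job $JOB failed (status $rc)" >&2
exit 1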

Monitored and resolved data warehouse ETL production issues.

Followed Agile methodology in the project and used Kanban board for tracking progress of ongoing project tasks and production issues.

Participated in daily status calls with offshore and onsite team members to discuss the status of the project tasks as per the Kanban Board.

Worked on various defects raised in Kanban by the concerned business teams.

Environment: IBM InfoSphere DataStage 9.1, Oracle 11g, Control-M, LINUX Shell Scripting, Putty, WinSCP, Cognos 10.01

Senior Software Engineer

HSBC Bank, Hyderabad, India

Feb 2016 – Mar 2017

The Actimize CTR (Currency Transaction Report) solution complies with the Bank Secrecy Act of 1970 (BSA) requirement to report cash transactions greater than $10,000, covering both single and multiple transactions by or on behalf of the same person. In addition to individual transactions exceeding the reporting threshold, the solution detects aggregated daily activity over $10,000 by the same account, customer, transactor, or group of related accounts. It is not limited to the exchange of currency and also covers cash transactions in various other forms.

Responsibilities

Involved in all business meetings to understand the existing business logic and the enhancements to the application.

Responsible to create Source to Target Mappings (STM) as per the requirements.

Developed and supported the Extraction, Transformation and Load (ETL) process for the data warehouse, extracting from various data sources and loading the target tables using DataStage Designer.

Worked on various stages like Transformer, Join, Lookup, Sort, Filter, Change Data Capture stage.

Developed DataStage jobs to implement Slowly Changing Dimension (SCD) Types 1, 2, and 3.

Developed parallel jobs using various Development/debug stages (Peek stage, Row generator stage, Column generator stage) and processing stages (Aggregator, Change Data Capture, Filter, Sort, Merge, Funnel and Remove Duplicate Stages).

Designed and developed ETL processes using DataStage Designer to load data from mainframe files, DB2 UDB, and fixed-width sequential files into the staging database and from staging into the target data warehouse.

Designed job sequencers to run multiple jobs with dependencies and email notifications.

Involved in unit testing, SIT, and UAT, working with the users on data validations.

Extensively worked on improving job performance by minimizing the use of Transformer stages.

Prepared documentation for unit, integration, and final end-to-end testing.

Worked within the team to make appropriate, effective decisions related to project responsibilities and to initiate and follow-up on assigned tasks without supervision.

Provided support and guidance by creating Release, Deployment & Operation guide documents.

Developed Linux shell scripts to run DataStage jobs.

Created Control-M jobs for scheduling.

Worked on various defects raised in QC by the concerned business teams.

Environment: IBM InfoSphere DataStage 9.1, Oracle 11g, DB2, Control-M, LINUX Shell Scripting, Putty, WinSCP, Cognos 10.01.

Senior Software Engineer

HSBC Bank, Hyderabad, India

Mar 2014 – Jan 2016

This is a migration project involving the upgrade of DataStage from version 7.5 to 8.5 and the migration of the database from Oracle to Teradata.

Responsibilities

Worked on the existing system analysis, data analysis, source-to-target mapping, process flow diagrams and documentation.

Built DataStage ETL interfaces to aggregate, cleanse and migrate data across enterprise-wide MDM ODS and Data Warehousing systems using staged data processing techniques, patterns and best practices.

Implemented full lifecycle management to capture and migrate DataStage ETL metadata, including data mappings and other data integration artifacts (such as schedulers and scripts), across environments, and established standards, guidelines, and best practices.

Worked on various stages like Transformer, Join, Lookup, Sort, Filter, Change Data Capture, Oracle and Teradata connectors.

Involved in the analysis of various Teradata database functions to replace functions in Oracle.

Created BTEQ scripts to replace existing Oracle SQL scripts.
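
A minimal sketch of that BTEQ pattern, with placeholder TDPID, credentials, and table names (in practice the logon would come from a secured file rather than being inlined):

#!/bin/ksh
# Illustrative BTEQ script replacing an Oracle SQL script; all names are placeholders.
bteq <<'EOF'
.LOGON tdprod/etl_user,etl_pass;

DELETE FROM stg_db.txn_stg;

INSERT INTO stg_db.txn_stg (txn_id, acct_no, txn_amt, txn_dt)
SELECT txn_id, acct_no, txn_amt, txn_dt
FROM   src_db.txn_src
WHERE  txn_dt >= DATE - 1;

.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF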

Designed and developed ETL processes using DataStage Designer to load data from mainframe files, Oracle, and fixed-width sequential files into the staging database and from staging into the target data warehouse.

Used DataStage stages such as Sequential File, Transformer, Aggregator, Sort, Data Set, Join, Funnel, Row Generator, Remove Duplicates, and Copy to accomplish the ETL coding.

Developed job sequencer with proper job dependencies, job control stages, triggers and used notification activity to send email alerts.

Extensively used DataStage Director to monitor job logs and resolve issues.

Extensively worked on improving job performance by avoiding as many Transformer stages as possible.

Wrote shell scripts to check for the existence of files and to compare actual row counts against expected row counts, invoking the scripts as before-job subroutines.
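
A sketch of such a before-job check, assuming the expected row count arrives in a companion control file (both file names are placeholders):

#!/bin/ksh
# Before-job validation: file existence plus actual-vs-expected row counts.
DATA=/data/landing/txn_feed.dat
CTL=/data/landing/txn_feed.ctl       # control file carries the expected row count

[ -f "$DATA" ] || { echo "Missing feed file: $DATA" >&2; exit 1; }
[ -f "$CTL" ]  || { echo "Missing control file: $CTL" >&2; exit 1; }

expected=$(cat "$CTL")
actual=$(wc -l < "$DATA")

if [ "$actual" -ne "$expected" ]; then
    echo "Row count mismatch: expected $expected, got $actual" >&2
    exit 1
fi
echo "Validation passed: $actual rows"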

Used Control-M job scheduler for automating the daily run of DW cycle in both production and UAT environments.

Documented ETL test plans, test cases, test scripts and validations based on design specifications for unit testing, system testing, functional testing, prepared test data for testing, error handling and analysis.

Involved in unit testing, SIT, and UAT, and worked with the users on data validations.

Conducted sessions on DataStage technical training at the Business unit level.

Worked within the team to make appropriate, effective decisions related to project responsibilities and to initiate and follow-up on assigned tasks without supervision.

Environment: IBM InfoSphere DataStage 8.5/7.5, Oracle 10g, Teradata 13, sequential files, Control-M, Linux shell scripting, Cognos 10.01.

Software Engineer

HSBC Bank, Hyderabad, India

May 2012 – Feb 2014

The eBI project gives the business a broad view for shaping new strategies to develop the business. It holds information about customer profiles, the transactions they make and the transaction type (CURRENT, FUTURE, or RECURRING), and the searches performed on the internet. It also tracks the number of people using online services as well as survey information.

Responsibilities

Interacted with the end-user community to understand the business requirements and with functional analysts to understand the functional requirements, identifying data sources and their implementations.

Designed and developed jobs for extracting, transforming, integrating and loading data into data mart using DataStage Designer.

Developed, executed, monitored, and validated the ETL DataStage jobs using the DataStage Designer and Director components.

Worked with DataStage Director to schedule, monitor, analyze performance of individual stages and run DataStage jobs.

Extensively used Change Data capture, Transformer, Copy, Join, Funnel, Aggregator, Lookup Stages and development stages to develop the parallel jobs.

Coordinated code and data movement to production during production implementation activities.

Created UNIX shell scripts to run the batch jobs and SQL files.

Designed the draft or spreadsheet used to schedule the jobs in Control-M.

Involved in day-to-day production support activities.

Checked whether the files were available in the landing directory and raised tickets with the concerned team for any missing files.

Monitored the production batch cycle to ensure successful completion.

Investigated any failures and took appropriate action to fix the issues.

Maintained service tracker sheet to track all the failures, root cause of the failures and fixes applied in the production cycle.

Applied Hold, Force OK, Force Run, and Delete actions to jobs in the Control-M schedule depending on the situation.

Involved in enhancements to the existing production process.

Optimized/Tuned DS jobs for better performance and efficiency.

Worked with DBAs, UNIX system Admins and several teams to fix the production environment issues.

Environment: DataStage 7.5, Oracle 10g, sequential files, Control-M, UNIX shell scripting, Cognos 10.01.

Software Engineer

HSBC Bank, Hyderabad, India

Jul 2010 – Apr 2012

The Callidus TrueComp project was formed as the GLT arm of the NA EIM CoE (North America Enterprise Incentive Management Center of Excellence) to help the NA HSBC team roll out the TrueComp suite of products to various business units of HBUS (HSBC USA) for maintaining their incentive plans. Callidus EIM systems allow enterprises to develop and manage incentive compensation linked to the achievement of strategic business objectives. This project also involves support and maintenance of all live EIM implementations in NA.

Responsibilities

Involved in the Functional Specification Design Analysis.

Prepared Technical Design Document as per the requirements.

Prepared Source to Target Mappings.

Developed Informatica Mappings for Extraction, Transformation and Loading in the Informatica PowerCenter Designer.

Developed Informatica Workflows using mappings in the Informatica Workflow Manager.

Performed Unit testing and handled the issues.

Involved in the SIT and UAT phases of the project.

Created UNIX shell scripts for running workflows, monitoring and validating files, archiving files, and purging files older than the retention period.
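
A combined sketch of the workflow-run and housekeeping steps; the Integration Service, folder, and workflow names and the 30-day retention period are assumptions for illustration:

#!/bin/ksh
# Run an Informatica workflow via pmcmd, then archive and purge files.
# Service, folder, workflow names, paths, and retention are placeholders.
pmcmd startworkflow -sv INT_SVC -d DOM_PROD -u "$PM_USER" -p "$PM_PASS" \
      -f EIM_FOLDER -wait wf_load_incentives || exit 1

ARCHIVE=/data/archive/$(date +%Y%m%d)
mkdir -p "$ARCHIVE"
mv /data/inbound/*.dat "$ARCHIVE"/ 2>/dev/null    # archive today's processed files

find /data/archive -type f -mtime +30 -exec rm -f {} \;   # purge past retention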

Prepared implementation plan for the migration of Informatica Mappings and workflows to production environment.

Created drafts in Control-M to schedule the batch.

Involved in production code migration, followed by monitoring of the production batch.

Involved in all production support activities.

Investigated any failures and took appropriate action to fix the issues.

Involved in Onsite/offshore coordination.

Maintained service tracker sheet to track all the failures, root cause of the failures and fixes applied in the production cycle.

Applied Hold, Force OK, Force Run, and Delete actions to jobs in the Control-M schedule depending on the situation.

Involved in enhancements to the existing production process.

Analyzed the existing informational sources and methods to identify problem areas and make recommendations for improvement. This required a detailed understanding of the data sources and researching possible solutions.

Worked with DBAs, Unix system Admins and several teams to fix the production issues.

Environment: Informatica 8.6, Truecomp 5x, Actuate, UNIX Scripting, PERL Scripting, Putty, WinSCP, Oracle 10g, Control-M.


