
Sooraj Reddy Lankala

Email: acy9xr@r.postjobfree.com Phone: +91-893*******

Experience Summary

5.6 years of strong experience working with large-scale Enterprise Data Warehouse implementations using Informatica PowerCenter 9.x/8.x/7.x, Oracle, DB2 and SQL Server on UNIX platforms.

5.4 years of strong experience in Oracle PL/SQL; certified as an Oracle SQL and PL/SQL Developer.

Strong experience in the Insurance domain; certified in AINS 21 (Property and Liability Insurance Principles) and AINS 23 (Commercial Insurance).

Strong knowledge in OLAP systems and Dimensional modelling using Star and Snowflake schema.

Extensive experience in Extraction, Transformation, and Loading (ETL) data from various heterogeneous data sources into Data Warehouse and Data Marts using Informatica PowerCenter tools (Repository Manager, Designer, Workflow Manager and Workflow Monitor).

Expertise in implementing complex business rules by creating robust Mappings, Mapplets, Sessions and Workflows using Informatica PowerCenter.

Experience in performance tuning of Informatica mappings and sessions for large-volume projects, as well as SQL and database object-level performance tuning.

Experience in Migration and Configuration of Informatica PowerCenter.

Experience in integrating various data sources such as Oracle, DB2, SQL Server, flat files and Mainframe files into the Data Warehouse; also experienced in Data Cleansing and Data Analysis.

Excellent expertise with different types of data load strategies and scenarios like Historical Dimensions, Surrogate keys etc.

Worked extensively in all stages of SDLC from gathering requirements to testing, implementation and support.

Experience in preparing documentation such as High-Level Design, Low-Level Design and Requirement Specification documents.

Strong experience in writing UNIX shell scripts and SQL scripts for development, automation of the ETL process, error handling and auditing. Experience in using the Control-M scheduling tool to organize and schedule jobs.

Good knowledge on generating reports using Business Objects.

Worked with cross-functional teams such as QA, DBA and Environment teams to deploy code from development to QA and Production server.

Debugging defects raised in various phases and providing root cause analysis for issues in ETL

Excellent analytical and problem-solving skills, with a strong technical background and good interpersonal skills.

Ability to work with a team distributed across locations, capable of meeting stringent deadlines.

Contributions/Achievements

Won the project's Best Performer award for dedication and timely delivery of deliverables with zero UAT and post-production defects.

Won the Best Team of the Month award for our team effort in delivering with zero slippage and zero UAT and post-production defects.

Received numerous customer appreciations via email for providing best practices and value adds, completing work on time, and suggesting solutions for identified issues.

Created an automation tool to create simple ETL mappings using Microsoft Excel macros.

Technology

Software: Informatica PowerCenter 8.6.1 & 9.1.0, Informatica PowerExchange 5.5, Oracle Database 10g/11g, Netezza

Tools: Oracle SQL Developer, Toad, SQL*Plus, WinSCP, SecureCRT, Control-M

Career Profile

Dates: Mar 2011 – Sep 2016
Organization: Tata Consultancy Services Ltd.
Designation: IT-Analyst

Dates: Oct 2016 – till date
Organization: Deloitte Consulting Private Ltd
Designation: Consultant

Qualifications

Course: B.Tech
Specialization: ECE
University / Institution: Jawaharlal Nehru Technological University (JNTU)

Assignments

Details of the assignments I have handled:

Project # I:

Project: QCT – BI
Customer: Qualcomm
Period: October 2015 – September 2016

Description

Qualcomm, based in San Diego, CA, USA, leads the world in the development of CDMA2000 (1X and 1xEV-DO) and WCDMA (UMTS), the two most widely accepted International Telecommunication Union-approved 3G standards, which underpin state-of-the-art 3G chipsets, system software, development tools and products. These technology solutions enable many of the world's latest 3G handsets and wireless devices that provide users with powerful features such as multimedia, position location, connectivity, security and content-rich applications based on the BREW™ system and Java™-based technologies.

QCT (Qualcomm CDMA Technologies) is the world's largest provider of wireless chipset technology.

The QCT – BI project involves designing a Business Intelligence system based on a logical model. Data from operational source systems is extracted, transformed and loaded into the data warehouse using Informatica 9.x. Control-M is used for monitoring and scheduling the daily ETL loads. SAP Business Objects and QlikView are the tools used for reporting and analysis. The project team consists of six members.

Role: ETL Designer & Developer
Database: Oracle 11g, Netezza
Tools: Informatica PowerCenter 9.1, Oracle SQL Developer, Toad, SQL*Plus, Control-M 9, HP Quality Center, Kintana

Responsibilities

Closely worked with the Business Analyst to understand the business requirements

Analyzing the production issues and providing the root cause and solution.

Responsible for code review, QA activities and validation of ETL workflows in Informatica.

Extensively used PL/SQL developer and Toad in data validation

Coordinated with the offshore team and reported any issues in loading and extraction of data from different source systems.

Responsible for design, development and enhancement of the Informatica mappings and mapplets.

Responsible for creating mapping design document and ETL Unit test cases for the mapping and executing the test cases created by other team members.

Responsible for tuning the ETL code: re-applying the logic using a different set of transformations, using SQL overrides and partitions, and increasing block size, data cache size and the target-based commit interval.

Responsible for creating the Data mod scripts to fix the bugs, as well as update the data for enhanced or modified tables.

Worked on the creation of SQL and repository queries used to generate data quality alerts.
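
As a purely illustrative sketch of that kind of alert (the connection details, table names and mail recipient are hypothetical, not the project's actual objects), a simple stage-versus-target reconciliation run from a shell wrapper could look like this:

#!/bin/bash
# Hypothetical data quality alert: compare stage and target row counts and mail a warning.
# etl_user/DWH, the table names and the recipient address are placeholders.
ALERT_LOG=/tmp/dq_alert_$(date +%Y%m%d).log

DIFF=$(sqlplus -s etl_user/"${ETL_PWD}"@DWH <<'SQL'
SET HEADING OFF FEEDBACK OFF PAGESIZE 0
SELECT ABS((SELECT COUNT(*) FROM stg_policy) - (SELECT COUNT(*) FROM dwh_policy)) FROM dual;
SQL
)

if [ "${DIFF:-0}" -gt 0 ]; then
    echo "$(date): stage/target row count mismatch of ${DIFF} rows" >> "${ALERT_LOG}"
    mailx -s "Data quality alert: row count mismatch" dq_team@example.com < "${ALERT_LOG}"
fi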

Responsible for creating the Deployment groups and code migration from development (DEV/DV2) to test (TST/TS2) environments.

Responsible for adding tasks (i.e. workflows) and creating jobs in Control-M, and for monitoring the non-production jobs through Control-M.

Project # II:

Project: Merlin Management View (Merlin MV)
Customer: Continental Casualty Company (CNA)
Period: August 2013 – October 2015

Description

The Merlin MV project is an enterprise data warehouse consisting of data marts for each line of business. It is designed to give decision makers access to key insurance performance metrics through dashboards, standard reports and ad hoc reporting universes using SAP BO. The data is sourced from several relational sources, flat files and mainframe datasets.

Role: ETL Designer & Developer
Database: Oracle
Tools: Informatica PowerCenter 8.6/9.1, Informatica PowerExchange, Oracle SQL Developer, Toad, SQL*Plus, WinSCP, SecureCRT, Control-M

Responsibilities

Technical Lead for a development team of 6 people; successfully delivered all projects with no post-production defects.

Interacting with the onsite team and SMEs to understand the business requirements.

Providing Estimates – Post Concept/Post Definition for all the requirements.

Working on requirement gathering, analysis, and preparation of high-level and detailed designs.

Conducted data loads and data cleansing, and implemented business rules using various transformations in Informatica PowerCenter Designer, a few of which are listed below:

Source Qualifier

Expression

Joiner

Aggregator

Update Strategy

Normalizer

Lookup

Sequence Generator

Stored Procedure

Worked on the creation of re-usable transformations and mapplets along with the use of shared ETL folders (Sources and Targets)

Worked with various Relational sources like SQL Server, Sybase and Mainframe datasets

Worked on creating data maps for Mainframe datasets using Informatica Power Exchange

Worked on the creation of data marts with SCD Type 2 implementation using a dynamic cache lookup transformation or the checksum method (MD5), with relational and flat-file sources (delimited, fixed width).
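
To make the checksum idea concrete, here is a minimal sketch of the change-detection step only: a hash of the tracked attributes from stage is compared against the hash stored on the current dimension row, and only the mismatches get expired and re-inserted as new versions. The table and column names are invented, STANDARD_HASH assumes Oracle 12c or later, and in the project itself the comparison was done inside the Informatica mappings (MD5() expression / dynamic lookup), not in SQL.

#!/bin/bash
# Hypothetical SCD Type 2 change detection via an MD5 checksum comparison.
# All objects below are placeholders; the real logic lived in Informatica mappings.
sqlplus -s etl_user/"${ETL_PWD}"@DWH <<'SQL'
-- Business keys whose tracked attributes changed: these current rows would be
-- end-dated (current_flag = 'N') and a new version inserted.
SELECT s.policy_id
FROM   stg_policy s
JOIN   dim_policy d
  ON   d.policy_id   = s.policy_id
 AND   d.current_flag = 'Y'
WHERE  d.row_md5 <> RAWTOHEX(STANDARD_HASH(s.status || '|' || TO_CHAR(s.premium), 'MD5'));
SQL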

Worked on data profiling to cleanse and standardize the source data using pre-defined regular expression functions available in Informatica Power Center designer

Worked on performance tuning of ETL mappings involving the below listed transformations

Source Qualifier

Joiner

Aggregator

Lookup

Update Strategy

Worked on performance tuning of ETL sessions using below session properties and options

Enabling Bulk load

Using different Pipeline partitioning methods

Auto memory attributes

DTM Buffer Cache

Commit interval

Log trace level

Push Down Optimization

Performed Code review of complex ETL mappings involving various transformations that were mentioned above.

Worked on creating and triggering UNIX shell scripts using Session (pre & post properties) and command tasks

Worked on the creation of UNIX shell scripts to automate the execution of ETL workflows using the pmcmd command-line utility.
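
A minimal sketch of such a wrapper is shown below; the service, domain, folder and credential variables are placeholders, and the real scripts added restart handling and auditing on top of this.

#!/bin/bash
# Hypothetical pmcmd wrapper: start an Informatica workflow, wait for it, fail loudly on error.
# Service, domain, folder and credentials are placeholders.
INFA_SVC=IS_DWH_DEV
INFA_DOMAIN=Domain_DWH
INFA_FOLDER=MERLIN_MV
WORKFLOW=$1
LOG=/var/etl/logs/${WORKFLOW}_$(date +%Y%m%d_%H%M%S).log

pmcmd startworkflow -sv "${INFA_SVC}" -d "${INFA_DOMAIN}" \
      -u "${INFA_USER}" -p "${INFA_PWD}" \
      -f "${INFA_FOLDER}" -wait "${WORKFLOW}" > "${LOG}" 2>&1
RC=$?

if [ ${RC} -ne 0 ]; then
    echo "$(date): workflow ${WORKFLOW} failed with return code ${RC}" >> "${LOG}"
    exit ${RC}    # non-zero exit lets the scheduler (e.g. Control-M) mark the job as failed
fi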

Worked on the creation of UNIX shell scripts to automate the file transfer process using SFTP, SCP and FTP protocols.

Worked on encrypting/decrypting flat files (PGP and GPG) and performing data reconciliation and pre-processing using UNIX utilities such as sed and awk.
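
For illustration only, a pre-processing step of that shape might decrypt an inbound file, clean up line endings, and reconcile the detail row count against a trailer record before the load; the file names, trailer convention and key setup below are assumptions, not the client's actual feed.

#!/bin/bash
# Hypothetical file pre-processing: decrypt, clean up, and reconcile against the trailer.
# Assumes the GPG key is already on the ETL user's keyring; names and layout are invented.
IN=/var/etl/inbound/claims_$(date +%Y%m%d).dat.gpg
OUT=${IN%.gpg}

gpg --batch --yes --output "${OUT}" --decrypt "${IN}" || exit 1

sed 's/[[:space:]]*$//' "${OUT}" > "${OUT}.tmp" && mv "${OUT}.tmp" "${OUT}"   # strip trailing whitespace / CR

DETAIL=$(awk 'END {print NR - 1}' "${OUT}")   # every row except the trailer
TRAILER=$(awk 'END {print $2}' "${OUT}")      # trailer assumed to look like: T <rowcount>

if [ "${DETAIL}" -ne "${TRAILER}" ]; then
    echo "Reconciliation failed: ${DETAIL} detail rows vs trailer count ${TRAILER}" >&2
    exit 1
fi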

Worked on performance tuning at the SQL and database object level using the methods below, illustrated in the sketch after this list:

SQL Tuning using Indexes (Bitmap, B-Tree and Function-based)

SQL Hints

Analyze Table statistics with different sampling

Partition Pruning – Range, Hash and Range-Hash partitioning methods

Histograms

Using TKPROF, SQL Tuning Advisor, AWR and ADDM reports.
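
A condensed, purely illustrative pass of that kind (the schema, table, index names and sampling percentage are invented) might look like:

#!/bin/bash
# Hypothetical SQL tuning pass: function-based index, fresh statistics with histograms,
# and an index hint. All object names and percentages are placeholders.
sqlplus -s etl_user/"${ETL_PWD}"@DWH <<'SQL'
-- Function-based index so UPPER() predicates can use an index range scan
CREATE INDEX idx_policy_upper_state ON policy_dim (UPPER(state_cd));

-- Re-gather statistics on a 20% sample; SIZE AUTO lets Oracle decide where histograms help
EXEC DBMS_STATS.GATHER_TABLE_STATS(ownname => 'DWH', tabname => 'POLICY_DIM', -
     estimate_percent => 20, method_opt => 'FOR ALL COLUMNS SIZE AUTO');

-- Hint the index if the optimizer still prefers a full table scan
SELECT /*+ INDEX(p idx_policy_upper_state) */ policy_id, state_cd
FROM   policy_dim p
WHERE  UPPER(state_cd) = 'IL';
SQL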

Testing the code for all possible scenarios.

Analyzing the code for the defects raised by Business.

Resolving the defects raised by Business.

Project # III:

Project: Enterprise Client File (ECF)
Customer: Continental Casualty Company (CNA)
Period: February 2013 – August 2013

Description

ECF is an enterprise upgrade program. The database and ETL versions are upgraded to meet enterprise client requirements, enabling business users with enhanced functionality.

Role: ETL Designer & Developer
Database: Oracle
Tools: Informatica PowerCenter 9.1, Oracle SQL Developer, Toad, SQL*Plus, WinSCP, SecureCRT, Control-M

Responsibilities

Technical Lead for development team of 4 people for data warehouse applications.

Interacting with the onsite team and SMEs to understand the business requirements.

Providing Estimates – Post Concept/Post Definition for all the requirements.

Working on requirement gathering, analysis, and preparation of high-level and detailed designs.

Coordinating with the infrastructure support team on the migration of ETL mappings from Informatica 8.6 to 9.1 to support the Oracle 12c database.

Worked on the creation of a new environment setup to support the ETL process, which involves:

Creation of new shared directory, Unix group and users for Merlin MV application

Creation of new PMCMD user and ETL connectors for source database

Creation of Unix shell scripts to execute ETL workflows and to perform database operations like truncate, analyze, disable and drop indexes/tables

Successfully executed the ETL jobs from the Control-M scheduler and performed regression tests to ensure data consistency.

Performed a dry run in production to avoid environment setup issues before the actual deployment.

Project # IV:

Project: Catastrophe Management (CATMAN)
Customer: Continental Casualty Company (CNA)
Period: March 2011 – February 2013

Description

CATMAN is an enterprise catastrophe management reporting application which enables Actuaries and Underwriters to analyse different catastrophic risks for better pricing of different insurance policies. It also helps cat modellers to assess the risk in a portfolio of exposures.

Role: ETL Designer & Developer
Database: Oracle
Tools: Informatica PowerCenter 8.6, Oracle SQL Developer, Toad, SQL*Plus, WinSCP, SecureCRT, Control-M

Responsibilities

Have worked on all requirements related to CATMAN.

Interacting directly with customers on all the business requirements.

Providing Estimates – Post Concept/Post Definition for all the requirements.

Working on requirement gathering, analysis, and preparation of high-level and detailed designs.

Worked with SMEs and UAT users to design and test the requirements.

Worked on the addition of the new catastrophic risk FLOOD to the property package policies, integrating it with the existing perils while applying all the existing business rules and validations.

Worked on integrating the standardized Geo-code information from an external vendor.

Worked on standardizing addresses and Geo-code information using regular expressions to provide consistent risk location information to catastrophic modelers.
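
Purely as an illustration of the kind of rule involved (the project implemented equivalent logic with Informatica regular-expression functions such as REG_REPLACE, and the patterns and file names here are invented), a couple of standardization rules expressed in shell would be:

#!/bin/bash
# Hypothetical address standardization rules: collapse whitespace and expand trailing
# street-type abbreviations. Patterns and file names are placeholders.
sed -e 's/  */ /g' \
    -e 's/ ST\.* *$/ STREET/' \
    -e 's/ AVE\.* *$/ AVENUE/' \
    -e 's/ BLVD\.* *$/ BOULEVARD/' \
    addresses_raw.txt > addresses_std.txt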

Worked on improving the performance of the CATMAN ETL batch process using concurrent workflow execution along with SQL and Informatica mapping/session tuning.


