Business Intelligence Data Warehousing

Location:
Lancaster, SC, 29720
Posted:
March 04, 2024

Resume:

Subba Vazrala

Work Authorization: US Citizen
Preferred Location: Charlotte / Remote

Current Location: Charlotte, NC
Email: ad33k8@r.postjobfree.com

Ph: 704-***-****

SKILL MATRIX:

Overall 18+ years

ETL 16+ years

Informatica PowerCenter 15+ years

Informatica Power Exchange 3+ Years

Informatica Intelligent Cloud Services (IICS) 3+ years

Oracle 10+ years

Lead Experience 5+ years

SQL 15+ years

Teradata 5+ years

Unix Shell Scripting 10+ years

Application maintenance 5+ years

Production issues 10+ years

Position Qualifications

Over 18 years of IT experience in the analysis, design, development, and implementation of Business Intelligence solutions using Data Mart/Data Warehouse design, ETL, OLAP, and client/server and web applications on UNIX and Windows platforms, across domains including Healthcare, Banking, Finance, Insurance, and Manufacturing.

15 years’ experience designing, developing, and maintaining Data Warehouses and Data Marts using Informatica PowerCenter, IICS/IDMC, Teradata, Oracle, DB2, and SQL Server.

2 years of Business Intelligence experience using Business Objects 6.5/5.1 (Designer, Business Objects, and Migration) and MicroStrategy.

Experience creating High-Level Design (HLD), Business Requirement (BRD), and Functional Specification (FSD) documents.

Expertise in the creation, modification, and testing of Informatica mappings, mapplets, workflows, and worklets, and in the debugging and performance tuning of Informatica mappings and sessions.

Experienced in performance tuning of critical Informatica PowerCenter jobs, database loader jobs, and Data Warehousing applications.

Expert in creating complex Informatica mappings per business requirements and handling ETLs that load huge volumes of data.

Expert in creating Informatica mappings using transformations such as Source Qualifier, Aggregator, Expression, Lookup, Router, Filter, Update Strategy, and Joiner.

Proficient in writing and using SQL queries, stored procedures, and materialized views, and in applying query optimization and tuning techniques.
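For illustration, a minimal sketch of the materialized-view pattern in Oracle-style SQL; the table and column names are hypothetical, not drawn from any client engagement.

/* Pre-aggregate transaction detail so summary reports avoid rescanning the base table. */
CREATE MATERIALIZED VIEW mv_daily_txn_summary
REFRESH COMPLETE ON DEMAND
AS
SELECT account_id,
       TRUNC(txn_date) AS txn_day,
       SUM(txn_amount) AS total_amount,
       COUNT(*)        AS txn_count
FROM   txn_detail
GROUP  BY account_id, TRUNC(txn_date);

A COMPLETE/ON DEMAND refresh is the simplest variant; a FAST (incremental) refresh would additionally require materialized view logs on the base table.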

Experienced in replicating data mappings and transformations from PowerCenter mappings to IDMC models.

Extensively worked on testing the converted IDMC models to ensure they produce expected results and behave consistently with the original PowerCenter mappings; validated data integrity, transformation logic, and end-to-end functionality of the converted models against sample data sets and test cases.

Extensively worked on Informatica Designer, Work Flow Manager, Work Flow Monitor and Repository Manager.

Hands-on experience in ETL performance tuning of sources, targets, transformations, and sessions using Informatica PowerCenter.

Wrote high-quality SQL, PL/SQL, and shell scripting code based on business requirements.

Experience with Slowly Changing Dimensions (Type 1, Type 2, and Type 3), OLTP, OLAP, data reports, surrogate keys, and normalization/denormalization.
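As a concrete illustration of the SCD Type 2 pattern, a minimal SQL sketch; the table, column, and sequence names are hypothetical, and the :src_* placeholders stand for incoming source values.

/* Step 1: close out the current row when a tracked attribute changes. */
UPDATE customer_dim
SET    end_date     = CURRENT_DATE,
       current_flag = 'N'
WHERE  customer_nk  = :src_customer_nk
AND    current_flag = 'Y'
AND    (address <> :src_address OR phone <> :src_phone);

/* Step 2: insert the new version with a fresh surrogate key and an open-ended end date. */
INSERT INTO customer_dim
       (customer_sk, customer_nk, address, phone,
        start_date, end_date, current_flag)
VALUES (customer_sk_seq.NEXTVAL, :src_customer_nk, :src_address, :src_phone,
        CURRENT_DATE, DATE '9999-12-31', 'Y');

Type 1 would simply overwrite the attribute in place, while Type 3 would retain the prior value in a dedicated "previous" column.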

Expertise in different load types, including normal and bulk loading; involved in initial, incremental, daily, and monthly loads.

Extensively used API modules and populated portal tables in AWS.

Extensively worked on SoapUI scripts and executed them via APIs for data manipulation in portal tables.

Extensively worked on MDM batch jobs to populate data into portal tables.

Created generic interfaces for SFTP files from the CDP/ADLS platform using IICS.

Used IICS to pull data from the Azure Synapse database to generate flat files and log audit entries.

Created reusable DI mappings, DI tasks and MI tasks using IICS.

Experienced in creating entity-relational and dimensional data models using the Kimball/Inmon methodologies, i.e., Star Schema and Snowflake Schema.
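By way of example, a minimal star-schema sketch in standard SQL (all object names hypothetical): dimension tables hold descriptive attributes behind surrogate keys, and the fact table carries measures keyed to those surrogate keys; a snowflake variant would further normalize the dimensions.

/* Dimension tables: descriptive attributes behind surrogate keys. */
CREATE TABLE date_dim (
    date_sk       INTEGER PRIMARY KEY,
    calendar_date DATE NOT NULL
);

CREATE TABLE customer_dim (
    customer_sk   INTEGER PRIMARY KEY,
    customer_name VARCHAR(100)
);

/* Fact table: measures at the grain of one sale per customer per day. */
CREATE TABLE sales_fact (
    date_sk     INTEGER NOT NULL REFERENCES date_dim (date_sk),
    customer_sk INTEGER NOT NULL REFERENCES customer_dim (customer_sk),
    sale_amount DECIMAL(12,2)
);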

Extensively worked with Teradata utilities such as BTEQ, FastExport, FastLoad, MultiLoad, and TPT to export and load data to/from different source systems, including flat files.
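A minimal BTEQ sketch of this kind of load step (BTEQ wraps SQL with dot commands); the logon string, masked password, and object names are all hypothetical.

.LOGON tdprod/etl_user,********;

/* Move staged rows into the target table; names are illustrative. */
INSERT INTO edw.loan_fact (loan_id, balance_dt, balance_amt)
SELECT loan_id, balance_dt, balance_amt
FROM   stg.loan_stage;

/* Exit with a non-zero return code on failure so a scheduler such as Autosys can detect it. */
.IF ERRORCODE <> 0 THEN .QUIT 8;

.LOGOFF;
.QUIT 0;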

Validated, tested, and debugged errors within ETL transformations; troubleshot and resolved application problems with the Informatica Debugger.

Good experience in UNIX shell scripting; developed UNIX scripts using the pmcmd utility and scheduled ETL loads using utilities such as crontab and Autosys.

Exposure to various Data Warehousing domains, including Insurance, Finance, Pharmaceuticals, and Sales.

Extensive experience in the offshore/onsite model; received good customer feedback for timely delivery and meeting expectations.

Hands-on experience with the ITIL model; supported Level 1 and Level 2 staff on production issues and was involved in tuning Informatica jobs as a Performance group member on the Level 3 team.

Professional Experience

Client & Location: Medica, Charlotte, NC

Position: Informatica Lead Developer Jan’2023 – Present

Project: MDM Member

Roles & Responsibilities:

Provided architectural roadmap, direction, and work packets for ETL needs.

Wrote documentation describing program development, logic, coding, testing, changes, and corrections.

Worked on creating and modifying Informatica mappings using transformations such as Source Qualifier, Aggregator, Expression, Lookup, Router, Filter, Update Strategy, and Joiner.

Developed Informatica mappings, sessions, and workflows for sourcing.

Created complex Informatica mappings per business requirements and handled ETLs to load huge volumes of data.

Created a detailed ETL migration process covering Informatica, database, and scheduling.

Worked on Performance tuning of critical Informatica Power Center jobs.

Used Data Quality processes for validation, cleansing, and profiling.

Used AWS Glue to monitor ETL loads and data validations in the AWS environment.

Involved in data governance, data cleansing, and data profiling using IDQ capabilities.

Performed source data profiling and analysis against established Data Quality (DQ) rules; used Data Inventory to provide access for products and maintenance.

Used MDM Hub Match and Merge rules, batch jobs, and batch groups.

Used API modules and populated portal tables in AWS.

Worked on SoapUI scripts and executed them via APIs for data manipulation in portal tables.

Worked on MDM batch jobs to populate data into portal tables.

Prepared ETL spec for Offshore Development and ensured quality ETL deliverables.

Created common objects for the ETL team to maintain common code, and documented standards for Informatica code development.

Performed unit testing and integration testing for cloud-based applications.

Environment: Informatica PowerCenter, On-Premise Server, Oracle, UNIX, IDQ, MDM360, AWS, AWS Glue, Autosys

Client & Location: Bank of America, Charlotte, NC

Position: Senior ETL Developer Oct’2021 – Dec’2022

Project: SDAR

Roles & Responsibilities:

Provided architectural roadmap, direction, and work packets for ETL needs.

Created templates with ETL standards for Design, Development, Release and Production Support.

Prepared ETL spec for Offshore Development and ensured quality ETL deliverables.

Created common objects for the ETL team to maintain common code, and documented standards for Informatica code development.

Created a detailed ETL migration process covering Informatica, database, and scheduling.

Created several complex Informatica mappings per business requirements and handled ETLs to load huge volumes of data.

Identified and created the necessary indexes on the tables.

Provided support for SDAR inbound and outbound processes in the Risk group space.

Created generic interfaces for SFTP files from the on-premise staging server onto Hedge nodes using IICS.

Involved in replicating data mappings and transformations from PowerCenter mappings to IDMC models.

Worked on testing the converted IDMC models to ensure they produce expected results and behave consistently with the original PowerCenter mappings; validated data integrity, transformation logic, and end-to-end functionality of the converted models against sample data sets and test cases.

Wrote high-quality SQL, PL/SQL, and shell scripting code.

Created and worked on CRQ items for code fixes, data fixes, and migrations.

Used uDeploy for DB migrations and GitHub for code versioning.

Performed unit testing and integration testing for cloud-based applications.

Involved in Code Reviews & suggested best practices and presented Informatica new features in USER Group meetings.

Environment: Informatica PowerCenter 10.5, IDMC, On-Premise Server, SQL Server, UNIX, GitHub, uDeploy, Autosys

Wells Fargo, Charlotte, NC Apr’2011 – Aug’2021

Application Systems Engineer Level 5 (Consultant)

I handled the following projects for various customers during my tenure at Wells Fargo:

WDM (Wholesale Data Mart)

WCDIR (Wholesale Credit Data Infrastructure and Reporting)

ICA (Integrated Compensation Application)

ICA Next paradigm

Quatifacts

MDM

Designed and developed ETL applications to load data into the data warehouse and data marts using Informatica, Teradata BTEQs, macros, and load utilities along with TPT.

Created and analyzed complex Business Requirement Documents (BRD) and Functional Requirement Documents to develop new ETL mappings.

Created the High-Level Design, Functional Specification Document, Source-to-Target Mapping, unit test scripts, and production support documents.

Reviewed the design with the Business Architect and Business Analyst for design approval.

Developed Informatica mappings, sessions and workflows for Sourcing.

Used IICS to source data from various platforms such as Cloud/ADLS/CDS.

Created a detailed ETL migration process covering Informatica, database, and scheduling.

Created several complex Informatica mappings per business requirements and handled ETLs to load huge volumes of data.

Performance-tuned sources, targets, mappings, and SQL queries within transformations.

Extensively used performance techniques while loading data into the Azure Synapse database using IICS.

Used Informatica to load data from flat files to DB2, and from Oracle 10g and SQL Server databases to DB2.

Created ETL processes using the Web Service Consumer transformation to invoke SAS DataFlex jobs.

Created Informatica mappings using transformations such as Source Qualifier, Aggregator, Expression, Lookup, Router, Filter, Update Strategy, and Joiner.

Created reusable transformations and Mapplets and used them in complex mappings

Created, optimized, reviewed, and executed Teradata SQL test queries to validate transformation rules used in source-to-target mappings/source views and to verify data in target tables.

Performed the Unit Testing and captured the Unit test results.

Mentored the team on ETL project execution, design, and technical issues.

Provided post-implementation support, including defect correction, enhancements to existing applications, and production support on a rotational basis.

Environment: IICS, Informatica PowerCenter, MS SQL Server, SSIS, Web Services, Azure, SAS, DataFlex, DB2, Oracle 10g, Teradata, Aginity 4.2, Toad Data Point 4.0, AutoSys, UNIX Shell Scripting, Agile Methodology, JIRA, Version One, WinSCP, Tortoise SVN 1.7.5, GitHub, uDeploy

Bank of America, Charlotte, NC Jun’10 – Apr’11

Sr Informatica/Database Developer (Consultant)

MDM – Mortgage Data Mart

Project Scope: This project captures the life of a mortgage loan (i.e., from the application date to the closed date) and integrates Bank of America, Countrywide, and Merrill Lynch data. This helps functional users calculate the secondary market value of the loan.

MDM manages the Whole Loan Data Repository and supports several business functions.

Client & Location: MassMutual Financial Group, MA

Position: Sr ETL Developer Nov’09 – Jun’10

Statement Architecture – Informatica Real time.

Project Scope: MassMutual Financial Group is a Fortune 100 company which mainly deals with Investments, Insurance, and Retirement Services. The Statement Architecture project is a strategic method to provide a Quarterly Benefit Statement.

Client & Location: MassMutual Financial Group, MA

Position: Sr Informatica Developer Mar’09 - Oct’09

Customer Statement Mart (Retirement Services)

Project Scope: This project follows the strategic architecture of moving source system data into the RSS Information Factory, from which a new data mart (the Customer Statement Mart, or CSM) is populated. This mart will be used exclusively for participant statements and will provide a base for other types of statements going forward.

Client & Location: MassMutual Financial Group, MA

Position: Sr Informatica Developer / Off-shore Mentor Jul’06 - Mar’09

Life Information Factory (LIF)

Project Scope: MassMutual Financial Group is a Fortune 100 company which mainly deals with Investments, Insurance, and Retirement Services. The USIG Division of MassMutual is implementing two of its large programs, DSP and LSP, under the Legacy Modernization umbrella. The Life Information Factory will be the enterprise-level data warehouse. The purpose of the LIF project is to take advantage of the new admin system for better management and timely administration of product lines. The current systems will be replaced by a new SAP system and retired as each product is moved to the new platform. The conversion will take place in four phases.

Education:

Master of Computer Applications from Periyar University, India: May’98 – May’01

Bachelor’s degree in Computer Science from Nagarjuna University, India: Jun’93 – Mar’96


