Sr. ETL Developer

Location: Orlando, FL
Posted: December 15, 2023

Sr. ETL/Datawarehouse Developer

Manoj Yakamuru

Phone: 213-***-****

Email: ad1zsc@r.postjobfree.com

Summary:

Over 7 years of experience in analysis, design, development, testing, implementation, enhancement, and support of BI applications, including strong experience as a Data Warehouse Consultant in Data Warehousing (ETL & OLAP) environments.

Proficient in using the ETL tools Informatica PowerCenter 10.x/9.x/8.x and Informatica B2B DT Studio to develop data warehouse loads, with work experience focused on data integration per client requirements.

Data integration with SFDC CRM using Informatica Cloud (IICS).

Expertise in designing ETL architecture involving Teradata, Oracle, DB2, MySQL, and SQL Server databases, as well as flat files (fixed width, delimited) and XML.

Knowledge of dimensional modeling, including star and snowflake schemas.

Loaded fact and dimension tables per reporting requirements, designed for ease of future enhancement.

Knowledge in Data Flow Diagrams, Process Models, and ER diagrams with modeling tools like ERWIN & VISIO.

Expertise in working with relational databases such as Oracle 12c/11g/10g/9i, SQL Server 2005/2008/2017, DB2, Teradata 15/14/13/12 and MySQL.

Extensive experience in developing Stored Procedures, Functions, Views and Triggers and Complex SQL.
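
An illustrative sketch of the kind of complex SQL involved, using an analytic function to keep only the most recent row per customer (the view, table, and column names are hypothetical):

    CREATE OR REPLACE VIEW vw_customer_latest AS
    SELECT customer_id, customer_name, status, updated_dt
    FROM (
        SELECT c.*,
               ROW_NUMBER() OVER (PARTITION BY customer_id
                                  ORDER BY updated_dt DESC) AS rn
        FROM   stg_customer c
    ) t
    WHERE rn = 1;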

Snowflake modeling: roles, databases, and schemas.

Good experience with Snowflake utility SnowSQL.
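
A minimal SnowSQL sketch of staging a file and loading it with COPY INTO (the stage, table, and file names are hypothetical):

    -- run via: snowsql -f load_customer.sql
    PUT file:///data/out/customer_20230101.csv @etl_stage AUTO_COMPRESS=TRUE;

    COPY INTO edw.customer_stg
      FROM @etl_stage/customer_20230101.csv.gz
      FILE_FORMAT = (TYPE = CSV FIELD_DELIMITER = ',' SKIP_HEADER = 1)
      ON_ERROR = 'ABORT_STATEMENT';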

Experience in analyzing and developing Informatica B2B data transformation projects for processing unstructured and semi-structured data file formats like XML, Excel, JSON, Text formats.

Developed mappings in Informatica to load the data from various sources into the Data Warehouse, using different transformations.

Extensive experience in designing and developing complex mappings applying various transformations such as lookup, source qualifier, update strategy, router, sequence generator, aggregator, rank, stored procedure, filter, joiner and sorter transformations and Mapplets.

Extensive experience in developing and configuring workflows, worklets, sessions, and mappings.

Proficient in delivering high data quality by designing, developing, and automating audit processes and implementing the corresponding reconciliation processes.

Excellent knowledge of identifying performance bottlenecks and tuning mappings and sessions using techniques such as partitioning.

Developed integration processes using Informatica Intelligent Cloud Services (IICS) and Informatica PowerCenter, following data-movement best practices such as audit, balance and control, change data capture, and error handling.

Developed load and unload scripts using Teradata utilities such as FastExport, FastLoad, MultiLoad, TPump, and TPT.

Extensive knowledge with Teradata SQL Assistant.

Developed BTEQ scripts to load data from the Teradata staging area to the data warehouse, and from the data warehouse to data marts for specific reporting requirements.

Tuned existing BTEQ scripts to enhance performance using the EXPLAIN plan.

Experience in MongoDB document-based data system and non-relational data modeling and CRUD operations.

Extensive knowledge of indexes, collecting statistics, and identifying data skew.
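
A small Teradata query of the sort used to check skew, counting rows per AMP for a table's primary index column (table and column names are hypothetical):

    SELECT HASHAMP(HASHBUCKET(HASHROW(customer_id))) AS amp_no,
           COUNT(*) AS row_cnt
    FROM   edw.customer_dim
    GROUP BY 1
    ORDER BY 2 DESC;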

Experience in UNIX shell scripting, FTP and file management in various UNIX environments.

Strong understanding of Data warehouse project development life cycle. Expertise in documenting all the phases of DWH projects.

Expertise in ETL production support, both monitoring and fixing load failures as well as resolving data issues raised by customers in the reporting layer.

Provided production support on a rotational roster basis.

Handled different severity tickets related to production support.

Worked with Service Now & BMC Remedy trouble ticket management tools.

Worked in Both Waterfall & Agile Methodologies.

Expertise with scrum meetings, sprint planning, and Agile tools such as Rally and Jira.

Excellent team player and self-starter with the ability to work independently, and strong analytical, problem-solving, and logical skills.

Strong experience in data migration from RDBMS to the Snowflake cloud data warehouse.

Led a migration project from Teradata to the Snowflake warehouse to meet customer needs.

Technical Skill Set:

Data Warehousing

Informatica Power Center 10.1/9.6.1/9.5.1/9.x/9.1/8.6.1, SSIS 2005/2017, Informatica cloud (IICS), SAP Data Services, Talend

Databases

Oracle 12c/11g/10g/9i, SFDC, Sybase 12.0, DB2, MySQL, Vertica, MongoDB, Teradata 15/14/14.10/13/12 (FastLoad, MultiLoad, TPump, TPT and BTEQ), SQL Server

BI Tools

Cognos, Business Objects, Tableau

Data Modeling

ERWIN

Job Scheduling

CA AutoSys, Tidal (TES), Control-M, Tivoli (TWS), crontab, Informatica Scheduler, ESP

Programming

Unix Shell Scripting, XML, SQL and PL/SQL

Environment

Windows, Unix, Linux

Other Tools

SQL Server Management Studio, Toad, WinSQL, SQL Developer, PuTTY, WinSCP, JIRA, Informatica B2B DT, DbVisualizer, GitHub, Jenkins, Snowflake, Informatica EDC

Professional Experience

Reinsurance Group of America, St. Louis, Missouri Apr 2021 – Present

Sr. Datawarehouse Developer

Reinsurance Group of America (RGA), Incorporated is a leader in the global life and health reinsurance industry, working to make financial protection accessible to all. RGA serves clients from operations in 26 markets around the world, delivering expert solutions in individual life reinsurance, individual living benefits reinsurance, group reinsurance, financial solutions, facultative underwriting, and product development.

Responsibilities:

Coordinated and led interactions across portfolio architects, enterprise data modeling architects, DBAs, and business teams to finalize business requirements, ETL architecture, and data modeling, and translated the requirements into technical specifications.

Worked on Technical documentation like preparing High Level Design Documents, Low Level Design Documents and Source to Target mapping sheets.

Involved in the Logical and physical data modeling and prepared the Source data volumetric information with DBA to plan Indexes, table spaces, hashing and partition keys.

Prepared and reviewed unit test cases, executed them in QC, and documented the results.

Managed and guided the team in designing and developing the ETL artifacts.

Reviewed code developed by team members and provided feedback.

Involved in test data preparation for SIT, UAT phases.

Participated in reviews with the customers to obtain sign-off on design and mapping documents.

Provided technical support and troubleshooting for the data warehouse application to meet Service Level Agreements (SLAs).

Worked on a flexible mapping concept using Informatica DT to ingest any format layout driven through Excel, switching codes dynamically based on the layout of the file.

Created B2B DX partners, profiles, workflows, and endpoints for onboarding clients.

Worked on Split process, Merge process of multi-client files using Informatica B2B Data transformation and Informatica B2B Data Exchange.

Designed/Developed IDQ reusable mappings to match accounting data based on demographic information.

Worked with IDQ on data quality, covering data cleansing, removal of unwanted data, and data correctness.

Created tasks in the Workflow Manager, and exported and executed IDQ mappings.

Worked on IDQ parsing, IDQ Standardization, matching, IDQ web services.

Imported the mappings developed in data quality (IDQ) to Informatica designer.

Worked on the Informatica Analyst tool (IDQ) to produce scorecard reports for data issues. Extensively worked on performance tuning of ETL procedures and processes.

Handled change requests (CRs) and enhancements for the existing application and followed the change management process.

Created Informatica mappings using Source qualifier, connected/Unconnected Lookup, joiner, Sorter, Aggregator, Union, and Router transformations to extract, transform and load the data to Staging, ODS, Data warehouse and finally data mart area to enable Reporting capabilities.

Created persistent lookup caches to avoid building the same cache multiple times wherever applicable, thereby improving the performance of the ETL jobs.

Created reusable transformations/mapplets and used them across various mappings.

Created reusable user-defined functions to perform business validations and used them across various projects within the portfolio.

Worked on Slowly Changing Dimensions as per business requirement.

Designed and developed Audit, Balancing and Control process.
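
An illustrative balancing query of the kind used in such a process, comparing source and target row counts per batch and recording the outcome (the audit table and other names are hypothetical):

    INSERT INTO etl_audit (batch_id, table_name, src_cnt, tgt_cnt, status)
    SELECT s.batch_id, 'CUSTOMER_DIM', s.cnt, t.cnt,
           CASE WHEN s.cnt = t.cnt THEN 'BALANCED' ELSE 'OUT_OF_BALANCE' END
    FROM  (SELECT batch_id, COUNT(*) AS cnt FROM stg_customer GROUP BY batch_id) s
    JOIN  (SELECT batch_id, COUNT(*) AS cnt FROM customer_dim GROUP BY batch_id) t
      ON  s.batch_id = t.batch_id;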

Worked on Performance Tuning of sources, targets, mappings, transformations, and sessions, by implementing various techniques like partitioning techniques and pushdown optimization and identifying performance bottlenecks from logs.

Implemented Informatica concurrent workflow execution to improve the performance.

Wrote UNIX Korn shell scripts for file transfer/archiving/Email notifications.

Tuned SQL queries in the Source Qualifier and Lookup transformations to cache only the required volume of data, avoiding unwanted data wherever possible to minimize data transfer over the network.
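
A hedged example of such a lookup SQL override, caching only the active rows and the columns actually needed rather than the full table (the names and filter window are hypothetical):

    SELECT acct_id,
           acct_status,
           open_dt
    FROM   edw.account_dim
    WHERE  acct_status = 'ACTIVE'
      AND  open_dt >= ADD_MONTHS(TRUNC(SYSDATE), -24);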

Guided and supported the teams during assembly/integration testing, QA, and UAT.

Performed data validation and error analysis and provided resolutions/feedback to data experts so the data could be corrected.

Provided knowledge transfer and prepared KT documents for the Production Support team for ongoing maintenance of the code.

Worked with the Scheduling/Operations teams for scheduling the ETL jobs on Tidal.

Experience working with Informatica EDC for end-to-end data analysis to gain a better understanding of the legacy systems.

Environment: Informatica 10.5, Informatica Data Quality (IDQ), Informatica B2B DT, Informatica Analyst, Oracle, SQL Server, JIRA, GitHub, Jenkins, DbVisualizer, UNIX, Toad for DB2, PuTTY, Tidal, Microsoft Team Foundation Server, Microsoft Azure DevOps, ServiceNow, Windows, Slack, Lucidchart.

City National Bank, Los Angeles, CA Jun 2018 – Mar 2021

Informatica Developer/Data Analyst & Production Support Engineer

City National Bank (CNB) is a bank holding company headquartered at City National Plaza in Los Angeles, California. CNB is a subsidiary of the Toronto-based Royal Bank of Canada and it is the 37th largest bank in the United States as of March 31, 2018. The bank had total assets of $51.1 billion (as of January 31, 2019). It offers a full complement of banking, trust and investment services through 71 offices.

Responsibilities:

Involved in gathering requirements from business users and analysts from the beginning to perform requirement analysis.

Involved in all the phases of SDLC: Design, Development, Testing, Production Support.

Loaded source data from different databases into target databases and was involved in the design and development.

Developed complex ETL mappings and stored procedures in an optimized manner.

Extensively worked with Erwin Data modeler for Physical and Conceptual data modeling.

Developed the ETL process architecture for the credit card data mart load process.

Regularly worked with Business and data modeler to discuss the requirements and possible Impact on current load.

Identified ETL design patterns to remove EDW deficiencies.

Developed ETL Load process for Data Mart batch processing.

Extensively used SCD Type I and Type II in Informatica mappings as well as in SQL scripting.
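
A minimal SQL sketch of the Type II pattern, closing the current dimension row when a tracked attribute changes and inserting the new version (table and column names are hypothetical):

    -- 1) expire the current version when a tracked attribute changed
    UPDATE customer_dim d
    SET    d.curr_flag  = 'N',
           d.eff_end_dt = TRUNC(SYSDATE) - 1
    WHERE  d.curr_flag = 'Y'
      AND  EXISTS (SELECT 1 FROM stg_customer s
                   WHERE  s.customer_id = d.customer_id
                     AND  s.address     <> d.address);

    -- 2) insert the new version as the current row
    INSERT INTO customer_dim
           (customer_id, address, eff_start_dt, eff_end_dt, curr_flag)
    SELECT s.customer_id, s.address, TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
    FROM   stg_customer s
    WHERE  NOT EXISTS (SELECT 1 FROM customer_dim d
                       WHERE  d.customer_id = s.customer_id
                         AND  d.curr_flag   = 'Y'
                         AND  d.address     = s.address);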

Used Workflow Manager to create sessions, workflows and batches to run the Informatica mappings.

Implemented performance tuning logic on targets, sources, mappings, sessions to provide maximum efficiency and performance.

Regularly used Router, Joiner, Union, Filter, Dynamic Lookup, and other transformations in mappings.

Deployed and scheduled Python scripts to the production environment.

Extensively used Toad utility for executing SQL scripts and worked on SQL for enhancing the performance of the Informatica mapping.

Created Oracle stored procedures, functions, and packages for data migration and stage data loads.
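
A minimal PL/SQL sketch of such a stage-load procedure (the procedure, table, and column names are hypothetical):

    CREATE OR REPLACE PROCEDURE load_card_txn_stg (p_batch_id IN NUMBER) AS
    BEGIN
        -- clear the staging table for the batch, then reload from the source view
        DELETE FROM card_txn_stg WHERE batch_id = p_batch_id;

        INSERT INTO card_txn_stg (batch_id, txn_id, acct_id, txn_amt, txn_dt)
        SELECT p_batch_id, txn_id, acct_id, txn_amt, txn_dt
        FROM   src_card_txn_v
        WHERE  txn_dt >= TRUNC(SYSDATE) - 1;

        COMMIT;
    EXCEPTION
        WHEN OTHERS THEN
            ROLLBACK;
            RAISE;
    END load_card_txn_stg;
    /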

Created indexes and table partitions for faster retrieval of the credit card data and to enhance database performance.
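
Illustrative Oracle DDL for that kind of structure: a range-partitioned transaction table with a local index for date-bounded account lookups (all names are hypothetical):

    CREATE TABLE card_txn (
        txn_id   NUMBER,
        acct_id  NUMBER,
        txn_amt  NUMBER(12,2),
        txn_dt   DATE
    )
    PARTITION BY RANGE (txn_dt) (
        PARTITION p2020 VALUES LESS THAN (DATE '2021-01-01'),
        PARTITION p2021 VALUES LESS THAN (DATE '2022-01-01'),
        PARTITION pmax  VALUES LESS THAN (MAXVALUE)
    );

    CREATE INDEX ix_card_txn_acct ON card_txn (acct_id, txn_dt) LOCAL;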

Extensively worked in production support, both primary and secondary, available 24x7 to resolve any ETL load issues.

Loaded Data from various Salesforce.com (SFDC) objects into Oracle database

Used SAP BODS, Oracle 11g 2008 extensively for Data Conversion/Migration tasks

Configured and loaded data from flat files to SAP BI using BODS.

Developed the workflows to load the data into DW from legacy systems.

Engaged in application design and schemaless data modeling for legacy systems in MongoDB.

Data migration: used techniques such as de-duplication of customer data, pattern matching, string matching, the Validate transform, and domain value checks.

Created the Local Repositories, Central Repositories and configured the Local with Job server and Central Management console (CMC).

Developed jobs to load the SAP R/3 tables into the staging area and developed transformations that apply the business rules given by the client and load the data into the target database.

Developed mappings (IICS) where data from Oracle data sources is updated to the respective SFDC objects.

Created Web service source and target in mapping designer and published Web services

Created mappings using Informatica Cloud transformations (Source and Target, Data Masking, Expression, Filter, Hierarchy Builder, Hierarchy Parser, Joiner, Lookup, Router, Sequence Generator, Sorter) to load the data from Dynamics CRM to databases and vice versa.

Implemented change data capture (CDC) using Informatica PowerCenter for Oracle and SAP systems.

Worked on POCs to explore options for building code as part of the migration from webMethods to Informatica DT.

Implemented Informatica B2B DT suite projects to convert unstructured input data of any format into the required format using the B2B DT Parser, Mapper, and Serializer components, and executed them in Informatica PowerCenter workflows.

Used ESP Scheduling tool to Schedule the jobs.

Used Microsoft Azure DevOps/Jira for Defect tracking and project management.

Extensively used Informatica deployment groups for code deployment.

Responsible for managing, scheduling, and monitoring workflow sessions.

Migrated DB objects from Oracle to the Snowflake cloud data warehouse.
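
A hedged example of translating an Oracle table definition into Snowflake during such a migration; Snowflake needs no tablespace or storage clauses, and NUMBER/VARCHAR2 map to NUMBER/VARCHAR (the table itself is hypothetical):

    CREATE OR REPLACE TABLE edw.customer_dim (
        customer_sk   NUMBER(18,0),
        customer_id   VARCHAR(30),
        customer_name VARCHAR(200),
        eff_start_dt  DATE,
        eff_end_dt    DATE,
        curr_flag     CHAR(1)
    );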

Environment:

Informatica PowerCenter 10.2/10.1, Informatica B2B DT, Informatica B2B Data Exchange, Informatica Cloud (IICS), Oracle 11g/12c, MS SQL Server, MySQL, Flat Files, ESP Scheduler, SAP BusinessObjects Data Services, MongoDB 3.6/4.0, Microsoft Team Foundation Server, Microsoft Azure DevOps, ServiceNow, JIRA, Windows, Unix.

Goldman Sachs & Co, Jersey City, NJ Jan 2017 – Jun 2018

ETL/Teradata Developer

The Goldman Sachs Group, Inc. is a leading global investment banking, securities, and investment management firm that provides a wide range of financial services to a substantial and diversified client base that includes corporations, financial institutions, governments, and individuals. Founded in 1869, the firm is headquartered in New York and maintains offices in all major financial centers around the world.

The project involved creating and supporting an EDW in Teradata.

Responsibilities:

Involved in requirement gathering along with business analysts, and in business analysis, design, development, testing, and implementation of business rules.

Translated customer requirements into formal requirements and design documents.

Developed scripts for loading the data into the base tables in the EDW using the FastLoad, MultiLoad, and BTEQ utilities of Teradata.

Wrote MultiLoad, FastLoad, and BTEQ scripts for loading the data into stage tables and then processing it into BID.

Developed scripts to load data from source to staging and from the staging area to target tables using load utilities such as BTEQ, FastLoad, and MultiLoad.
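
A minimal BTEQ-style sketch of such a staging-to-target load (logon details are omitted and the table names are hypothetical):

    .LOGON tdprod/etl_user;
    .SET ERRORLEVEL UNKNOWN SEVERITY 8;

    INSERT INTO edw.claim_fact (claim_id, member_id, claim_amt, load_dt)
    SELECT s.claim_id, s.member_id, s.claim_amt, CURRENT_DATE
    FROM   stg.claim s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   edw.claim_fact f
                       WHERE  f.claim_id = s.claim_id);

    .IF ERRORCODE <> 0 THEN .QUIT 8;
    .LOGOFF;
    .QUIT 0;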

Wrote scripts for data cleansing, data validation, and data transformation for the data coming from different source systems.

Developed mappings in Informatica to load the data from various sources using different transformations such as Lookup (connected and unconnected), Normalizer, Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Filter, Rank, and Router.

Implemented Type 1 & Type 2 Slowly Changing Dimensions.

Used Target Load Plan in Informatica Effectively.

Used Mapping Parameters to parameterize the objects for migration.

Created Different Tasks (Session, Command, Email, Event Wait, File Watcher) in Informatica for various purposes.

Implemented CDC using mapping variable concept.
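
A sketch of that pattern, assuming a mapping variable named $$LAST_EXTRACT_TS (a hypothetical name) referenced in the source qualifier SQL override; the variable is advanced in the mapping so the next run picks up only newer rows:

    SELECT txn_id, acct_id, txn_amt, updt_ts
    FROM   src_txn
    WHERE  updt_ts > TO_TIMESTAMP('$$LAST_EXTRACT_TS', 'YYYY-MM-DD HH24:MI:SS');
    -- an expression port such as SETMAXVARIABLE($$LAST_EXTRACT_TS, updt_ts)
    -- persists the new high-water mark when the session completes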

Developed and reviewed detailed design documents and technical specification documents for the end-to-end ETL process flow for each source system.

Involved in Unit Testing and Preparing test cases.

Modified database views and performed performance tuning and workload management.

Maintained access rights and role rights, priority scheduling, Dynamic Workload Manager, Database Query Log, database administration, Partitioned Primary Indexes (PPI), multi-value compression analysis, usage collections and reporting (ResUsage, AMP usage), and security administration setup, and led a team of developers working with different users on complicated technical issues.

Performed application-level DBA activities such as creating tables and indexes, and monitored and tuned Teradata BTEQ scripts.

Performed query analysis using EXPLAIN, checking for unnecessary product joins, confidence factors, join types, and the order in which the tables are joined.

Collected Multi-Column Statistics on all the non-indexed columns used during the join operations & all columns used in the residual conditions.
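
An illustrative statistics collection of that kind on join and residual-condition columns (table and column names are hypothetical):

    COLLECT STATISTICS
        COLUMN (acct_id),
        COLUMN (txn_dt),
        COLUMN (acct_id, txn_dt)
    ON edw.txn_fact;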

Extensively used derived tables, volatile tables, and GTT tables in many of the ETL scripts.
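
A small example of the volatile-table pattern, holding an intermediate result that lives only for the session and needs no cleanup (names are hypothetical):

    CREATE VOLATILE TABLE vt_active_members AS (
        SELECT member_id, plan_cd
        FROM   edw.member_dim
        WHERE  curr_flag = 'Y'
    ) WITH DATA
    ON COMMIT PRESERVE ROWS;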

Tuned Teradata SQL statements using EXPLAIN, analyzing data distribution among AMPs and index usage, collecting statistics, defining indexes, revising correlated subqueries, using hash functions, etc.

Environment:

Informatica 10.1/9.6.1, Teradata 15/14, Oracle, My Sql, Flat Files, Teradata SQL Assistant, BTEQ, MLOAD, TPUMP, FASTLOAD, FASTEXPORT, TPT, Service Now, SVN, Rally, Windows, Unix.

Education:

Master's in Computer Science from Texas A&M University

Bachelor's in Computer Science Engineering from JNTUK (India)


