
YUGESH RAVI

Sr ETL / Informatica Developer

Email: ad0fj8@r.postjobfree.com

Contact: 469-***-****

Professional Summary:

8+ years of experience in Information Technology with a strong background in database development, data warehousing (OLAP), and ETL processes using Informatica Power Center 10.x/9.x/8.x/7.x/6.x, Power Exchange, Designer (Source Analyzer, Warehouse Designer, Mapping Designer, Metadata Manager, Mapplet Designer, Transformation Developer), Repository Manager, Repository Server, Workflow Manager, and Workflow Monitor.

Experience in integrating various data sources with multiple relational databases such as DB2, Oracle, and SQL Server, and in integrating data from XML files and flat files (fixed-width and delimited).

Experience in writing stored procedures and functions.

Experience in SQL tuning using hints and materialized views.

Experience in developing Actimize flows and execution plans, and in configuring AIS instances and RCM repositories.

Experience in replication technologies such as Oracle GoldenGate.

Integrated Axon with Informatica EIC/EDC, achieving data governance with Informatica Axon, including lineage for DQ and impact assessment via Axon.

Led the team in driving IFRS business rule development in EDQ.

Tuned complex SQL queries by analyzing their explain plans.
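
As a minimal illustration of this kind of tuning (the tables, index hint, and filter below are hypothetical), Oracle's EXPLAIN PLAN and DBMS_XPLAN can be used to inspect the optimizer's plan before and after adding a hint:

    -- Capture the optimizer's plan for a candidate query (names are illustrative)
    EXPLAIN PLAN FOR
    SELECT /*+ INDEX(o orders_date_ix) */ o.order_id, c.cust_name
    FROM   orders o
    JOIN   customers c ON c.cust_id = o.cust_id
    WHERE  o.order_date >= DATE '2023-01-01';

    -- Display the plan and look for full scans, join order, and estimated cost
    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);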

Experience in UNIX shell programming.

Expertise in the Teradata RDBMS using the FastLoad, MultiLoad, TPump, FastExport, Teradata SQL Assistant, and BTEQ utilities.

Extensive experience with T-SQL in constructing triggers and tables and implementing stored procedures, functions, views, user profiles, data dictionaries, and data integrity.

Extensive knowledge in handling slowly changing dimensions (SCD) type 1/2/3.
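
For example, a Type 2 change is commonly handled with an expire-then-insert pattern; the sketch below is a simplified, hypothetical version (stg_customer, dim_customer, and the tracked address column are illustrative):

    -- Step 1: expire the current dimension row when a tracked attribute changed
    UPDATE dim_customer d
       SET d.row_end_dt = CURRENT_DATE,
           d.current_flag = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1 FROM stg_customer s
                    WHERE s.cust_id = d.cust_id
                      AND s.address <> d.address);

    -- Step 2: insert new keys and new versions of changed rows as current
    INSERT INTO dim_customer (cust_id, address, row_start_dt, row_end_dt, current_flag)
    SELECT s.cust_id, s.address, CURRENT_DATE, DATE '9999-12-31', 'Y'
      FROM stg_customer s
      LEFT JOIN dim_customer d
        ON d.cust_id = s.cust_id AND d.current_flag = 'Y'
     WHERE d.cust_id IS NULL;   -- no current row: the key is new or was just expired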

Experience in using Informatica to populate the data into Teradata DWH.

Experience in understanding fact tables, dimension tables, and summary tables.

Experience with AWS cloud services such as EC2, S3, RDS, ELB, EBS, VPC, Route 53, Auto Scaling groups, CloudWatch, CloudFront, and IAM for installing, configuring, and troubleshooting various Amazon images for server migration from physical infrastructure into the cloud.

Good knowledge of Azure Data Factory and SQL Server Management Studio 18.

Migrated existing development from on-premises servers into Microsoft Azure, a cloud-based service.

Moved the company from a SQL Server database structure to an AWS Redshift data warehouse and was responsible for ETL and data validation using SQL Server Integration Services.

Technical Skills:

ETL

Informatica Developer 10.x, Informatica 10.x/9.x/8.x/7.x, DataStage 7.5/6.0, SSIS

Data Governance

Informatica EIC/EDC 10.2 with Axon, AnalytiX DS

CLOUD SERVICES

Amazon Web Services (AWS), Azure Data Engineering

Databases

Oracle, DB2, Teradata, SQL Server

Reporting Tools

Business Objects XI 3.1/R2, Web Intelligence, Crystal Reports 8.x/10.x

Database Modelling

Erwin 4.0, Rational Rose

Languages

SQL, COBOL, PL/SQL, Java, C

Web Tools

HTML, XML, JavaScript, Servlets, EJB

OS

Windows NT/2000/2003/7, UNIX, Linux, AIX

Education:

BE in Mechanical Engineering, Anna University 2015

MS in Information Technology and Management, The University of Texas at Dallas 2020

Client: BECU, Seattle Jul 2021 – Present

Role: Sr ETL/ Informatica Developer

Responsibilities:

Optimized SQL queries, ETL mappings, and related processes to enhance performance.

Crafted intricate mappings, employing transformations such as lookup, expression, update, sequence generator, aggregator, router, and stored procedure to implement complex logic.

Conducted requirement analysis and designed ETL processes for extracting data from mainframes.

Leveraged Informatica Developer 10.4.1 to develop mapping solutions.

Created and managed ETL mappings to extract data from source systems.

Developed a middle adapter bridging business application and Actimize for risk score calculation.

Automated Actimize Installer using shell scripts.

Generated data mapping documents, aligning source system data with the Actimize Model database.

Utilized C# and SQL to construct a multifactor adapter for Actimize communication.

Implemented both classic GoldenGate replication and integrated replication.

Configured GoldenGate high availability with Oracle Database.

Set up and maintained GoldenGate on Oracle 10g databases.

Addressed defects to ensure data consistency across the database.

Designed, developed, and tested stored procedures, views, and complex queries for report distribution.

Employed Bulk Copy Program (BCP) to extract data from different servers for loading into landing and staging layers.
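
BCP itself is a command-line utility; the same staging-load idea can be sketched in T-SQL with BULK INSERT (the table, file path, and delimiters below are hypothetical):

    -- Load a pipe-delimited extract into the landing table (names illustrative)
    BULK INSERT stg.customer_landing
    FROM 'D:\extracts\customer_20230131.dat'
    WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '\n', FIRSTROW = 2);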

Managed testing, modification, debugging, documentation, and implementation of Informatica mappings.

Participated in the migration and conversion of ETL processes from development to production.

Extracted, transformed, and loaded data from SQL Server sources according to business requirements.

Utilized Azure Data Engineering for data propagation across various environments.

Employed Azure DevOps for building and deploying Informatica and SQL jobs.

Debugged Informatica mappings and tested stored procedures and functions.

Developed mapplets, reusable transformations, source and target definitions, and mappings using Informatica 10.4.1.

Generated SQL queries to verify data consistency and align tables with business requirements.

Executed Informatica scheduled jobs in a UNIX environment.

Engaged in performance tuning of Informatica mappings.

Successfully migrated ETL workflows, mappings, and sessions to QA and production environments.

Worked on SQL Server jobs to address defects and maintain data consistency throughout the databases.

Demonstrated a strong understanding of source-to-target data mapping and associated ETL business rules.

Utilized Python scripts to extract data from various servers using BCP utilities.

Monitored Informatica jobs using Microsoft Orchestrator.

Environment: Informatica 10.4.1, Actimize AIS 4.21, Azure DevOps, Microsoft SQL Server, BCP, LeanKit, Python, Microsoft Orchestrator

Client: Verizon, Piscataway, New Jersey Jan 2020 – Jun 2021

Role: Sr ETL/ Informatica Developer

Description: The application is used for tracking and getting the latest status of order records. It provides a one-stop shop for Implementation Managers to track their customers’ order progress from order submission to installation. The application provides the latest data for pending orders and allows Implementation Managers to take a proactive approach to tracking their orders and reviewing milestones. EzStatus provides direct access to critical areas of orders, e.g., Order Alerts; Pending, Activated, and Cancelled work lists; Customizable Tracking; Order Grouping; Project Grouping; Spreadsheet Management; and Customer Letters.

Responsibilities:

Significantly contributed to optimizing the performance of Teradata SQL, ETL processes, and other operations.

Designed and crafted intricate mappings utilizing a variety of transformations such as lookup, expression, update, sequence generator, aggregator, router, and stored procedure to implement complex logic during mapping development.

Played a key role in requirement analysis, ETL design, and development, particularly in the extraction of data from mainframes.

Proficiently utilized Informatica Power Center 10.0.2, including designer, workflow manager, workflow monitor, and repository manager.

Configured Power Center to facilitate direct data loading into Hive, bypassing Informatica BDM for resource-efficient tasks.

Demonstrated expertise in understanding the various transformations supported by BDM for cluster execution, facilitating the creation of ETL low-level documents for developers.

Successfully implemented both classic GoldenGate replication and integrated replication.

Orchestrated GoldenGate setups to replicate data and maintain database synchronization until cutover.

Developed and maintained ETL mappings for extracting data from multiple source systems, including Oracle, SQL Server, and flat files, and loading it into Oracle.

Employed advanced T-SQL features to design and fine-tune T-SQL code for efficient interaction with the database and other applications, including the creation of business logic using T-SQL stored procedures.
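
A minimal sketch of the kind of T-SQL business-logic procedure described here (the procedure, table, and columns are hypothetical):

    -- Encapsulate a simple piece of business logic in a stored procedure
    CREATE PROCEDURE dbo.usp_pending_orders
        @Region VARCHAR(10)
    AS
    BEGIN
        SET NOCOUNT ON;
        SELECT order_id, order_date, status
        FROM   dbo.orders
        WHERE  region = @Region
          AND  status = 'PENDING';
    END;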

Designed, developed, and rigorously tested stored procedures, views, and complex queries for report distribution.

Created database triggers and stored procedures using T-SQL cursors and tables.

Developed Informatica workflows and sessions associated with mappings using the workflow manager.

Extensively utilized Perl scripts for XML file editing and line count calculations tailored to client requirements.

Possessed excellent working knowledge of Informatica EIC/EDC, IDQ, IDE, Informatica Analyst Tool, Informatica Metadata Manager, and Informatica Administration/Server Administration.

Played a pivotal role in creating new table structures and modifying existing ones to align with the existing data model.

Responsible for testing, modifying, debugging, documenting, and implementing Informatica mappings.

Worked with pre-session and post-session UNIX scripts to automate ETL jobs and participated in the migration/conversion of ETL processes from development to production.

Skillfully converted numerous Teradata-syntax macros to Greenplum/Postgres-syntax functions using SQL and PL/pgSQL.

Extracted data from diverse sources, including Oracle, external systems (flat files), SQL Server, AWS, and transformed it to meet business requirements before loading.

Utilized DTS/SSIS and T-SQL stored procedures to transfer data from OLTP databases to staging areas and ultimately to data marts, including XML operations.

Engaged in the debugging of Informatica mappings and tested stored procedures and functions.

Developed mapplets, reusable transformations, source and target definitions, and mappings using Informatica 10.0.2.

Actively participated in the performance tuning of Informatica mappings.

Executed the migration of ETL workflows, mappings, and sessions to the QA and Production environments.

Demonstrated a strong grasp of source-to-target data mapping and the associated business rules in ETL processes.

Environment: Informatica 10.0.2, PostgreSQL, Perl scripting, BDM, UNIX, flat files (delimited), Data Privacy Management, COBOL files, Teradata 12.0/13.0/14.0

Client: Zoetis, Parsippany, New Jersey Aug 2019 – Dec 2019

Role: Informatica Developer

Description: Zoetis is a global animal health company dedicated to supporting customers and their businesses in ever better ways. Building on more than 65 years of experience, we deliver quality medicines, vaccines, and diagnostic products, complemented by genetic tests, biodevices and a range of services. We are working every day to better understand and address the real-world challenges faced by those who raise and care for animals in ways they find truly relevant.

Responsibilities:

Provided strategic leadership for planning, developing, and implementing projects supporting the Enterprise Data Governance Office and Data Quality Team.

Designed and developed scripts and jobs in EDQ to perform data profiling on various data fields within source schema tables, generating statistics for different business scenarios and reporting findings to the Business Team.

Created scripts and jobs in EDQ to conduct data profiling on data fields from source schema tables, including reference tables, for migration from source to target systems.

Developed reusable Data Quality mappings, mapplets, and DQ rules, and conducted data profiling using the Informatica Developer and Informatica Analyst tools. Proficiently utilized Informatica EIC and Axon Data Governance, in addition to collaborating with Informatica Metadata Manager for impact and lineage analysis.

Utilized the split domain functionality for BDM and EDC, leveraging the Blaze engine on the Cluster.

Extensively employed Informatica Data Quality (IDQ) to profile project source data, validate metadata definitions, cleanse data, and identify duplicate or redundant records, offering guidance for backend ETL processes.

Loaded data into Teradata tables using Teradata utilities such as BTEQ, FastLoad, MultiLoad, FastExport, and TPT.
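
As an illustration, a BTEQ load step typically wraps SQL in a small script with return-code checks (the logon string and table names below are hypothetical):

    .LOGON tdprod/etl_user,etl_password;
    /* Move rows from the staging table into the target table */
    INSERT INTO edw.customer_dim
    SELECT * FROM stg.customer_ld;
    .IF ERRORCODE <> 0 THEN .QUIT 8;
    .LOGOFF;
    .QUIT 0;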

Partnered with data stewards to deliver summary results of data quality analysis, aiding in the establishment of data quality metrics and business rule measurements.

Implemented data quality processes, including translation, parsing, analysis, standardization, and enrichment, supporting both point of entry and batch modes. Deployed mappings for scheduled, batch, or real-time data quality checks.

Collaborated with business and technical teams to gather data quality rule requirements, proposing optimizations when applicable, and designing and developing these rules using IDQ.

Designed and developed custom objects, rules, reference data tables, and created/imported/exported mappings as needed.

Conducted thorough data profiling based on business requirements, utilizing multiple usage patterns, root cause analysis, and data cleaning. Developed scorecards using Informatica Data Quality (IDQ).

Created 'matching' plans, selected matching algorithms, configured identity matching, and performed duplicate analysis.

Designed and built human task workflows for data stewardship and exception processing.

Developed Key Performance Indicators (KPIs) and Key Risk Indicators (KRIs) for the data quality function.

Drove improvements to maximize the value of data quality, including changes to access to required metadata, to enhance the impact and cost-effectiveness of data quality efforts.

Conducted data quality activities, including rule creation, edit checks, issue identification, root cause analysis, value case analysis, and remediation.

Ensured that data quality milestones were achievable and measurable, effectively communicating them for successful delivery.

Prepared detailed documentation for all mappings, mapplets, and rules, delivering comprehensive documentation to the customer.

Environment: Informatica Developer 10.1.1 HotFix 1, Informatica MDM, Informatica EDC 10.2, Informatica Axon 6.1/6.2, Teradata 14.0, Oracle 11g, PL/SQL, Flat Files (XML/XSD, CSV, Excel), UNIX/Linux, Shell Scripting, Rational Rose/Jazz, SharePoint

Client: Blue Cross Blue Shield, Richardson, TX May 2019 – Aug 2019

Role: Informatica Developer

Description: Blue Cross Blue Shield Association is a federation of 36 separate United States health insurance organizations and companies, providing health insurance in the United States to more than 106 million people.

Responsibilities:

Collaborated closely with Business Analysts to facilitate requirements gathering, conduct comprehensive business analysis, and contribute to the design of the Enterprise Data Warehouse.

Played a pivotal role in requirement analysis, ETL design, and development, focusing on the extraction of data from diverse source systems such as DB2, MS SQL Server, Oracle, flat files, Mainframes, and loading it into Staging and Star Schema structures.

Assumed responsibility for creating and modifying Business requirement documentation.

Developed ETL Specifications using GAP Analysis techniques.

Participated in Production Support, actively contributing on a monthly rotation basis.

Defined key components, including Base objects, Staging tables, foreign key relationships, static lookups, dynamic lookups, queries, packages, and query groups.

Constructed Informatica mappings, sessions, and workflows, and engaged in end-to-end projects involving Informatica EIC/EDC and Axon.

Generated Unit Test cases and provided support for System Integration Testing (SIT) and User Acceptance Testing (UAT).

Conducted extensive data profiling using IDQ (Informatica Data Quality) Analyst Tool before data staging.

Utilized IDQ's standardized plans for cleaning up addresses and names.

Applied data quality rules and performed data profiling on source and target tables using IDQ.

Leveraged Informatica PowerExchange and Informatica Cloud to seamlessly integrate and load data into Oracle databases.

Worked with Informatica Cloud to create source and target objects, developing source-to-target mappings.

Demonstrated expertise in Dimensional Data Modeling, employing Star and Snowflake Schema techniques, and utilized Erwin for designing data models.

Engaged in Dimensional modeling, specifically Star Schema design, for the Data Warehouse, and used Erwin to define business processes, dimensions, and measured facts.

Employed the Teradata database for writing BTEQ queries and loading utilities, making use of MultiLoad, FastLoad, and FastExport.

Took the initiative to create CISM tickets with HP to address and resolve DB2 issues.

Implemented enhancements in the MORIS (Trucks) data mart load process.

Logged defects in ALM (Application Lifecycle Management) software and assigned them to developers.

Conducted fine-tuning of complex SQL queries by analyzing Explain Plans to optimize performance.

Environment: Informatica Power Center 9.6.1, Axon, Informatica Metadata Manager, DB2 9.8, Rapid SQL 8.2, IBM Data Studio, Erwin, PL/SQL, Red Hat Linux, Power Exchange 9.5.1, Copybook, Tivoli, ALM, Shell Scripting.

Client: 3i Infotech, Bangalore, India Jul 2015 – Jul 2018

Role: Informatica Cloud Developer

Description: Citi works tirelessly to provide consumers, corporations, governments, and institutions with a broad range of financial services and products. We strive to create the best outcomes for our clients and customers with financial ingenuity that leads to solutions that are simple, creative, and responsible

Responsibilities:

Offered recommendations and expert insights into data integration and ETL methodologies.

Assumed responsibility for analyzing UltiPro Interfaces and crafting design specifications and technical specifications based on functional requirements and analysis.

Oversaw the end-to-end process of designing, developing, testing, and implementing ETL processes using Informatica Cloud.

Designed and executed ETL and Master Data Management (MDM) processes utilizing Informatica Power Center and Power Exchange 9.1.

Translated extraction logic across various database technologies, such as Oracle, SQL Server, DB2, and AWS-hosted sources.

Produced comprehensive technical documentation to ensure thorough system documentation.

Configured and installed Informatica MDM Hub server, cleanse, resource kit, and Address Doctor components.

Managed Master Data Management (MDM) Hub Console configurations, including Stage Process configuration and Load Process Configuration, as well as Hierarchy Manager Configuration.

Conducted data cleansing procedures prior to loading data into the target system.

Transformed specifications into programs and data mappings within the Informatica Cloud ETL environment.

Delivered both High-Level and Low-Level Architecture Solutions/Designs as required.

Analyzed complex T-SQL Queries and Stored Procedures to identify elements suitable for conversion to Informatica Cloud AWS.

Developed reusable integration components, transformation processes, and logging mechanisms.

Provided support for system and user acceptance testing activities, actively resolving any issues that arose during testing.

Environment: Informatica Cloud, T-SQL, stored procedures, AWS, FileZilla, Windows, batch scripting, MDM.

Client: Atom Technologies, Mumbai India Jan 2015 – May 2015

Role: ETL Developer

Responsibilities:

Engaged in System Analysis and Design, actively participating in user requirements gathering. Employed data mining techniques to analyze data from various perspectives and condense it into valuable information.

Took part in requirement analysis, ETL design, and development tasks, specializing in the extraction of data from diverse source systems such as MS SQL Server, Oracle, and flat files, and loading it into both Staging and Star Schema structures.

Utilized SQL Assistant for querying Teradata tables.

Demonstrated familiarity with various phases of the System Development Life Cycle (SDLC) and its methodologies, including Waterfall, Spiral, and Agile.

Leveraged Teradata Data Mover for copying data and objects, including tables and statistics, between different systems.

Contributed to the analysis and construction of Teradata Enterprise Data Warehouses (EDW) using Teradata ETL utilities and Informatica.

Implemented Change Data Capture (CDC) to handle Slowly Changing Dimensions (SCD).

Employed Change Data Capture (CDC) with checksum comparison (CHKSUM) to detect changed rows in cases where no flag or date columns were available to identify modified records.
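
A simplified sketch of that checksum comparison in SQL Server syntax (tables and columns are hypothetical; CHECKSUM can collide, so BINARY_CHECKSUM or HASHBYTES is often preferred when stricter change detection matters):

    -- Detect changed rows by comparing checksums when no flag/date column exists
    SELECT s.cust_id
    FROM   stg_customer s
    JOIN   dim_customer d ON d.cust_id = s.cust_id
    WHERE  CHECKSUM(s.name, s.address, s.phone)
        <> CHECKSUM(d.name, d.address, d.phone);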

Used reusable tie-out code to maintain data consistency: after the ETL load completed, source and target data were compared to validate that no data was lost during the ETL process.
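
For instance, a tie-out query of this kind might compare row counts and control totals on both sides (table and column names are hypothetical):

    -- Tie-out: row counts and amount totals should match between source and target
    SELECT 'SOURCE' AS side, COUNT(*) AS row_cnt, SUM(amount) AS total_amt
    FROM   stg_orders
    UNION ALL
    SELECT 'TARGET', COUNT(*), SUM(amount)
    FROM   dw_orders;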

Conducted data extraction, transformation, and loading (ETL) activities using BTEQ, FastLoad, and MultiLoad.

Efficiently harnessed Teradata's FastLoad and MultiLoad utilities for loading data into tables.

Environment: Informatica Power Center 9.5 (Repository Manager, Mapping Designer, Workflow Manager, and Workflow Monitor)

Client: Wisdom Technologies Pvt Ltd. Hyderabad, India Jul 2014- Nov 2014

Role: Programmer Analyst

Responsibilities:

Played a pivotal role in gathering and analyzing requirements for data marts, with a primary focus on data analysis, data quality, and mapping data between staging tables and data warehouses/data marts.

Conceptualized and developed processes to address data quality concerns and detect and resolve errors effectively.

Crafted SQL scripts to validate and rectify inconsistent data in the staging database before proceeding with data loading into databases.
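
A minimal sketch of such a validation-and-rectification pass (tables, columns, and rules are hypothetical):

    -- Flag staging rows that would violate basic load rules
    SELECT stg_id, cust_id, order_date
    FROM   stg_orders
    WHERE  cust_id IS NULL
       OR  order_date > CURRENT_DATE;

    -- Rectify a common inconsistency before the warehouse load
    UPDATE stg_orders
       SET cust_name = TRIM(cust_name)
     WHERE cust_name <> TRIM(cust_name);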

Conducted detailed analysis of session logs, bad files, and error tables to troubleshoot mappings and sessions.

Created shell script utilities to identify error conditions during production loads and take necessary corrective actions. Additionally, developed scripts for repository and log file backup/restore operations.

Proficiently scheduled workflows using AutoSys job plans, ensuring efficient task execution.

Provided invaluable production support, actively participating in root cause analysis, bug fixing, and timely communication with business users regarding day-to-day production issues.

Environment: Informatica Power Center 8.6.1/7.x, Oracle 10g, AutoSys, Erwin 4.5, CMS, TOAD 9.0, SQL, PL/SQL, UNIX, SQL*Loader, MS SQL Server 2008.


