
ETL Informatica Developer

Location: Piscataway, NJ
Salary: 60
Posted: February 23, 2021


Resume:

YUGESH RAVI

Sr ETL / Informatica Developer

Email: adkfms@r.postjobfree.com Contact: 609-***-****

Professional Experience:

Around 9 years of experience in Information Technology with a strong background in database development, data warehousing (OLAP), and ETL processes using Informatica Power Center 9.x/8.x/7.x/6.x, Power Exchange, Designer (Source Analyzer, Warehouse Designer, Mapping Designer, Metadata Manager, Mapplet Designer, Transformation Developer), Repository Manager, Repository Server, Workflow Manager, and Workflow Monitor.

Experience in integrating various data sources with multiple relational databases such as DB2, Oracle, and SQL Server, and in integrating data from XML files and flat files (fixed-width and delimited).

Experience in writing stored procedures and functions.

Experience in SQL tuning using hints and materialized views.
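
For illustration, a minimal Oracle-style sketch of this kind of tuning; the orders table, index name, and materialized view are hypothetical:

    -- Hypothetical: steer the optimizer to an index with a hint,
    -- then inspect the plan it actually chose.
    EXPLAIN PLAN FOR
    SELECT /*+ INDEX(o orders_cust_idx) */ o.order_id, o.order_total
    FROM   orders o
    WHERE  o.customer_id = :cust_id;

    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

    -- Hypothetical materialized view that pre-aggregates order totals
    -- so repeated reporting queries avoid full-table aggregation.
    CREATE MATERIALIZED VIEW order_summary_mv
    BUILD IMMEDIATE
    REFRESH COMPLETE ON DEMAND
    ENABLE QUERY REWRITE
    AS
    SELECT customer_id, SUM(order_total) AS total_spend, COUNT(*) AS order_cnt
    FROM   orders
    GROUP  BY customer_id;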

Integrated AXON with Informatica EIC/EDC; achieved data governance with Informatica AXON and lineage for DQ and impact assessment via AXON.

Led the team in driving the development of IFRS business rules in EDQ.

Worked on the Power Exchange bulk data movement process using the Power Exchange Change Data Capture (CDC) method.

Tuned complex SQL queries by analyzing their explain plans.

Knowledge of Master Data Management (MDM) concepts and methodologies, with the ability to apply this knowledge in building MDM solutions.

Experience in UNIX shell programming.

Expertise in Teradata RDBMS using the FastLoad, MultiLoad, TPump, FastExport, Teradata SQL Assistant, and BTEQ utilities.

Worked with Informatica Cloud to create source and target objects and develop source-to-target mappings.

Expertise in dimensional data modeling using star and snowflake schemas. Designed data models using Erwin.

Experience in creating and managing user accounts, security, rights, disk space, and process monitoring in Solaris and Red Hat Linux.

Extensive knowledge of handling slowly changing dimensions (SCD) types 1, 2, and 3.
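
For illustration, a minimal SQL sketch (Oracle-style syntax) of an SCD Type 2 load; dim_customer, stg_customer, and the column names are hypothetical:

    -- Close out the current dimension row when a tracked attribute changes.
    UPDATE dim_customer d
    SET    d.is_current = 'N',
           d.effective_end_date = CURRENT_DATE
    WHERE  d.is_current = 'Y'
      AND  EXISTS (SELECT 1
                   FROM   stg_customer s
                   WHERE  s.customer_id = d.customer_id
                     AND  s.address <> d.address);

    -- Insert a fresh current row for new and changed customers.
    INSERT INTO dim_customer
        (customer_id, address, effective_start_date, effective_end_date, is_current)
    SELECT s.customer_id, s.address, CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM   stg_customer s
    LEFT   JOIN dim_customer d
           ON  d.customer_id = s.customer_id
           AND d.is_current = 'Y'
    WHERE  d.customer_id IS NULL;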

Experience in using Informatica to populate the data into Teradata DWH.

Experience in understanding fact tables, dimension tables, and summary tables.

Experience with AWS cloud services such as EC2, S3, RDS, ELB, EBS, VPC, Route 53, Auto Scaling groups, CloudWatch, CloudFront, and IAM for installing, configuring, and troubleshooting various Amazon images during server migration from physical infrastructure into the cloud.

Designed architectural diagrams for different applications before migrating them into the Amazon cloud, targeting flexible, cost-effective, reliable, scalable, high-performance, and secure deployments.

Good knowledge of Azure Data Factory and SQL Server 2014 Management Studio.

Migrated existing development from on-premises servers into Microsoft Azure, a cloud-based service.

Built servers using AWS: imported volumes, launched EC2 instances, and created security groups, Auto Scaling groups, load balancers, Route 53 records, SES, and SNS in the defined virtual private cloud (VPC).

Created alarms in the CloudWatch service for monitoring server performance, CPU utilization, disk usage, etc.

Wrote ad hoc data normalization jobs for new data ingested into Redshift.
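
For illustration, a minimal Redshift-style sketch of such an ad hoc normalization job; the raw_events landing table and the target tables are hypothetical:

    -- Hypothetical: split a denormalized landing table into a small
    -- lookup table plus a fact-style table with Redshift keys.
    CREATE TABLE IF NOT EXISTS dim_device (
        device_id   INT IDENTITY(1,1),
        device_name VARCHAR(128) NOT NULL
    )
    DISTSTYLE ALL;

    INSERT INTO dim_device (device_name)
    SELECT DISTINCT r.device_name
    FROM   raw_events r
    WHERE  r.device_name NOT IN (SELECT device_name FROM dim_device);

    CREATE TABLE IF NOT EXISTS fct_events (
        event_ts   TIMESTAMP,
        device_id  INT,
        metric_val DOUBLE PRECISION
    )
    DISTKEY (device_id)
    SORTKEY (event_ts);

    INSERT INTO fct_events
    SELECT r.event_ts, d.device_id, r.metric_val
    FROM   raw_events r
    JOIN   dim_device d ON d.device_name = r.device_name;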

Familiar with advanced Amazon Redshift and MPP database concepts.

Moved the company from a SQL Server database structure to an AWS Redshift data warehouse; responsible for ETL and data validation using SQL Server Integration Services.

Technical Skills:

ETL: Informatica 9.6.1/9.x/8.x/7.x, DataStage 7.5/6.0, SSIS

Data Governance: Informatica EIC/EDC 10.2 with AXON, AnalytixDS

Cloud Services: Amazon Web Services (AWS)

Databases: Oracle 12c/11g/10g/9i/8i/7.x, DB2, Teradata V13/V12/V2R5, SQL Server

Reporting Tools: Business Objects XI 3.1/R2, Web Intelligence, Crystal Reports 8.x/10.x

Database Modeling: Erwin 4.0, Rational Rose

Languages: SQL, COBOL, PL/SQL, Java, C

Web Tools: HTML, XML, JavaScript, Servlets, EJB

OS: Windows NT/2000/2003/7, UNIX, Linux, AIX

Education: Bachelor of Technology in Computer Science and Engineering.

Client: Verizon, Piscataway, New Jersey Nov 2019 – Present

Role: Sr ETL/ Informatica Developer

Description: The application is used for tracking and getting the latest status of order records. It provides a one-stop shop for Implementation Managers to track their customers' order progress from order submission to installation, supplies the latest data for pending orders, and allows Implementation Managers to take a proactive approach to tracking their orders and reviewing milestones. EzStatus provides direct access to critical areas of orders, e.g., order alerts; pending, activated, and cancelled work lists; customizable tracking; order grouping; project grouping; spreadsheet management; and customer letters.

Responsibilities:

Extensively worked on performance tuning of Teradata SQL, ETL, and other processes to optimize performance.
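
For illustration, a minimal Teradata-style sketch of this tuning workflow; the database, table, and column names are hypothetical:

    -- Inspect the optimizer's plan for a slow query.
    EXPLAIN
    SELECT o.order_id, c.cust_name
    FROM   ord_db.orders o
    JOIN   ord_db.customers c
           ON c.cust_id = o.cust_id
    WHERE  o.order_date >= DATE '2020-01-01';

    -- Give the optimizer accurate demographics on join and filter columns.
    COLLECT STATISTICS ON ord_db.orders COLUMN (cust_id);
    COLLECT STATISTICS ON ord_db.orders COLUMN (order_date);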

Designed and developed complex mappings using Lookup, Expression, Update Strategy, Sequence Generator, Aggregator, Router, Stored Procedure, and other transformations to implement complex logic while coding a mapping.

Worked in requirement analysis, ETL design, and development for extracting data from the mainframes.

Worked with Informatica Power Center 10.0.2 Designer, Workflow Manager, Workflow Monitor, and Repository Manager.

Developed and maintained ETL (extract, transform, and load) mappings to extract data from multiple source systems such as Oracle, SQL Server, and flat files and load it into Oracle.

Developed Informatica workflows and sessions associated with the mappings using workflow manager.

Used Power Exchange to read source data from mainframe systems, Power Center for ETL, and DB2 as the target.

Worked on and created files for Business Objects.

Excellent working knowledge of Informatica EIC/EDC, IDQ, IDE, the Informatica Analyst tool, Informatica Metadata Manager, and Informatica administration/server administration.

Involved in creating new table structures and modifying existing tables to fit the existing data model.

Responsible for normalizing COBOL files using the Normalizer transformation.

Responsible for testing, modifying, debugging, documenting, and implementing Informatica mappings.

Debugged through session logs and fixed issues, utilizing the database for efficient transformation of data.

Worked with pre-session and post-session UNIX scripts for automation of ETL jobs. Also involved in migration/conversion of ETL processes from development to production.

Extracted data from different databases such as Oracle and from external source systems such as flat files using the ETL tool.

Extracted data from sources such as SQL Server, AWS, and fixed-width and delimited flat files; transformed the data according to the business requirements and then loaded it into the target.

Involved in debugging Informatica mappings and testing stored procedures and functions.

Developed mapplets, reusable transformations, source and target definitions, mappings using Informatica 10.0.2.

Generated SQL queries to check the consistency of the data in the tables and to update the tables per the business requirements.

Involved in performance tuning of mappings in Informatica.

Migrated the ETL workflows, mappings, and sessions to the QA and production environments.

Good understanding of source to target data mapping and business rules associated with the ETL process.

Environment: Informatica 10.0.2, UNIX, flat files (delimited), COBOL files, Teradata 12.0/13.0/14.0

Client: Zoetis, Parsippany, New Jersey June 2019 – Oct 2019

Role: Informatica Developer

Description: Zoetis is a global animal health company dedicated to supporting customers and their businesses in ever better ways. Building on more than 65 years of experience, we deliver quality medicines, vaccines, and diagnostic products, complemented by genetic tests, biodevices and a range of services. We are working every day to better understand and address the real-world challenges faced by those who raise and care for animals in ways they find truly relevant.

Responsibilities:

Provided strategic direction for the planning, development, and implementation of various projects in support of enterprise data governance office and data quality team.

Designed and developed script jobs in EDQ to carry out profiling of data for different data fields on source schema tables for different business scenarios and report the statistics to the business team.

Designed and developed script jobs in EDQ to carry out profiling of data on different data fields from source schema tables, along with the concerned reference tables, for migration from source to target systems.

Developed reusable data quality mappings and mapplets, DQ rules, and data profiling using the Informatica Developer/Informatica Analyst tools. Expert-level experience with Informatica EIC and AXON data governance. Worked with Informatica Metadata Manager for impact and lineage analysis.

Owned the development of the 'Integrated Control Framework' (ICF) for data quality.

Extensively used Informatica Data Quality (IDQ) to profile the project source data, define or confirm the definition of the metadata, and cleanse and accurately check the data.

Ran checks for duplicate or redundant records and provided information on how to proceed with the backend ETL process.

Loaded data into Teradata tables using the Teradata utilities BTEQ, FastLoad, MultiLoad, FastExport, and TPT.
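
For illustration, a minimal BTEQ script sketch (SQL plus BTEQ session commands) of such a load step; the logon values, staging table, and file path are hypothetical placeholders:

    .LOGON tdpid/etl_user,password;

    /* Read a pipe-delimited file and insert each record into staging. */
    .IMPORT VARTEXT '|' FILE = /data/in/claims.txt;
    .REPEAT *
    USING (claim_id VARCHAR(18), claim_amt VARCHAR(18))
    INSERT INTO stg_db.stg_claims (claim_id, claim_amt)
    VALUES (:claim_id, CAST(:claim_amt AS DECIMAL(12,2)));

    .LOGOFF;
    .QUIT;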

Partnered with data stewards to provide summary results of data quality analysis, which were used to make decisions regarding how to measure business rules and the quality of the data.

Implemented data quality processes including translation, parsing, analysis, standardization, and enrichment at point of entry and in batch modes. Deployed mappings that run in scheduled, batch, or real-time environments.

Collaborated with various business and technical teams to gather requirements around data quality rules, proposed optimization of these rules where applicable, and designed and developed these rules with IDQ.

Worked on the design and development of custom objects and rules and reference data tables, and created/imported/exported mappings.

Per business requirements, performed thorough data profiling with multiple usage patterns, root cause analysis, and data cleansing, and developed scorecards utilizing Informatica Data Quality (IDQ).

Developed 'matching' plans, helped determine the best matching algorithm, configured identity matching, and analyzed duplicates.

Worked on building human task workflows to implement data stewardship and exception processing

Developed KPIs & KRIs for the data quality function

Drove improvements to maximize the value of data quality (e.g., driving changes to gain access to required metadata to maximize the impact of data quality and to quantify the cost of poor data quality).

Performed data quality activities, i.e., data quality rule creation, edit checks, identification of issues, root cause analysis, value case analysis, and remediation.

Augmented and executed the data quality plan to ensure achievable and measurable milestones were mapped out for delivery and effectively communicated.

Prepared detailed documents on all mappings, mapplets, and rules, and handed over the documentation to the customer.

Environment: Informatica Developer 10.1.1 HotFix1, Informatica EDC 10.2, Informatica Axon 6.1/6.2, Teradata 14.0, Oracle 11G, PL/SQL, Flat Files (XML/XSD, CSV, EXCEL), UNIX/LINUX, Shell Scripting, Rational Rose/Jazz, SharePoint

Client: Blue Cross Blue Shield, Richardson, TX Aug 2018 – May 2019

Role: Informatica Developer

Description: Blue Cross Blue Shield Association is a federation of 36 separate United States health insurance organizations and companies, providing health insurance in the United States to more than 106 million people.

Responsibilities:

Collaborated with Business Analysts for requirements gathering, business analysis and designing of the Enterprise Data warehouse.

Worked in requirement analysis, ETL design, and development for extracting data from heterogeneous source systems such as DB2, MS SQL Server, Oracle, flat files, and mainframes and loading it into staging and star schema tables.

Created/Modified Business requirement documentation.

Created ETL specifications using gap analysis.

Provided production support on a monthly rotation basis.

Worked with Master Data Management concepts and methodologies and applied this knowledge in building MDM solutions.

Used Informatica Data Quality (IDQ) to profile the data and apply rules to the Membership and Provider subject areas to support Master Data Management (MDM).

Created Informatica mappings, sessions, and workflows. Worked end to end on the project with Informatica EIC/EDC and AXON.

Worked on CISM tickets for Production issues for Trucks and Cars Data Marts.

Extracted Mainframe data for Local Landscapes like Credit Check, Application etc.

Created unit test cases and supported SIT and UAT.

Involved in massive data profiling using IDQ (Analyst Tool) prior to data staging.

Used IDQ's standardized plans for address and name cleanup.

Applied the rules and profiled the source and target tables' data using IDQ.

Worked with Informatica Power Exchange and Informatica Cloud to integrate and load data into an Oracle database.

Worked with Informatica Cloud to create source and target objects and develop source-to-target mappings.

Involved in dimensional modeling (star schema) of the data warehouse, using Erwin to design the business process, dimensions, and measured facts.

Worked with the Teradata database, writing BTEQ queries and loading data with the MultiLoad, FastLoad, and FastExport utilities.

Created CISM tickets with HP to fix DB2 issues.

Modified MORIS (Trucks) data mart load with enhancements.

Created Defects in ALM software and assigned to developers.

Fine-tuned complex SQL by checking the explain plans of the queries.

Environment: Informatica Power Center 9.6.1, Axon, Informatica Metadata Manager, DB2 9.8, Rapid SQL 8.2, IBM Data Studio, Erwin, PL/SQL, Red Hat Linux, Power Exchange 9.5.1, Copybook, Tivoli, ALM, Shell Scripting.

Client: Tata Consultancy Services, Bangalore, India Jun 2015 – Mar 2017

Role: Informatica Cloud Developer

Description: Citi works tirelessly to provide consumers, corporations, governments, and institutions with a broad range of financial services and products. We strive to create the best outcomes for our clients and customers with financial ingenuity that leads to solutions that are simple, creative, and responsible.

Responsibilities:

Provided recommendations and expertise on data integration and ETL methodologies.

Responsible for analyzing UltiPro interfaces and creating design specifications and technical specifications based on functional requirements/analysis.

Designed, developed, tested, and implemented ETL processes using Informatica Cloud.

Developed ETL/Master Data Management (MDM) processes using Informatica Power Center and Power Exchange 9.1.

Converted extraction logic from database technologies such as Oracle, SQL Server, DB2, and AWS.

Completed technical documentation to ensure the system is fully documented.

Performed data cleansing prior to loading data into the target system.

Converted specifications to programs and data mappings in an Informatica Cloud ETL environment.

Provided high-level and low-level architecture solutions/designs as needed.

Dug deep into complex T-SQL queries and stored procedures to identify items that could be converted to Informatica Cloud on AWS.

Developed reusable integration components, transformations, and logging processes.

Supported system and user acceptance testing activities, including issue resolution.

Environment: Informatica Cloud, T-SQL, stored procedures, AWS, FileZilla, Windows, batch scripting, MDM.

Client: Atom Technologies, Mumbai, India Nov 2014 – May 2015

Role: ETL Developer

Responsibilities:

Involved in system analysis and design and participated in user requirements gathering. Worked on data mining to analyze data from different perspectives and summarize it into useful information.

Involved in requirement analysis, ETL design, and development for extracting data from heterogeneous source systems such as MS SQL Server, Oracle, and flat files and loading it into staging and star schema tables.

Used SQL Assistant to query Teradata tables.

Experience in various stages of the System Development Life Cycle (SDLC) and its approaches such as Waterfall, Spiral, and Agile.

Used Teradata Data Mover to copy data and objects such as tables and statistics from one system to another.

Involved in analyzing and building the Teradata EDW using Teradata ETL utilities and Informatica.

Worked on CDC (Change Data Capture) to implement SCD (Slowly Changing Dimensions).

Worked on Change Data Capture (CDC) using CHKSUM to handle changes in the data when no flag or date column is present to represent the changed row.
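
For illustration, a minimal SQL Server-style sketch of this checksum comparison; the staging and target tables and columns are hypothetical:

    -- Flag changed rows by comparing checksums of the tracked columns
    -- when the source has no update flag or last-modified date.
    SELECT s.cust_id, s.cust_name, s.address
    FROM   stg_customer s
    JOIN   tgt_customer t
           ON t.cust_id = s.cust_id
    WHERE  CHECKSUM(s.cust_name, s.address) <> CHECKSUM(t.cust_name, t.address);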

Worked on reusable code known as tie-outs to maintain data consistency; it compares source and target after ETL loading completes to validate that no data was lost during the ETL process.
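
For illustration, a minimal sketch of such a tie-out query; the source and target table names and the amount column are hypothetical:

    -- Reconcile row counts and amount totals after the load;
    -- any non-zero difference flags potential data loss.
    SELECT src.row_cnt   - tgt.row_cnt   AS row_cnt_diff,
           src.amt_total - tgt.amt_total AS amt_total_diff
    FROM  (SELECT COUNT(*) AS row_cnt, SUM(txn_amt) AS amt_total
           FROM   src_db.transactions) src
    CROSS JOIN
          (SELECT COUNT(*) AS row_cnt, SUM(txn_amt) AS amt_total
           FROM   tgt_db.transactions) tgt;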

Cleansed, standardized, and enriched customer information using Informatica MDM.

Worked on data extraction, transformation, and loading using the BTEQ, FastLoad, and MultiLoad utilities.

Environment: Informatica Power Center 9.5 (Repository Manager, Mapping Designer, Workflow Manager, and Workflow Monitor)

Client: Wisdom Technologies Pvt. Ltd., Hyderabad, India Mar 2012 – Oct 2014

Role: Programmer Analyst

Responsibilities:

Involved with requirement gathering and analysis for the data marts focusing on data analysis, data quality, data mapping between staging tables and data warehouses/data marts.

Designed and developed processes to support data quality, including detection and resolution of error conditions.

Wrote SQL scripts to validate and correct inconsistent data in the staging database before loading data into databases.

Analyzed the session logs, bad files and error tables for troubleshooting mappings and sessions.

Wrote shell script utilities to detect error conditions in production loads and take corrective actions; wrote scripts to back up and restore repositories and log files.

Scheduled workflows using an Autosys job plan.

Provided production support and involved with root cause analysis, bug fixing and promptly updating the business users on day-to-day production issues.

Environment: Informatica Power Center 8.6.1/7.x, Oracle 10g, Autosys, Erwin 4.5, CMS, TOAD 9.0, SQL, PL/SQL, UNIX, SQL Loader, MS SQL Server 2008.


