
Informatica Developer ETL

Location:
Austin, TX
Salary:
$65/hr
Posted:
August 18, 2021


Resume:

Aarthika K

Sr ETL / Informatica Developer

Email: adn9a9@r.postjobfree.com Contact: 469-***-****

Professional Experience:

8+ years of experience in Information Technology with a strong background in database development, data warehousing (OLAP) and ETL processes using Informatica PowerCenter 9.x/8.x/7.x/6.x, PowerExchange, Designer (Source Analyzer, Warehouse Designer, Mapping Designer, Metadata Manager, Mapplet Designer, Transformation Developer), Repository Manager, Repository Server, Workflow Manager and Workflow Monitor.

Experience in integrating various data sources with multiple relational databases like DB2, Oracle and SQL Server, and in integrating data from XML files and flat files, both fixed-width and delimited.

Experience in writing stored procedures and functions.

Experience in SQL tuning using hints and materialized views.
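
For illustration, a minimal sketch of the materialized-view technique in Oracle syntax (the mv_daily_sales name and the sales table are hypothetical, not taken from a specific project):

    -- Precompute an aggregate so the optimizer can rewrite matching queries
    -- against it instead of scanning the detail table (hypothetical names).
    CREATE MATERIALIZED VIEW mv_daily_sales
    BUILD IMMEDIATE
    REFRESH COMPLETE ON DEMAND
    ENABLE QUERY REWRITE
    AS
    SELECT product_id,
           TRUNC(sale_date) AS sale_day,
           SUM(amount)      AS total_amount
    FROM   sales
    GROUP  BY product_id, TRUNC(sale_date);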

Integrated Axon with Informatica EIC/EDC; achieved data governance with Informatica Axon and established lineage for DQ and impact assessment via Axon.

Led the team in driving the development of IFRS business rules in EDQ.

Worked on PowerExchange bulk data movement using the PowerExchange Change Data Capture (CDC) method.

Tuned complex SQL queries by examining their explain plans.
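
A typical shape of this kind of tuning session, sketched in Oracle syntax with hypothetical table and index names:

    -- Capture the optimizer's plan for a slow query, then test an index hint.
    EXPLAIN PLAN FOR
    SELECT /*+ INDEX(o idx_orders_cust) */
           o.order_id, c.cust_name
    FROM   orders o
    JOIN   customers c ON c.cust_id = o.cust_id
    WHERE  o.order_date >= DATE '2020-01-01';

    -- Review the plan (look for full scans, join order, estimated rows).
    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);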

Knowledge of Master Data Management (MDM) concepts and methodologies, with the ability to apply this knowledge in building MDM solutions.

Experience in UNIX shell programming.

Configured relationship types using the Hierarchies tool to enable Hierarchy Manager (HM) in MDM Hub implementations.

Expertise in the Teradata RDBMS using the FastLoad, MultiLoad, TPump, FastExport, Teradata SQL Assistant and BTEQ utilities.
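
A minimal BTEQ sketch of the kind of scripted Teradata work this involves (the logon placeholders and staging table are hypothetical):

    .LOGON tdpid/username,password
    -- Run a SQL check and abort the script with a non-zero code on failure.
    SELECT COUNT(*) FROM staging_db.customer_stg;
    .IF ERRORCODE <> 0 THEN .QUIT 8
    .LOGOFF
    .QUIT 0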

Worked with Informatica Cloud to create source and target objects and developed source-to-target mappings.

Expertise in dimensional data modeling using star and snowflake schemas; designed data models using Erwin.

Experience in creating and managing user accounts, security, rights, disk space and process monitoring in Solaris and Red Hat Linux.

Created customized BDM mappings/workflows for incremental loads using Informatica Developer and deployed them as part of an application on the Data Integration Service for native execution or pushdown using the Blaze or Spark execution engines.

Assisted developers in debugging, since BDM logs are scattered between the native servers and YARN; Blaze and Spark write logs to different configured directories, which is a prerequisite for Informatica Big Data products.

Extensive experience with T-SQL in constructing triggers, tables, stored procedures, functions, views, user profiles, data dictionaries and data integrity constraints.

Excellent T-SQL development skills for writing complex queries involving multiple tables, and strong ability to develop and maintain stored procedures, triggers and user-defined functions.

Extensive knowledge of handling slowly changing dimensions (SCD) types 1, 2 and 3.
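
A common SCD Type 2 pattern, sketched in T-SQL with hypothetical dimension and staging tables: expire the changed current row, then insert the new version.

    -- Step 1: close out current rows whose tracked attributes changed.
    UPDATE d
    SET    d.row_end_dt = GETDATE(),
           d.is_current = 0
    FROM   dbo.dim_customer AS d
    JOIN   dbo.stg_customer AS s
           ON s.customer_id = d.customer_id
    WHERE  d.is_current = 1
      AND  (s.address <> d.address OR s.segment <> d.segment);

    -- Step 2: insert a new current version for new and changed customers.
    INSERT INTO dbo.dim_customer
           (customer_id, address, segment, row_start_dt, row_end_dt, is_current)
    SELECT s.customer_id, s.address, s.segment, GETDATE(), NULL, 1
    FROM   dbo.stg_customer AS s
    LEFT   JOIN dbo.dim_customer AS d
           ON d.customer_id = s.customer_id AND d.is_current = 1
    WHERE  d.customer_id IS NULL;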

Experience in using Informatica to populate data into a Teradata DWH.

Experience in understanding fact tables, dimension tables and summary tables.

Experience with AWS cloud services like EC2, S3, RDS, ELB, EBS, VPC, Route 53, Auto Scaling groups, CloudWatch, CloudFront and IAM for installing, configuring and troubleshooting various Amazon images for server migration from physical machines into the cloud.

Good knowledge of Informatica B2B and BDM.

Designed architectural diagrams for different applications before migrating them into the Amazon cloud to make them flexible, cost-effective, reliable, scalable, high-performing and secure.

Good knowledge of Azure Data Factory and SQL Server 2014 Management Studio.

Migrated existing development from on-premises servers into Microsoft Azure, a cloud-based service.

Built servers using AWS: importing volumes, launching EC2 instances, creating security groups, auto scaling, load balancers, Route 53, SES and SNS in the defined virtual private cloud.

Established and administered a process for receiving, documenting, tracking, investigating and acting on complaints concerning the organization's privacy policies and procedures, in coordination and collaboration with other similar functions and, when necessary, legal counsel.

Created alarms in the CloudWatch service to monitor server performance, CPU utilization, disk usage, etc.

Wrote ad hoc data normalization jobs for new data ingested into Redshift.
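
The shape of such an ad hoc normalization job, sketched as Redshift SQL with hypothetical table names: pull the distinct repeated attributes out of a wide ingest table into a lookup table.

    -- Add vendors seen in the new ingest that are not yet in the lookup table.
    CREATE TABLE IF NOT EXISTS dim_vendor (
        vendor_id   BIGINT IDENTITY(1,1),
        vendor_name VARCHAR(256) NOT NULL
    );

    INSERT INTO dim_vendor (vendor_name)
    SELECT DISTINCT r.vendor_name
    FROM   raw_ingest r
    LEFT   JOIN dim_vendor v ON v.vendor_name = r.vendor_name
    WHERE  v.vendor_name IS NULL;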

Familiar with advanced Amazon Redshift and MPP database concepts.

Moved the company from a SQL Server database structure to an AWS Redshift data warehouse, and was responsible for ETL and data validation using SQL Server Integration Services.

Technical Skills:

ETL: Informatica 9.6.1/9.x/8.x/7.x, DataStage 7.5/6.0, SSIS

Data Governance: Informatica EIC/EDC 10.2 with Axon, AnalytixDS

Cloud Services: Amazon Web Services

Databases: Oracle 12c/11g/10g/9i/8i/7.x, DB2, Teradata V13/V12/V2R5, SQL Server

Reporting Tools: Business Objects XI 3.1/R2, Web Intelligence, Crystal Reports 8.x/10.x

Database Modeling: Erwin 4.0, Rational Rose

Languages: SQL, COBOL, PL/SQL, Java, C

Web Tools: HTML, XML, JavaScript, Servlets, EJB

OS: Windows NT/2000/2003/7, UNIX, Linux, AIX

Education: Bachelor’s in Electronics & Communication Engineering.

Client: USAA, San Antonio, TX June 2020 – Present

Role: Sr ETL/ Informatica Developer

Responsibilities:

Extensively worked on performance tuning of Teradata SQL, ETL and other processes to optimize performance.

Designed and developed complex mappings using Lookup, Expression, Update Strategy, Sequence Generator, Aggregator, Router, Stored Procedure and other transformations to implement complex logic.

Worked on requirement analysis and on ETL design and development for extracting data from mainframes.

Worked with the Informatica PowerCenter 10.0.2 Designer, Workflow Manager, Workflow Monitor and Repository Manager.

Configured PowerCenter to load data to Hive directly, without Informatica BDM, for less resource-intensive jobs.

Used the PC reuse utility to assess BDM compatibility and reuse some of the PowerCenter ETL code in Informatica Developer.

Converted PowerCenter code to BDM mappings using Informatica Developer.

Reviewed the transformations supported by BDM for cluster execution before writing ETL low-level design documents for the developers.

Developed and maintained ETL (extract, transform, load) mappings to extract data from multiple source systems such as Oracle, SQL Server and flat files, and loaded it into Oracle.

Used advanced T-SQL features to design and tune code that interfaces with the database and other applications efficiently, and created stored procedures for business logic in T-SQL.
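
A minimal sketch of such a business-logic stored procedure in T-SQL (the procedure, table and column names are hypothetical):

    CREATE PROCEDURE dbo.usp_GetOpenClaims
        @MemberId INT,
        @AsOfDate DATE
    AS
    BEGIN
        SET NOCOUNT ON;
        -- Parameterized lookup keeps the business rule in one tunable place.
        SELECT c.claim_id, c.claim_amount, c.claim_status
        FROM   dbo.claims AS c
        WHERE  c.member_id  = @MemberId
          AND  c.claim_date <= @AsOfDate
          AND  c.claim_status = 'OPEN';
    END;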

Designed, developed and tested stored procedures, views and complex queries for report distribution.

Developed database triggers and stored procedures using T-SQL cursors and tables.

Developed Informatica workflows and sessions associated with the mappings using workflow manager.

Used PowerExchange to read source data from mainframe systems, PowerCenter for ETL, and DB2 as the target.

Worked on and created extract files for Business Objects.

Extensively used Perl scripts to edit XML files and calculate line counts according to the client's needs.

Defined and built best practices for creating business rules within the Informatica MDM solution.

Built out best practices for data staging, data cleansing and data transformation routines within the Informatica MDM solution.

Implemented the Informatica MDM workflow, including data profiling, configuration specification, coding, match-rule tuning and migration.

Continuously measured data privacy compliance with multifactor risk scoring and monitoring of data access and movement using Data Privacy Management.

Worked with ETL developers in creating external batches to execute mappings and mapplets, using Informatica Workflow Designer to integrate data from varied sources like Oracle, DB2, flat files and SQL databases and load it into the landing tables of the Informatica MDM Hub.

Cleansed, standardized and enriched customer information using Informatica MDM.

Configured match rule set properties by enabling search by rules in MDM according to business rules.

Excellent working knowledge of Informatica EIC/EDC, IDQ, IDE, the Informatica Analyst tool, Informatica Metadata Manager and Informatica administration/server administration.

Involved in creating new table structures and modifying existing tables to fit the existing data model.

Responsible for normalizing COBOL files using the Normalizer transformation.

Responsible for testing, modifying, debugging, documenting and implementing Informatica mappings.

Debugged issues through session logs and fixed them, utilizing the database for efficient transformation of data.

Worked with pre-session and post-session UNIX scripts for automation of ETL jobs. Also involved in migration/conversion of ETL processes from development to production.

Migrated the ETL workflows, mappings and sessions to the QA and production environments.

Good understanding of source-to-target data mapping and the business rules associated with the ETL process.

Environment: Informatica 10.0.2, Informatica MDM, PostgreSQL, Perl scripting, BDM, UNIX, flat files (delimited), Data Privacy Management, COBOL files, Teradata 12.0/13.0/14.0

Client: Blue Cross Blue Shield, Richardson, TX Aug 2018 – May 2020

Role: Informatica Developer

Responsibilities:

Collaborated with Business Analysts for requirements gathering, business analysis and designing of the Enterprise Data warehouse.

Worked on requirement analysis and on ETL design and development for extracting data from heterogeneous source systems like DB2, MS SQL Server, Oracle, flat files and mainframes, and loading it into staging and star schemas.

Created/Modified Business requirement documentation.

Created ETL specifications using gap analysis.

Provided production support on a monthly rotation basis.

Worked with Master Data Management concepts and methodologies and applied this knowledge in building MDM solutions.

Worked on data modelling, baselined the data model to accommodate all business-specific requirements, and configured the Informatica MDM Hub Server, Lookup Adapter, Cleanse Server, Cleanse Adapter and Resource Kit.

Experience in match and merge analysis, trust validations and metadata management.

Defined the Base objects, Staging tables, foreign key relationships, static lookups, dynamic lookups, queries, packages, and query groups.

Created Mappings to get the data loaded into the Staging tables during the Stage Process.

Coordinated with Business Leads to understand Match & Merge and incorporated their requirements and ideas.

Used Informatica Data Quality (IDQ) to profile the data and apply rules to the Membership and Provider subject areas in support of Master Data Management (MDM).

Published data using message queues to notify external applications of data changes in MDM Hub base objects.

Created Informatica mappings, sessions and workflows; worked end to end on the project with Informatica EIC/EDC and Axon.

Worked on CISM tickets for production issues in the Trucks and Cars data marts.

Extracted mainframe data for local landscapes like credit check, application, etc.

Created Unit Test cases, supported SIT and UAT.

Involved in massive data profiling using IDQ (Analyst Tool) prior to data staging.

Used IDQ's standardized plans for address and name cleanups.

Applied the rules and profiled the source and target table's data using IDQ.

Worked with Informatica PowerExchange and Informatica Cloud to integrate and load data into an Oracle database.

Worked with Informatica Cloud to create source and target objects and developed source-to-target mappings.

Involved in dimensional modeling (star schema) of the data warehouse and used Erwin to design the business processes, dimensions and measured facts.

Worked with the Teradata database, writing BTEQ queries and loading utilities using MultiLoad, FastLoad and FastExport.

Created CISM ticket with HP to fix DB2 issues.

Modified MORIS (Trucks) data mart load with enhancements.

Created defects in ALM software and assigned them to developers.

Fine-tuned complex SQL by checking the explain plans of the queries.

Environment: Informatica PowerCenter 9.6.1, Axon, Informatica Metadata Manager, DB2 9.8, Rapid SQL 8.2, IBM Data Studio, Erwin, PL/SQL, Red Hat Linux, PowerExchange 9.5.1, copybooks, Tivoli, ALM, shell scripting.

Client: Navient – OpSys, CTS India Jun 2015 – Dec 2017

Responsibilities:

Provide recommendations and expertise of data integration and ETL methodologies.

Responsible for analyzing UltiPro interfaces and creating design specifications and technical specifications based on functional requirements and analysis.

Design, development, testing and implementation of ETL processes using Informatica Cloud.

Developed ETL / Master Data Management (MDM) processes using Informatica PowerCenter and PowerExchange 9.1.

Converted extraction logic from database technologies like Oracle, SQL Server, DB2 and AWS.

Completed technical documentation to ensure the system is fully documented.

Configured and installed the Informatica MDM Hub Server, Cleanse Server, Resource Kit and Address Doctor.

Involved with Master Data Management (MDM) for Customer Data Integration using Siperian tool.

Handled Master Data Management (MDM) Hub Console configurations like Stage Process configuration, Load Process configuration and Hierarchy Manager configuration.

Performed data cleansing prior to loading data into the target system.

Converted specifications to programs and data mappings in an Informatica Cloud ETL environment.

Provided high-level and low-level architecture solutions/designs as needed.

Dug deep into complex T-SQL queries and stored procedures to identify items that could be converted to Informatica Cloud on AWS.

Developed reusable integration components, transformations and logging processes.

Support system and user acceptance testing activities, including issue resolution.

Environment: Informatica Cloud, T-SQL, stored procedures, AWS, FileZilla, Windows, bat scripting, MDM.

Client: Mattel OneStore, CTS India Nov 2014 – May 2015

Role: ETL Developer

Responsibilities:

Involved in system analysis and design, and participated in user requirements gathering. Worked on data mining to analyze data from different perspectives and summarize it into useful information.

Involved in requirement analysis, ETL design and development for extracting data from heterogeneous source systems like MS SQL Server, Oracle and flat files and loading it into staging and star schemas.

Used SQL Assistant to query Teradata tables.

Experience in various stages of the System Development Life Cycle (SDLC) and its approaches, such as Waterfall, Spiral and Agile.

Used Teradata Data Mover to copy data and objects, such as tables and statistics, from one system to another.

Involved in analyzing and building the Teradata EDW using Teradata ETL utilities and Informatica.

Worked on CDC (Change Data Capture) to implement SCD (slowly changing dimensions).

Worked on Change Data Capture (CDC) using CHKSUM to handle changes in the data when no flag or date column is present to identify changed rows.
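
The idea behind this checksum-based CDC, sketched as SQL with hypothetical staging and target tables; Teradata's HASHROW stands in as the checksum function here, and the exact function depends on the platform:

    -- A row is 'changed' when the key matches but the attribute checksum differs.
    SELECT s.customer_id
    FROM   stg_customer s
    JOIN   tgt_customer t
           ON t.customer_id = s.customer_id
    WHERE  HASHROW(s.cust_name, s.address, s.phone)
        <> HASHROW(t.cust_name, t.address, t.phone);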

Worked on reusable code known as tie-outs to maintain data consistency: after the ETL load completes, the tie-out compares source and target to validate that no data was lost during the ETL process.
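
A minimal tie-out sketch in SQL (hypothetical tables); the two result rows should agree after a clean load.

    -- Compare row counts and amount totals on both sides of the load.
    SELECT 'SOURCE' AS side, COUNT(*) AS row_cnt, SUM(order_amount) AS total_amt
    FROM   src_orders
    UNION ALL
    SELECT 'TARGET', COUNT(*), SUM(order_amount)
    FROM   tgt_orders;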

Worked on data extraction, transformation and loading using BTEQ, FastLoad and MultiLoad.

Used the Teradata FastLoad/MultiLoad utilities to load data into tables.

Environment: Informatica PowerCenter 9.5 (Repository Manager, Mapping Designer, Workflow Manager and Workflow Monitor)

Client: iGATE Global Solutions, Hyderabad, India Mar 2012 – Oct 2014

Role: Programmer Analyst

Responsibilities:

Involved in requirement gathering and analysis for the data marts, focusing on data analysis, data quality, and data mapping between staging tables and data warehouses/data marts.

Designed and developed processes to support data quality issues and the detection and resolution of error conditions.

Wrote SQL scripts to validate and correct inconsistent data in the staging database before loading it into the target databases.
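
Representative of the kind of validation and correction script this describes (table, column and default values are hypothetical):

    -- Default missing statuses so downstream lookups do not reject the rows.
    UPDATE stg_orders
    SET    order_status = 'UNKNOWN'
    WHERE  order_status IS NULL OR TRIM(order_status) = '';

    -- Surface rows that need manual review before the warehouse load.
    SELECT order_id, order_amount
    FROM   stg_orders
    WHERE  order_amount < 0;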

Analyzed the session logs, bad files and error tables for troubleshooting mappings and sessions.

Wrote shell script utilities to detect error conditions in production loads and take corrective actions, wrote scripts to backup/restore repositories, backup/restore log files.

Scheduled workflows using Autosys job plan.

Provided production support and involved with root cause analysis, bug fixing and promptly updating the business users on day-to-day production issues.

Environment: Informatica Power Center 8.6.1/7.x, Oracle 10g, Autosys, Erwin 4.5, CMS, TOAD 9.0, SQL, PL/SQL, UNIX, SQL Loader, MS SQL Server 2008.


