
ETL Developer

Location:
North Bergen, NJ
Salary:
$70/Hr
Posted:
February 13, 2024

Resume:

Sai Krishna

Sr. ETL / Informatica & IICS Consultant

Email: ad3lns@r.postjobfree.com

+1 (908) 308-6010

Professional Summary:

11+ years of experience in the IT industry across data integration and data warehousing, using ETL tools such as Informatica PowerCenter 10.5.2/9.6/9.1/8.6, Informatica PowerExchange 10.5.2/9.6, Informatica Intelligent Cloud Services (IICS), CI/CD pipelines, and Snowflake.

Experience integrating data between on-premises databases and cloud-based database solutions using Informatica Intelligent Cloud Services.

Worked extensively on ETL processes using Informatica PowerCenter 10.x/9.x/8.x/7.x, with basic knowledge of Informatica Data Quality (IDQ) 10.1/9.6.1 and Informatica PowerExchange.

Hands-on with IICS components (Data Integration, Application Integration, Monitor, and Administration); data profiling, analysis, cleansing, address validation, fuzzy matching/merging, data conversion, and exception handling in Informatica Data Quality (IDQ) 10.1; and extraction, transformation, and loading (ETL) of data from various sources into data warehouses using Informatica PowerCenter.

Extensively used ETL methodologies to support data extraction, transformation, and loading in a corporate-wide ETL solution built on Informatica PowerCenter.

Extensively used Informatica Designer, Workflow Manager, and Workflow Monitor to develop and operate data loads.

Experienced data governance specialist with strong proficiency in the Collibra Data Governance Center (CDGC) platform for ensuring data quality.

Experience working with cloud-based database solutions such as AWS Redshift.

Extensive experience in using various Informatica Designer Tools such as Source Analyzer, Transformation Developer, Mapping Designer, Mapplet Designer.

Experienced data quality analyst with a proven track record of leveraging Customer Data Quality (CDQ) methodologies and tools to enhance data accuracy, consistency, and reliability; skilled in conducting data quality assessments and implementing data cleansing strategies.

Extensive experience in Design, Development, Implementation, Production Support and Maintenance of Data Warehouse Business Applications in E-commerce software, Utility, Pharmaceutical, Health Care, Insurance, Financial and Manufacturing industries.

Experience in development and maintenance of Oracle 11g/10g/9i/8i using PL/SQL, SQL*Plus, TOAD, and SQL*Loader: SQL, stored procedures, functions, analytic functions, constraints, indexes, and triggers.
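
For instance, one analytic-function pattern of the kind referenced here deduplicates a history table to the latest row per key (all table and column names below are hypothetical, shown in Oracle syntax):

    -- Keep only the most recent row per customer using ROW_NUMBER().
    SELECT *
    FROM  (SELECT c.*,
                  ROW_NUMBER() OVER (PARTITION BY customer_id
                                     ORDER BY updated_at DESC) AS rn
           FROM   customer_history c) t
    WHERE  t.rn = 1;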

Excellent working knowledge of C shell scripting and job scheduling on multiple platforms; experienced with the UNIX command line and Linux.

Data integration and migration specialist with over 5 years of extensive experience in leading the conversion of multiple legacy data systems from different sources into Oracle databases.

Experience in ETL development process using Informatica for Data Warehousing, Data migration and Production support.

Experience in both Waterfall and Agile SDLC methodologies.

Evaluated Snowflake design considerations for any change in the application.

Experience building CI/CD pipelines for Informatica, involving scripting, integration with Informatica CLI tools, and custom automation scripts to export and deploy Informatica objects; the goal is to automate code integration, testing, and deployment of Informatica ETL assets, much as is done for traditional software applications.

Bulk loaded data from an external stage (AWS S3) into Snowflake using the COPY command.
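
A minimal sketch of that load path, assuming hypothetical stage, bucket, table, and credential names:

    -- Hypothetical external stage pointing at an S3 bucket.
    CREATE OR REPLACE STAGE landing_stage
      URL = 's3://example-bucket/landing/'
      CREDENTIALS = (AWS_KEY_ID = '<key>' AWS_SECRET_KEY = '<secret>');

    -- Bulk load the staged CSV files into a Snowflake table.
    COPY INTO staging.customer
      FROM @landing_stage/customer/
      FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
      ON_ERROR = 'CONTINUE';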

Sound knowledge of Relational and Dimensional modeling techniques of Data warehouse (EDS/Data marts) concepts and principles (Kimball/Inmon) – Star/Snowflake schema, SCD, Surrogate keys and Normalization/De-normalization.

Data modeling experience in creating Conceptual, Logical and Physical Data Models using Erwin Data Modeler.

Experience with TOAD, SQL Developer database tools to query, test, modify, analyze data, create indexes, and compare data from different schemas.

Performed data profiling and analysis making use of Informatica Data Quality (IDQ).

Worked on Slowly Changing Dimensions (SCDs) Types 1, 2, and 3 to keep track of historical data.
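
For illustration, a common SQL pattern for a Type 2 dimension expires the current row and inserts a new version (table and column names are hypothetical; surrogate-key generation is omitted):

    -- Expire the current dimension row when a tracked attribute changes.
    UPDATE dim_customer d
    SET    d.current_flag = 'N',
           d.effective_end_date = CURRENT_DATE
    WHERE  d.current_flag = 'Y'
    AND    EXISTS (SELECT 1
                   FROM   stg_customer s
                   WHERE  s.customer_id = d.customer_id
                   AND    s.address <> d.address);

    -- Insert the new version for changed (or brand-new) customers.
    INSERT INTO dim_customer
      (customer_id, address, effective_start_date, effective_end_date, current_flag)
    SELECT s.customer_id, s.address, CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM   stg_customer s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   dim_customer d
                       WHERE  d.customer_id = s.customer_id
                       AND    d.current_flag = 'Y');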

Knowledge of change data capture using Informatica PowerExchange (PowerConnect) to capture changed data.

Experience in integrating various data sources such as Oracle, DB2, flat files, and XML files into an ODS, with good knowledge of Teradata 12.0/13.0/14.0, SQL Server 2000/2005/2008/2012, and MS Access 2003/2007.

Expertise in implementing complex business rules by creating re-usable transformations, Mapplets and Mappings.

Technical proficiency in data warehousing, combined with requirements gathering, data analysis, data modeling, business requirements analysis, application design, development and testing, data profiling, data standardization and quality control, and full-lifecycle implementation of data warehouses.

Experience in designing error and exception handling procedures to identify, record and report errors.

Extensive experience in developing and performance-tuning Informatica code.

Worked extensively onsite at client locations, with direct client involvement.

Designed and worked on mapping documents for input into ETL specifications by Identifying, analyzing, and profiling data sources.

Understood business rules fully from high-level design specifications and implemented the corresponding data transformation methodologies.

Experience writing and modifying ETL design documentation, test results documentation and standard operating procedures (SOP) documentation.

Created complex mappings using connected and unconnected Lookup, Aggregator, Stored Procedure, and Java transformations to populate target tables efficiently.

Experience in testing data using the mapping debugger; identified bugs in existing mappings by analyzing the data flow and evaluating transformations; monitored Informatica sessions and performance-tuned mappings and sessions.

Used SQL tools to run queries, tune them, and gather statistics to view and validate the data loaded into the warehouse, troubleshoot data issues, and enhance performance.

Optimized solutions using various performance-tuning methods: SQL tuning; ETL tuning (optimal configuration of transformations, targets, sources, mappings, and sessions); and database tuning using indexes, partitioning, materialized views, procedures, and functions.
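
As a brief sketch of the database-tuning side (Oracle syntax assumed; object names are hypothetical):

    -- Composite index to support frequent customer/date range scans.
    CREATE INDEX idx_orders_cust_date ON orders (customer_id, order_date);

    -- Pre-aggregated materialized view for daily sales reporting.
    CREATE MATERIALIZED VIEW mv_daily_sales
      BUILD IMMEDIATE
      REFRESH COMPLETE ON DEMAND
    AS
    SELECT order_date, SUM(amount) AS total_amount
    FROM   orders
    GROUP BY order_date;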

Adept in formulating Test plans, Test cases, Test Scenarios, Test Approaches, Setting up Test Environments in conjunction with the Testing team and expertise in formulating test approaches with Load Testing, Integration Testing, Functional Testing, and User Acceptance Testing (UAT).

Extensively used Autosys and Tidal for scheduling the UNIX shell scripts and Informatica workflows.

Extensive knowledge in all areas of Project Life Cycle Development.

Strong analytical, verbal, written and interpersonal skills.

Technical Skills:

ETL Tools

Informatica PowerCenter 10.5.2/9.x/8.x, Informatica PowerExchange 10.x/9.5, Informatica Data Quality (IDQ) 10.1/9.6, Informatica Data Transformation (v9.x), IICS (Cloud Data Integration), Data Privacy Manager, Tableau, Power BI, Dell Boomi.

Data Quality Tools

Informatica Analyst, Informatica Data Quality, Informatica Developer.

Data Modeling

MS Visio 2008, Erwin 9.5, ER/Studio 9.x/8.x/7.x.

Databases

Teradata 14/13/12, Oracle 12c/11g, MS SQL Server 2008, MS Access 2012, DB2 v9.

Reporting Tools

Tableau 9.x/10.x, QlikView 11, BOXI 11R2

Tools

SQL*Plus, PL/SQL Developer, SQL*Loader, SQL Server Management Studio, SQL Server Query Analyzer, SQL Server mail service, DBCC, BCP, SQL Server Profiler, Teradata utilities (FastLoad, MultiLoad, TPump, TPT).

Web Technologies

MS FrontPage, MS Outlook Express, FTP, TCP/IP.

Other Tools

Microsoft Office, Visual Basic 6, Apache Spark.

Scheduling Tools

Control-M, Autosys 11.3, Tidal, Windows Scheduler.

Languages

SQL, PL/SQL, Transact-SQL, UNIX shell scripting, data structures.

Operating Systems

Windows, Linux, UNIX, MS-DOS.

Educational Qualification:

Bachelor of Commerce & Computers from ANU, AP, in 2005.

Professional Experience:

Elevance Health Nov 2019 - Present

Richmond, Virginia.

Sr. Informatica & IICS Consultant

Responsibilities:

Worked with Informatica PowerCenter 10.5.2 in applying business rules and transforming data as per client requirement.

Handling Batch Failures and Daily Batch status reporting.

Worked on Slowly Changing Dimensions Type 1, Type 2, and Type 3.

Extracted data from mainframe sources using PowerExchange connectors.

Created various mappings in LDOs using transformations such as Union, Sorter, Joiner, Expression, and Aggregator.

Conducted regular monitoring and maintenance activities to optimize performance and troubleshoot issues.

Created LDOs (Logical Data Objects), PDOs (Physical Data Objects), and profiles for each JIRA story.

Monitored loads across different environments: STG, MDS, ODS, and MDE.

Performed root-cause analysis and fixes for failures in the EPDSv2 and SPS applications.

Monitored critical batch processes and performed problem solving, break fixes, and permanent resolutions for repetitive production failures.

Provide daily communication of support issues to all Data Integration Team members.

Keep abreast of new jobs going to production by attending weekly ETL Implementation review meetings.

Created daily status reports for the jobs running in FIT and handled failures.

Effectively used the IICS Data Integration console to create mapping templates that bring data into the staging layer from source systems such as SQL Server, Oracle, Teradata, Salesforce, flat files, and Excel files.

Experience working with IICS transformations such as Expression, Joiner, Union, Lookup, Sorter, Filter, and Normalizer, and with concepts such as macro fields to templatize column logic, smart match fields, and bulk field renaming.

Experience integrating Informatica PowerExchange with IICS to read data from condense files and load it into a SQL data warehouse environment.

Led the design and implementation of data integration workflows using Informatica Intelligent Cloud Services.

Created scorecards in Informatica Analyst to identify valid and invalid rows.

Gathering Business requirements and creating technical specifications along with creating Internal and External Design documents.

Experience using IICS with full pushdown optimization to move data from staging to the ODS at scale using data integration templates.

Experience using custom-built queries to load dimensions and facts in IICS.

Create Technical specification documents and work with Business Analysts on analysis of multiple source systems data as part of ETL mapping exercise.

Experience with data integration concepts including, but not limited to, mappings, mapping configuration tasks, taskflows, deployment using Git automation, schedules, connections, and API integration.

Experience with key-range partitioning in IICS, handling file loads with the file-list option, creating fixed-width file formats, and configuring file listeners.

Experience integrating data using IICS for reporting needs.

Conduct meetings with BSA team to understand the FRD (Functional Requirement Documents).

Maintain the quality in development and deliverables with automated and peer code reviews.

Analyzed the Business Requirement Documents (BRD) and laid out the steps for the data extraction, business logic implementation & loading into targets.

Responsible for Impact Analysis, upstream/downstream impacts.

Knowledge of the FACETS healthcare application, provider data, and vendor data.

Developed and maintained ETL (extract, transform, and load) mappings to extract data from multiple source systems such as Oracle, XML, SQL Server, and flat files and load it into Oracle.

Work on design and development of Informatica mappings, workflows to load data into staging area, data warehouse and data marts in Oracle.

Responsible for the technical design document and detailed mapping document; analyzed and understood functional requirements.

Develop the mappings using needed Transformations in Informatica tool according to technical specifications.

Implemented Informatica error handling, dynamic parameter file generations and coding best practices.

Used Informatica reusability at various levels of development.

Involved in performance tuning of Informatica mappings, stored procedures, and the SQL queries inside the Source Qualifier.

Worked on profiling, standardization, reference checks, data cleansing, and error logging using IDQ.

Conceptualized and developed initial and incremental data loads, incremental aggregation and PDO.

Resolved issues raised by customers, handled issues during the development and analysis phases, and fixed bugs; worked on new change requests (CRs) in production based on priority level.

Environment: Informatica PowerCenter 10.5.2/10.2, Informatica Developer 10.1, Informatica Data Quality 10.1, Informatica Intelligent Cloud Services (IICS), Teradata, Snowflake, Control-M, Oracle 11g/12c, SQL Developer, UNIX shell scripting, SQL Server 2008/2012, Autosys 11.3, FileZilla, PuTTY, SQL*Plus, Tableau, Hadoop, Toad for Oracle.

Asurion Corporation May 2017 - Oct 2019

Sterling, Virginia.

Lead - ETL / Informatica & IICS Developer

Responsibilities:

Developed Informatica ETL job flows to extract data from sources and load it into the data mart.

Develop the mappings using needed Transformations in Informatica tool according to technical specifications.

Implemented Informatica error handling, dynamic parameter file generations and coding best practices.

Designed several Processes on Informatica Data Quality and exposed them as RESTful API services to publish data to external systems.

Monitored loads across different environments: PowerCenter, CDC, and IICS.

Proficient in designing and implementing data integration workflows using IICS.

Maintained Informatica servers (memory and cache cleanup) and handled ETL code review and the migration process.

Led the design and implementation of data integration workflows using Informatica Intelligent Cloud Services.

Collaborated with cross-functional teams to understand business requirements and translate them into effective data integration solutions.

Implemented security measures and ensured compliance with data protection regulations.

Implemented data quality checks and validation processes to enhance data integrity.

Developed mappings using various transformations like Source Qualifier, Aggregator, Lookup (Connected/Unconnected), Joiner, Update Strategy, Sorter, Router and Expression.

Developed incremental insert/update process (SCD 1 and SCD 2).

Implemented an ingestion process using the Redshift database.

Implemented end to end monthly cubes load processes.

Implemented and executed test cases and recorded test results as part of data validation.

Used Informatica Intelligent Cloud Services (IICS) ETL to load data into the AWS Redshift cloud.

Worked on staging the data into worktables, cleanse, and load it further downstream into dimensions using Type 1 and Type 2 logic and fact tables which constitute the data warehouse.

Worked on Teradata queries, macros, and utilities such as MultiLoad and FastLoad; tuned queries and procedures.

Designed and Developed Real-time and Batch job using Informatica Power Exchange to read from Table and file.

Created Power Exchange Registration, Data Map, restart token and worked on recovery and restart process.

Supported the production environment at the L2 level and resolved many issues in a timely manner.

Involved in developing UNIX shell scripts to generate parameter files and execute Oracle procedures as batch jobs.

Environment: Informatica PowerCenter 10.0/9.6, PowerExchange 9.6, Informatica Data Quality 9.6.1, Informatica Intelligent Cloud Services (IICS), Oracle 11g, Erwin, VSAM, DB2 mainframe, flat files, FastLoad, MultiLoad, TPump, BTEQ, T-SQL, Teradata 13, TOAD, QlikView 11, Control-M, SQL Server, SQL*Plus, SQL*Loader, Toad for Oracle, Tableau, UNIX shell scripts.

Anthem Jun 2015 – Feb 2017

Richmond, Virginia.

Senior Informatica Developer & Administrator

Responsibilities:

Extracted, transformed, and loaded data from different sources into an Oracle database using Informatica PowerCenter as the ETL tool.

Install and configure Informatica PowerCenter or other Informatica products as needed.

Set up and configure repositories, integration services, and other components.

Manage user accounts and permissions within the Informatica environment.

Experienced in providing system administration and operational support in Unix environment.

Managed domain, node, and service configuration, and repository backups.

Implemented High availability across PROD/DEV/UAT environments.

Fixed platform related issues like Informatica services going down.

Fixed ETL workflows that were stuck in wait status or long-running.

Created and configured PowerCenter Repository Services and PowerCenter Integration Services.

Configured Model repository, data integration and content management services for IDQ.

Migrated code to different environments using XML exports and deployment groups.

Designed history logic and wrote efficient SQL in Teradata.

Involved in resolving complex production-level issues.

Worked extensively in shell scripting and click stream analysis.

Coordinated offshore work on unit test case preparation and documentation for implementation, warranty, and support.

Handling Batch Failures and Daily Batch status reporting.

Root cause analysis and fix for failures.

Critical batch process monitoring and Problem Solving.

Break Fixes and Permanent resolutions for repetitive production failures.

Wrote UNIX shell scripts to automate manual tasks.

Documented the process for further maintenance and production support.

Coordinated the Change Management process which involved driving the QA and production deployments.

Created Mapplets and Worklets using sets of transformations for reusability.

Extensively worked with various lookup caches like Static Cache, Dynamic Cache, and Persistent Cache.

Responsible for Support and Issue Resolutions using Session Logs.

Used Workflow Manager to create and configure workflows and session tasks to load data; used Informatica Workflow Monitor to track workflows and handle process failures. Performed testing per the unit test cases (UTC).

Developed UNIX Shell scripts to archive files after extracting and loading data to Warehouse.

Environment: Informatica PowerCenter 9.5.1, Informatica PowerExchange 9.5.1, Source Analyzer, UNIX, Windows XP, Netezza, SQL Server, Teradata 12, SQL, PL/SQL, PVCS, Erwin.

Wellpoint Sep 2014 – May 2015

Plano, Texas.

ETL/ Informatica Developer

Responsibilities:

Develop the mappings using needed Transformations in Informatica tool according to technical specifications.

Responsible for the technical design document and detailed mapping document.

Knowledge of state-submission claims file formats and the business logic used in different states.

Knowledge of the relationships between tables and state business information.

Used transformations like Lookup, Joiner, Update Strategy, Aggregators, Filters, Expression, Router, Source Qualifier, Union, and Sequence Generator.

Extensive use of Informatica Power Center tools -- Source Analyzer, Target Designer, Mapping Designer, Workflow Manager and Workflow Monitor.

Participated in Unit Testing and prepared Unit Test Cases, Unit Test Logs as a part of it.

Responsible for team learning and knowledge sharing sessions.

Migration of code to QA, Pre-prod, and Production Environment.

Provided Process Improvement and existing code optimization methods.

Interacted with onsite staff and resolved offshore issues.

Developed Informatica ETL jobs to extract data from sources and load it into Oracle.

Implemented Informatica error handling, dynamic parameter file generations and coding best practices.

Responsible for creating Source to Target data mapping sheet and data modeling.

Used Informatica reusability at various levels of development.

Attended weekly status meetings and provided detailed status reports to the client.

Involved in performance tuning of Informatica mappings, stored procedures, and the SQL queries inside the Source Qualifier.

Involved in developing UNIX shell scripts to generate parameter files and execute Oracle procedures as batch jobs.

Preparing Test Results, Unit testing, Defect Fixing.

Responsible for performance tuning at the mapping, session, source, and target levels.

Handled various loads, such as intraday, daily, weekly, monthly, and quarterly loads, using an incremental loading technique.
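
A minimal sketch of one such watermark-driven incremental load (the control table and all names are hypothetical):

    -- Extract only rows changed since the last successful load.
    SELECT s.*
    FROM   src_transactions s
    WHERE  s.updated_at > (SELECT last_load_ts
                           FROM   etl_control
                           WHERE  job_name = 'DAILY_TXN_LOAD');

    -- After a successful load, advance the watermark.
    UPDATE etl_control
    SET    last_load_ts = CURRENT_TIMESTAMP
    WHERE  job_name = 'DAILY_TXN_LOAD';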

Responsible for error handling using session logs and reject files in the Workflow Monitor.

Designed, built, and maintained Business Objects reports; created master-detail, chart, and crosstab reports using BO features such as breaks, ranks, alerters, inserted calculations, filters, sorts, and query prompts to present data in an analyzed way.

Environment: Informatica PowerCenter 8.6, Informatica Power Exchange 8.6, Oracle 9i, SQL, PL/SQL, Shell Scripts, Teradata, Windows XP, UNIX, Informatica Scheduler, Business Objects.

PepsiCo Dec 2013 – Aug 2014

Plano, Texas.

ETL/Informatica Developer

Responsibilities:

Developed Informatica ETL jobs to extract data from sources and load it into Oracle.

Develop the mappings using needed Transformations in Informatica tool according to technical specifications.

Responsible for the technical design document and detailed mapping document.

Extensively used Informatica Power Center and created mappings using transformations like Source Qualifier, Joiner, Aggregator, Expression, Filter, Router, Lookup, Update Strategy, and Sequence Generator.

Used Informatica reusability at various levels of development.

Developed ETL code per the business logic defined in the Functional Design Document.

Break Fixes and Permanent resolutions for repetitive production failures.

Provide daily communication of support issues to all Data Integration Team members.

Keep abreast of new jobs going to production by attending weekly ETL Implementation review meetings.

Created daily status reports for the jobs running in FIT and handled failures.

Performed data manipulations using various Informatica transformations such as Expression, Aggregator, Filter, Update Strategy, and Sequence Generator.

Created sessions and workflows for designed mappings.

Performed testing and creating test cases for new code changes (ETL & SP).

Worked closely with application teams to prepare source-to-target mapping (STM) documents to implement the changes.

Worked on creating reports for decision support.

Involved in analysis of the data in the development and test environment.

Worked with the analog team to test the existing application.

Involved in logging defects in JIRA and monitoring progress until changes reached the UAT environment.

Involved in performance tuning and optimization of Informatica mappings and sessions using features like partitions and data/index cache to manage very large volume of data.

Preparing Test Results, Unit testing, Defect Fixing.

Environment: Informatica PowerCenter 9.1, Oracle 10g/9i/8i, UNIX, mainframe, MS Access, SQL, PL/SQL, MySQL, Sybase, DB2, Teradata, Autosys, HP Quality Centre, SM.

GE Apr 2013 – Nov 2013

Detroit Metropolitan Area.

ETL Developer

Responsibilities:

Involved in business analysis and technical design sessions with business and technical staff to develop requirements documents and ETL specifications.

Analyzed all technical and functional documents.

Involved in extracting data from Oracle and flat-file sources.

Extensively developed mappings according to the design document.

Tested all applicable business rules and session properties in Informatica.

Used Mapping Parameters and session parameters.

Used Target Load Order for populating various targets within the same mapping.

Implemented Slowly Changing Dimension Type 2 with a current flag to maintain historical information.

Developed Unit tests and separate User Acceptance test cases (UAT) to validate data against test scenarios.

Involved in Scheduling the Workflows to follow the order of loading the data to target tables.

Migrating Informatica mappings from Development to QA.

Involved in designing dimensional modeling and data modeling using Erwin tool.

Created high-level Technical Design Document and Unit Test Plans.

Developed mapping logic using various transformations like Expression, Lookups (Connected and Unconnected), Joiner, Filter, Sorter, Update strategy and Sequence generator.

Experience in Performance tuning & Optimization of SQL statements using SQL trace.

Involved in Unit, System integration, User Acceptance Testing of Mapping.

Supported the process steps under development, test, and production environment.

Environment: Informatica PowerCenter 8.6.4/7.1.4, Oracle 10g/9i, TOAD, Business Objects 6.5/XIR2, UNIX, ClearCase.


