
Suprasanna Rath

+1-630-***-****

ade3ho@r.postjobfree.com

Professional Summary

Around 9 years of professional experience as a software developer covering the full software development life cycle: requirements analysis, design, coding, testing, documentation, and implementation of Business Intelligence solutions using data warehousing tools.

Extensive work experience in Informatica PowerCenter/DIH/IDQ, MDM, Spotfire, Tableau, data modelling, and Oracle while executing assignments with various global customers.

Hands-on experience in AXON and EDC configurations, bulk uploads, and creating and scanning resources.

Well conversant with data warehousing methodologies.

Experience in ETL operations to load data from different source systems such as flat files, XML sources, Oracle, MS SQL Server, IBM DB2, and Excel files into data warehouses and data marts using Informatica PowerCenter 10.4, 9.1, and 9.0.1.

Expertise in Tableau 8, 9, and 10.5 and Spotfire 7 as reporting tools.

Have worked on data warehousing development, enhancement, and production support projects.

Received a Pat on the Back award for the GE and Nissan projects, as well as a Best Project award.

My strengths include good communication skills, being an amicable team player, and the ability to take initiative.

Technical Expertise

BI Tools

Informatica, Tableau, Spotfire, Oracle, basics of Hadoop

Languages

SQL, PL/SQL, R

Tools and utilities

MS Office, WinSCP, HP Reflection, Command Editor, QMF, etc.

Database

Oracle 10g/9i/8i, MS SQL Server 2008, SQL Developer, MS Access, IBM DB2/UDB, and Sybase

Operating Systems

Microsoft Win XP/Vista/7/8, Mac, Unix

Web Related

HTML, Java Technology

Domain Knowledge

Manufacturing, Automobile, Transportation

Educational Qualification

Qualifying Exam | Board of Education | Year | Division/Class
B.Tech (Electronics and Telecommunication Engg) | Biju Patnaik University of Technology | 2011 | 1st
Intermediate (10+2) | CBSE | 2007 | 1st
Secondary School, Class X | CBSE | 2005 | 1st

Current Project Experience

MasterCard (O Fallon, MO)

Data Inventory (Senior Software Engineer) (Aug ’19 – Present)

Metadata Management provides the ability to discover resources; to capture and organize them; and to integrate, access, share, link, and maintain metadata across an organization. The Metadata Management platform can load metadata from almost any source into a central metadata repository, enabling information to be analysed, impact analysis to be performed, data lineage to be shown, data relationships and structural metadata to be provided, and archiving and preservation of resources to be supported. Metadata assists in resource discovery by "allowing resources to be found by relevant criteria, identifying resources, bringing similar resources together, distinguishing dissimilar resources, and giving location information."

Data Governance plays a key role in metadata management, covering guidelines, strategies, policies, and implementation.

Responsibilities

Managing a team and providing knowledge transfer for new resources so they understand the end-to-end environment and the processes to adhere to.

Led interactions with Business Analysts to understand the business requirements; involved in analysing requirements to refine transformations.

Used the IDQ tool for data profiling.

Used the IDQ tool to implement business logic.

Involved in all phases of the SDLC (design, development, testing, and implementation) for ETL processes end to end using Informatica PowerCenter and Informatica Cloud.

Responsible for project estimates, design documents, and resource utilization and allocation.

Involved in the analysis of source-to-target mappings provided by data analysts, and prepared functional and technical design documents.

Worked on shell scripting to trigger Informatica workflows and for file management and FTP processes.

Worked closely with DBAs, application and ETL developers, and change control management to migrate developed mappings to QA and PROD.

Supported system and user acceptance testing activities, including issue resolution.

Integrated the Informatica platform with the Hadoop environment.

Established the connection setup to consume Hadoop environment data for IDQ profiling and data quality assignments.

Interacted with the Informatica vendor on performance improvements in the IDQ platform.

Understood the business needs around data quality profiling and provided solutions.

End-to-end workflow creation to automate the auto-refresh process for metadata loads into EDC.

Extracted metadata from MasterCard applications and loaded it into EDC for metadata management purposes.

Involved in writing shell scripts to schedule and automate the IDQ jobs.

Involved in developing IDQ application jobs for the ODI Sanctions migration project and changing existing code based on business needs.

Addressed issues and requests from IDQ application teams and answered their queries.

Conducted reviews, identified gaps, and recommended and implemented changes to release and change management policies on a regular basis.

Documented all efforts with root-cause analysis and review.

Performed technical analysis, ETL design, development, testing, and deployment of IT solutions as needed by business or IT.

Explored prebuilt ETL metadata and mappings; developed and maintained SQL code as needed for Oracle.

Responsible for tuning ETL mappings, workflows, and the underlying data model to optimize load and query performance.

Designed the ETL process using Informatica to load data from sources to targets through data transformations.

Developed complex ETL jobs from various sources such as SQL Server, PostgreSQL, and flat files, and loaded them into target databases using Informatica.

Designed the Informatica ETL flow to load data into Hive tables and created the Informatica jobs to load data into Oracle and Hive tables.

Configured the Hive tables to load the profitability system in the Informatica ETL repository and created the Hadoop connection for the HDFS cluster in the Informatica ETL repository.
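
For illustration, a minimal HiveQL sketch of the kind of external table such a load could target; the table, columns, and HDFS path are all hypothetical.

    -- Hypothetical Hive external table used as an ETL target.
    CREATE EXTERNAL TABLE IF NOT EXISTS profitability_stg (
      account_id STRING,
      txn_date   STRING,
      amount     DECIMAL(18,2)
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
    STORED AS TEXTFILE
    LOCATION '/data/staging/profitability';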

Developed the ETL mappings for XML, CSV, and TXT sources, loading the data from these sources into relational tables with Informatica; developed joblets for reusability and to improve performance.

Developed complex ETL mappings using Informatica to move data from our operational database to the data warehouse.

Created staging tables to load data into the warehouse as part of ETL tasks in Informatica.

Created ETL pipelines in and out of data warehouse using Snowflake’s SnowSQL.

Connected to and configured Snowflake sources and targets.

Imported Snowflake tables and created mappings using Snowflake objects and Snowflake data types.

Created workflows against Snowflake in the Informatica PowerCenter environment and ran them with the help of schedulers as per the business requirements.
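
A minimal SnowSQL-style sketch of one such pipeline step, assuming a hypothetical stage, file, and target table (PUT runs from the SnowSQL client):

    -- Stage a local CSV file and load it into a warehouse table; all names are illustrative.
    CREATE OR REPLACE STAGE etl_stage FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);
    PUT file:///data/orders.csv @etl_stage;
    COPY INTO dw.orders
      FROM @etl_stage/orders.csv
      FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
      ON_ERROR = 'ABORT_STATEMENT';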

Nissan North America RMP (Franklin, TN) (Informatica Developer Lead) (Mar ’16 - Aug ’19)

This project automated the regional marginal profitability report and the customer experience report, which the business had previously produced in Excel. To reduce manual effort and increase productivity, the reports were analysed and traced back to the various source systems, and ETL was used to pull the data; Tableau reports were then built on top for the visualizations. It involved the end-to-end software development cycle.

Responsibilities

Requirement gathering and estimating the effort for the projects.

Created and modelled the design of the data warehouse.

Documented solution requirements, translated those requirements into technical specifications and work plans, and provided technical leadership around those work plans.

Worked on performance improvement of the code.

Extensively used the Data Masking transformation to mask sensitive data.

Extensively used various data cleansing and data conversion functions like RTRIM, LTRIM, ISNULL, ISDATE, TO_DATE, DECODE, SUBSTR, INSTR, REPLACESTR, and IIF in Expression transformations.
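
As an illustration, a SQL analog of that cleansing logic (in the mappings these calls live in Expression transformations; the table and columns are hypothetical):

    -- Typical trim, date-conversion, decode, and substring cleansing on a staging table.
    SELECT RTRIM(LTRIM(cust_name))                      AS cust_name,
           TO_DATE(order_dt_txt, 'YYYY-MM-DD')          AS order_dt,
           DECODE(status_cd, 'A', 'Active', 'Inactive') AS status_desc,
           SUBSTR(phone, INSTR(phone, '-') + 1)         AS phone_local
    FROM   stg_orders;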

Worked on creating Mapping/Session/Workflow Variables and Parameters.

Extensively used pushdown optimization to improve load performance.

Implemented incremental extraction using the MD5 function.
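
A sketch of the idea in Oracle SQL, assuming Oracle 12c+ where STANDARD_HASH offers MD5 and a warehouse table that stores the row hash in a RAW column (in the mappings this was Informatica's MD5() function; all names are hypothetical):

    -- Extract only rows that are new or whose hashed attributes differ from the stored digest.
    SELECT s.*
    FROM   src_customer s
    LEFT   JOIN dw_customer d ON d.cust_id = s.cust_id
    WHERE  d.cust_id IS NULL
       OR  STANDARD_HASH(s.cust_name || '|' || s.cust_addr, 'MD5') <> d.row_md5;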

Created Informatica mappings using various transformations such as SAP BAPI/RFC, XML transformations (Generator and Parser), Web Services Consumer (REST API), XML, HTTP transformation, Source Qualifier, Expression, Lookup, Stored Procedure, Aggregator, Update Strategy, Joiner, Union, Filter, and Router.

Adhered to Informatica B2B Data Transformation, SDLC, and CDW standards, policies, procedures, and industry best practices with respect to system design, architecture, naming and coding standards, and system development lifecycle steps.

Worked closely with business application teams, business analysts, data architects, database administrators, and BIS reporting teams to ensure the ETL solution met business requirements. Made performance improvement changes to existing Informatica code and queries using explain plans. Well versed in database partitioning and in improving Informatica mapping performance through partitioning. Processed large volumes of data and large files.

Scheduled the Informatica workflows using the Tivoli scheduling tool and troubleshot workflow issues.

Enhanced existing mappings and workflows with new business requirements and reported project status to management.

Also worked as a post-go-live support/on-call production support consultant, monitoring the production applications.

Developed project plans with team members to ensure Informatica solutions were executed and delivered per plan with good quality. Mentored team members in the design and development of mappings and workflows in an onsite/offshore execution model.

Worked with static, persistent, and dynamic caches, index and data caches, and target-based commit intervals to improve performance at the session level.

Involved in performance tuning of the ETL process.

Involved in tuning SQL queries manually using EXPLAIN PLAN.
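
For reference, the standard Oracle pattern (the query shown is a placeholder):

    EXPLAIN PLAN FOR
      SELECT * FROM dw_sales WHERE sale_dt >= DATE '2016-01-01';
    -- Render the plan just captured in PLAN_TABLE.
    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);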

Very good experience in debugging code.

Worked closely with the QA team to fix bugs.

Provided estimates for tasks and enhancements.

Designed new sets of tables and views in the reporting (Oracle) database.
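
A trivial sketch of the kind of reporting objects involved, in Oracle SQL with hypothetical names:

    CREATE TABLE rpt_margin_monthly (
      region_cd  VARCHAR2(10),
      fiscal_mo  DATE,
      margin_amt NUMBER(18,2)
    );
    -- Year-to-date rollup exposed to the reporting layer as a view.
    CREATE OR REPLACE VIEW v_rpt_margin_ytd AS
    SELECT region_cd, SUM(margin_amt) AS margin_ytd
    FROM   rpt_margin_monthly
    WHERE  fiscal_mo >= TRUNC(SYSDATE, 'YYYY')
    GROUP  BY region_cd;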

Created mappings, mapplets, sessions, and workflows in the Informatica PowerCenter tool.

Migrated the components to the QA and production environments.

Working on minor changes as per the business.

Coordinated with the offshore team and worked on job failure issues.

Performing Unit Testing.

Created reports in Tableau and supported them.

Created ad hoc reports in Tableau as needed.

Helped the users during UAT and fixed UAT issues.

Helped the team as a backup Informatica MDM developer.

Environment: Tools & Utilities: Informatica 10.6, Oracle, Tableau, Informatica DIH

Nissan North America BIDW (Bhubaneshwar, India) Informatica Team Lead (Dec’13 – Mar ’16)

The project objective was to provide support and enhancements for the Tableau reports covering the Financial, Manufacturing, Supply Chain, and Procurement subject areas for GE Energy gas engines, with Oracle EBS 11i as the source. GE's business spans almost worldwide, and its detailed sales, transaction, profit, and invoice activity is captured using the Tableau reporting tool, which lets GE business users browse each subject area, dashboard, and report to verify sales effectiveness in different areas. The work involved the full Informatica and Tableau flow.

Responsibilities

Requirement Gathering.

Analysis.

ETL Design in Informatica

Created dimensions and fact tables for the project.

Used the Update Strategy transformation and implemented SCD type 3 to effectively migrate data from source to target.
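
A minimal SCD type-3 sketch in Oracle-style SQL, assuming the dimension keeps the prior value of the tracked attribute in a separate column; all names are hypothetical:

    MERGE INTO dim_customer d
    USING stg_customer s
    ON (d.cust_id = s.cust_id)
    WHEN MATCHED THEN UPDATE SET
           d.segment_prev = d.segment,   -- shift the current value into the history column
           d.segment      = s.segment
         WHERE d.segment <> s.segment    -- only when the tracked attribute changed
    WHEN NOT MATCHED THEN INSERT (cust_id, segment, segment_prev)
         VALUES (s.cust_id, s.segment, NULL);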

Moved the mappings from development environment to test environment and then to production environments.

Worked on lookup, aggregator and joiner caches for performance enhancements.

Involved in designing the testing plan (unit testing and system testing).

Prepared the technical design and mapping specification documents for the project.

Interacted with the requirements team and designers to gain a working knowledge of the business logic.

Review the Master Data Workbooks for better understanding of business logic.

Conducted review sessions with SMEs and business users for a better understanding of the requirements.

Created tasks and workflows in the Workflow Manager and monitored the sessions in the Workflow Monitor.

Set up developer access for the Development, QA, and Production environments in Informatica.

Set up Project Folders in all Environments.

Extracted data from flat files, Oracle, and Netezza to load into the database.

Responsible for in-progress code enhancements for Control Tower.

Extensively used Normal Join, Full Outer Join, Detail Outer Join, and Master Outer Join in the Joiner transformation.

Extracted data and applied data masking rules before loading into the target.

Incremental extraction using the MD5 function.

Extensively used various transformations like Filter, Router, Sequence Generator, Lookups, Update Strategy, Joiner, Source Qualifier, Expression, Sorter, and Aggregator.

Extensively used mapping variables and mapping parameters for capturing delta loads.

Responsible for daily production data load activities and for troubleshooting any ETL-related issues.

Responsible for resolving daily production incidents raised by users within the agreed SLAs.

Involved in weekly status calls with the customer for status updates.

Unit testing of the ETL jobs.

Checking quality and performance.

Enhancements to Spotfire reports.

Environment: Tools & Utilities: Informatica 10.4, Spotfire, Oracle

GE Transportation (Bhubaneshwar, India) Informatica ETL Developer (Mar ’12 – Dec ’13)

Developed the DWH by fetching data from ERP systems and built Cognos/OBIEE reports on top of it.

The Parts DWH provides all parts and repairs data to business users. It holds detail-level data for all turbine parts sold by GE to customers, along with the related financial and contract data, and makes these details available to business users.

Responsibilities

Requirement Gathering.

Analysis.

ETL Design in Informatica.

Performed technical analysis, ETL design, development, testing, and deployment of IT solutions as needed by business or IT.

Explored prebuilt ETL metadata and mappings; developed and maintained SQL code as needed for Oracle.

Responsible for tuning ETL mappings, workflows, and the underlying data model to optimize load and query performance.

Designed the ETL process using Informatica to load data from sources to targets through data transformations.

Developed complex ETL jobs from various sources such as SQL Server, PostgreSQL, and flat files, and loaded them into target databases using Informatica.

Designed the Informatica ETL flow to load data into Hive tables and created the Informatica jobs to load data into Oracle and Hive tables.

Configured the Hive tables to load the profitability system in the Informatica ETL repository and created the Hadoop connection for the HDFS cluster in the Informatica ETL repository.

Developed the ETL mappings for XML, CSV, and TXT sources, loading the data from these sources into relational tables with Informatica; developed joblets for reusability and to improve performance.

Developed complex ETL mappings using Informatica to move data from our operational database to the data warehouse.

Created staging tables to load data into the warehouse as part of ETL tasks in Informatica.

Responsible for gathering the business requirements and converting them into technical specifications.

Worked with Informatica PowerCenter tools like Designer, Workflow Manager, Repository Manager, Source Analyzer, Warehouse Designer, Mapping & Mapplet Designer, and Transformation Developer.

Developed mappings using Source Qualifier, Joiner, Lookup (connected and unconnected), Expression, Filter, Router, Aggregator, Sorter, Update Strategy, and Normalizer transformations.

Worked with Source Qualifier overrides and lookup overrides to customize the SQL queries.

Created pre- and post-session stored procedures to drop and recreate the indexes and keys of source and target tables.
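
A minimal PL/SQL sketch of that pattern, with hypothetical index and table names; the first procedure would be wired to the pre-session task, the second to the post-session task:

    CREATE OR REPLACE PROCEDURE drop_sales_idx AS
    BEGIN
      -- Drop the index so the bulk load avoids index maintenance.
      EXECUTE IMMEDIATE 'DROP INDEX idx_fact_sales_dt';
    END;
    /
    CREATE OR REPLACE PROCEDURE create_sales_idx AS
    BEGIN
      -- Rebuild the index once the load completes.
      EXECUTE IMMEDIATE 'CREATE INDEX idx_fact_sales_dt ON fact_sales (sale_dt)';
    END;
    /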

Implemented Type 1 and Type 2 slowly changing dimensions for dimension and fact loads.
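
A compact SCD type-2 sketch in SQL, assuming effective/expiry dates and a current-row flag on the dimension (all names hypothetical): changed rows are first expired, then re-inserted as the new version.

    -- 1) Close out the current version of rows whose tracked attribute changed.
    UPDATE dim_customer d
    SET    expiry_dt = SYSDATE, current_flag = 'N'
    WHERE  d.current_flag = 'Y'
      AND  EXISTS (SELECT 1 FROM stg_customer s
                   WHERE s.cust_id = d.cust_id AND s.segment <> d.segment);
    -- 2) Insert a fresh current version for changed and brand-new customers.
    INSERT INTO dim_customer (cust_id, segment, effective_dt, expiry_dt, current_flag)
    SELECT s.cust_id, s.segment, SYSDATE, NULL, 'Y'
    FROM   stg_customer s
    WHERE  NOT EXISTS (SELECT 1 FROM dim_customer d
                       WHERE d.cust_id = s.cust_id AND d.current_flag = 'Y');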

Responsible for performance tuning at Mapping level, Session level, Source and Target level.

Worked with static, persistent, and dynamic caches, index and data caches, and target-based commit intervals to improve performance at the session level.

Responsible for daily production data load activities and for troubleshooting any ETL-related issues.

Responsible for resolving daily production incidents raised by users within the agreed SLAs.

Involved in weekly status calls with the customer for status updates.

Unit testing of the ETL jobs.

Checking quality and performance.

Enhancements to OBIEE and Spotfire reports.

Worked with the business to verify the data on OBIEE reports and performed data analysis on top of the reports.

Environment: Tools & Utilities: Informatica 10.2, Cognos, Talend, Tableau, OBIEE

Personal Details

Name: Suprasanna Rath
Phone Number: +1-630-***-****
Email Id: ade3ho@r.postjobfree.com
Passport No.: J3704037 (valid up to 18/04/2021)
Type of Visa Held (if any): H1B
Location: O Fallon, Missouri, USA


