
Power Center Data Warehouse

Location:
Bellevue, WA
Posted:
August 09, 2023


Resume:

CHAITANYA

Email: adytb5@r.postjobfree.com

Phone: +1-425-***-****

Professional Summary:

Over 10 years of experience in Information Technology, including data warehouse development using ETL/Informatica Power Center, Informatica Intelligent Cloud Services (IICS) and the Snowflake cloud database.

Experience in all the phases of Data warehouse life cycle involving Requirement Analysis, Design, Coding, Testing, and Deployment.

Extensive experience in developing ETL mappings, scripts and data integration using Informatica Power Center.

Good experience in designing and implementing data warehousing and Business Intelligence solutions using ETL tools such as Informatica Power Center and Informatica Intelligent Cloud Services (IICS).

Experience using Informatica IICS tools effectively for data integration and data migration from multiple source systems.

Experience with IICS concepts relating to data integration, monitoring, administration, deployments and schedules.

Created Kafka connector connections in IICS for streaming ingestion tasks.

Loaded streaming data into Snowflake via the Snowpipe process using the Kafka connector.

Applied transformations to the ingested data in IICS.
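A Kafka-fed Snowpipe ingestion of this kind might be sketched as follows in Snowflake SQL; the stage, table and pipe names here are hypothetical illustrations, not objects from the project:

```sql
-- Hypothetical names throughout; a minimal Snowpipe sketch.
-- The Kafka connector lands JSON event files in an external stage.
CREATE OR REPLACE STAGE kafka_stage
  URL = 's3://example-bucket/kafka-events/';

-- Raw landing table holding one VARIANT column per event.
CREATE OR REPLACE TABLE raw_events (payload VARIANT);

-- Snowpipe continuously copies newly arrived files into the table.
CREATE OR REPLACE PIPE events_pipe AUTO_INGEST = TRUE AS
  COPY INTO raw_events
  FROM @kafka_stage
  FILE_FORMAT = (TYPE = 'JSON');
```

With AUTO_INGEST enabled, cloud storage event notifications trigger the pipe, and downstream IICS mappings can then read from the landing table.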

Strong experience in migrating other databases to Snowflake

Strong skills in using Informatica Power Center to build ETL mappings, sessions and workflows (Informatica 10.x).

Worked on the Snowflake code migration process using Snowpipe. Knowledge of Snowflake multi-cluster warehouses, virtual warehouses, SnowSQL and Snowpipe.

Exposure on using Snowflake Database, Schema and Table structures.

Worked extensively with Dimensional modeling, Data migration, Data cleansing, ETL Processes for data warehouses.

Proficient using query tools like TOAD, SQL Developer, PL/SQL developer and Teradata SQL Assistant.

Extracted data from various databases and flat-file formats such as JSON, XML, CSV and Excel. Worked on migrating data from on-premises storage to the cloud using AWS S3 and Snowflake.
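An on-premises-to-Snowflake bulk load like the one described above is commonly expressed as a staged COPY; the following is a minimal sketch, assuming extract files have already been uploaded to S3 (bucket, stage and table names are hypothetical):

```sql
-- Hypothetical names; a minimal on-premises-to-Snowflake load sketch.
CREATE OR REPLACE STAGE migration_stage
  URL = 's3://example-bucket/onprem-extracts/'
  FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);

-- Bulk-load all customer extract files into a staging table.
COPY INTO customer_stg
  FROM @migration_stage
  PATTERN = '.*customers.*[.]csv';
```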

Working experience with Informatica Power Center in all stages of design, development and implementation of data mappings, mapplets and sessions using Informatica Power Center, Oracle, SQL and UNIX.

Strong working experience in all phases of development including Data Marts using IICS Informatica Cloud and Power Center.

Experienced in data visualization, using Tableau to create graphs, charts and dashboards.

Worked on SQL tuning and optimization of Business Objects reports.

Extensively tested Business Objects reports, dashboards and other Business Objects applications in a new environment.

Extensively worked with SQL and PL/SQL.

Good experience with Snowflake Cloud Data warehouse, AWS S3.

Strong experience in Data Warehouse design, build, implementation and Data Migration.

Have solid experience writing SQL queries and experience with one or more database platforms and appliances such as Netezza, SQL Server, Oracle and DB2.


Good experience with Snowflake utility SnowSQL.

Experience working with PII data in banking domain.

Experience in performance tuning of database queries.

Extensive experience in writing complex SQL queries, stored procedures, views, functions, triggers, indexes and exception handling using MS SQL Server (T-SQL).

Experience in UNIX environment, file transfers and job scheduling.

Experience working with databases queries, stored procedures, views (SQL Server preferred).

Strong in writing complex SQL queries, functions and views in Oracle PL/SQL & SQL (9i/10g/11g); used database utility tools like SQL Navigator, SQL*Plus and SQL Developer.

Worked on Informatica Performance Tuning identifying and resolving performance bottlenecks in various levels like sources, targets, mappings and sessions.

Experience in Data Warehousing, SAP Data Migration, Data Cleansing.

Created staging database to perform various data migration operations such as data profiling, data conversion and verification.

Involved in Unit testing, System testing to check whether the data loads into target are accurate.

Experienced in ETL Informatica & Database Upgrade Testing.

Worked closely with MDM teams to understand their needs in terms of data for their landing tables.

Created various profiles using Informatica Data Explorer (IDE) and IDQ from existing sources, and shared those profiles with business analysts for their analysis in defining business strategies (e.g., assigning matching scores for different criteria).

Experience working in agile methodology and ability to manage change effectively.

Extensive experience in writing UNIX shell scripts and automation of the ETL processes using UNIX shell scripting.

Good understanding of Control-M for scheduling jobs.

Proficient in the Integration of various data sources with multiple relational databases like Oracle, MS SQL Server, DB2, Teradata, VSAM files and Flat Files into the staging area, ODS, Data Warehouse and DataMart.

Experience in using automation scheduling tools like Control-M.

Hands-on experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data mapping, build, unit testing, systems integration and user acceptance testing.

Developed both one-time and real-time mappings using Power Center, Power Exchange.

Excellent communication and interpersonal skills; enthusiastic self-starter, eager to meet challenges and quickly assimilate the latest technology concepts and ideas.

Technical Skills:

Operating Systems

Windows, Linux, HP-UX

ETL Tools

Informatica Power Center, Informatica Cloud (IICS), Snowflake Cloud

Databases

Oracle, MS SQL Server, Teradata

Software

Excel, PowerPoint, Word

Programming tools

SQL, PL/SQL; basic knowledge of Python

Methodologies

Agile, Waterfall

Education Details: BTech (EEE) Swarnandhra College of Engineering and Technology 2013

Client: PNC Bank, Pittsburgh, USA

Company: CGI, India April 2021 – July 2023

Role: ETL Informatica /IICS Developer

Responsibilities:

Understanding the Business rules and sourcing the data from multiple source systems using IICS

Worked on converting Informatica Power Center ETL code to IICS using PC-to-Cloud conversion services.

Worked on macro variables for reusable functions across multiple columns.

Developed ETL programs using Informatica to implement the business requirements.

Worked closely with business users to discuss the issues and requirements.

Created shell scripts to fine tune the ETL flow of the Informatica work flows.

Redesigned views in Snowflake to increase performance.

Worked extensively with Dimensional modeling, Data migration, Data cleansing, ETL Processes for data warehouses.

Created tables, views, secure views, user defined functions in Snowflake Cloud Data Warehouse.
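The Snowflake objects mentioned above might be sketched as follows; the view, table and function names are hypothetical examples, not the project's actual objects:

```sql
-- Hypothetical objects; a minimal secure-view and SQL-UDF sketch.
-- A secure view hides its definition and underlying data from non-owners.
CREATE OR REPLACE SECURE VIEW v_active_accounts AS
  SELECT account_id, balance
  FROM accounts
  WHERE status = 'ACTIVE';

-- Scalar SQL UDF; the body is a single expression.
CREATE OR REPLACE FUNCTION usd_to_cents(amount NUMBER)
  RETURNS NUMBER
  AS 'amount * 100';
```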

Experience working with AWS, Azure and Google Cloud data services.

Collected and processed requirements for dashboards and data visualization (using Tableau) for reports based on the findings of data mining

Extensively tested Business Objects reports, dashboards and other Business Objects Applications in new environment.

Created complex mappings like SCD type1, type2 by using Informatica based on business requirements.

Good working knowledge of ETL tools, particularly Informatica.

Worked on EDW modules by retrieving the data from the different source systems databases like DB2, Oracle, Netezza, MS SQL Server, Flat files and then loading it into target databases of Netezza/ MS-SQL Server Database using IBM Data Stage parallel jobs.

Created staging database to perform various data migration operations such as data profiling, data conversion and verification.

Used Informatica file watch events to poll the FTP sites for external mainframe files.

Migrated data from one database to another using Informatica.

Worked closely with MDM teams to understand their needs in terms of data for their landing tables.

Worked with data architects and DBAs on creating logical and physical data models.

Worked on Informatica Performance Tuning identifying and resolving performance bottlenecks in various levels like sources, targets, mappings and sessions.

Created Netezza SQL scripts to verify that tables loaded correctly.

Responsible for determining bottlenecks and fixing them through performance tuning in the Netezza database.

Provided production support to resolve ongoing issues and troubleshoot problems.

Performance tuning was done at the functional level and map level. Used relational SQL wherever possible to minimize the data transfer over the network.

Effectively used Informatica parameter files for defining mapping variables, workflow variables, FTP connections and relational connections.

Involved in enhancements and maintenance activities of the data warehouse including tuning, modifying of stored procedures for code enhancements.

Effectively worked in Informatica version based environment and used deployment groups to migrate the objects.

Used debugger in identifying bugs in existing mappings by analyzing data flow, evaluating transformations.

Worked effectively in an onsite/offshore delivery model.

Designed workflows with many sessions with decision, assignment task, event wait, and event raise tasks, used Informatica scheduler to schedule jobs.

Reviewed and analyzed functional requirements and mapping documents; performed problem solving and troubleshooting.

Performed unit testing at various levels of the ETL and actively involved in team code reviews.

Identified problems in existing production data and developed one time scripts to correct them.

Fixed invalid mappings and troubleshot technical problems in the database.

Environment: IICS/Informatica Power Center, Oracle Developer, Oracle 11g, Mainframe, SQL Server 2012, TOAD, UNIX, HP Quality Center, Snowflake

Client: Fifth Third Bank, Cincinnati, USA

Company: SLK Software, India Jan 2020 – Mar 2021

Role: ETL Informatica developer

Responsibilities:

Development of ETL using Informatica 8.6.

Worked extensively with Dimensional modeling, Data migration, Data cleansing, ETL Processes for data warehouses.

Worked on Oracle databases and Snowflake.

Applied slowly changing dimensions (Type 1 and Type 2) effectively to handle delta loads.
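In Informatica an SCD Type 2 delta load is typically built with Lookup and Update Strategy transformations; the same effect might be sketched in plain SQL as follows (table and column names are hypothetical):

```sql
-- Hypothetical tables; close out the current version of changed rows...
UPDATE dim_customer
SET end_date = CURRENT_DATE,
    is_current = FALSE
FROM stg_customer s
WHERE dim_customer.customer_id = s.customer_id
  AND dim_customer.is_current
  AND dim_customer.address <> s.address;

-- ...then insert a new current version for changed and brand-new keys.
INSERT INTO dim_customer (customer_id, address, start_date, end_date, is_current)
SELECT s.customer_id, s.address, CURRENT_DATE, NULL, TRUE
FROM stg_customer s
LEFT JOIN dim_customer d
  ON d.customer_id = s.customer_id
 AND d.is_current
WHERE d.customer_id IS NULL;
```

Because the UPDATE runs first, changed keys no longer have a current row, so the anti-join in the INSERT picks up both changed and brand-new keys while leaving unchanged rows alone.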

Understood change management and code migration processes in Informatica.

Worked closely with MDM teams to understand their needs in terms of data for their landing tables.

Prepared various mappings to load the data into different stages like Landing, Staging and Target tables.

Used various transformations like Source Qualifier, Expression, Aggregator, Joiner, Filter, Lookup and Update Strategy while designing and optimizing mappings.

Developed Workflows using task developer, worklet designer, and workflow designer in Workflow manager and monitored the results using workflow monitor.

Created various tasks like Session, Command, Timer and Event wait.

Modified several of the existing mappings based on the user requirements and maintained existing mappings, sessions and workflows.

Tuned the performance of mappings by following Informatica best practices and applied several methods to decrease workflow run times.

Prepared SQL Queries to validate the data in both source and target databases.

Worked on TOAD and Oracle SQL Developer to develop queries and create procedures and packages in Oracle.

Worked extensively on PL/SQL as part of the process to develop several scripts to handle different scenarios.
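A PL/SQL scenario handler of the kind described above might look like this minimal sketch (the staging table and error message are hypothetical, not the project's code):

```sql
-- Hypothetical example: abort the load with a clear error when nothing staged.
DECLARE
  v_count NUMBER;
BEGIN
  SELECT COUNT(*) INTO v_count
  FROM stg_orders
  WHERE load_date = TRUNC(SYSDATE);

  IF v_count = 0 THEN
    -- User-defined error codes live in the -20000..-20999 range.
    RAISE_APPLICATION_ERROR(-20001, 'No rows staged for today''s load');
  END IF;
END;
/
```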

Worked on Informatica Performance Tuning identifying and resolving performance bottlenecks in various levels like sources, targets, mappings and sessions.

Created Test cases for the mappings developed and then created integration Testing Document.

Prepared the error handling document to maintain the error handling process.

Automated the Informatica jobs using UNIX shell scripting.

Closely worked with the reporting team to ensure that correct data is presented in the reports.

Environment: Informatica Power Center, Oracle, SQL, SFTP.

Client: Voya Financial, New York, USA Mar 2018 -Dec 2019

Company: SLK Software, India

Role: ETL Developer

Responsibilities:

Involved in design, development and maintenance of database for Data warehouse project.

Involved in Business Users Meetings to understand their requirements.

Designed, developed and supported the Extraction, Transformation and Load (ETL) process for data migration with Informatica 7.x.

Developed various mappings using Mapping Designer and worked with Aggregator, Lookup, Filter, Router, Joiner, Source Qualifier, Expression, Stored Procedure, Sorter and Sequence Generator transformations.

Created complex mappings which involved Slowly Changing Dimensions, implementation of Business Logic and capturing the deleted records in the source systems.

Worked extensively with the connected lookup Transformations using dynamic cache.

Worked with complex mappings averaging 15 transformations.

Created and scheduled sessions and jobs to run on demand, on schedule, or only once.

Monitored Workflows and Sessions using Workflow Monitor.

Performed Unit testing, Integration testing and System testing of Informatica mappings

Coded PL/SQL scripts.

Wrote UNIX scripts, Perl scripts for the business needs.

Coded UNIX scripts to capture data from different relational systems into flat files for use as source files in the ETL process.

Created Universes and generated reports using a Star Schema.

Environment: Informatica Power Center, Oracle, Mainframe, SFTP, SQL, PL/SQL, UNIX

Client: Scotia Bank, Canada

Company: Diyotta India Pvt Ltd, India July 2013 – Feb 2018

Role: Informatica Developer

Responsibilities:

Imported various Sources, Targets, and Transformations using Informatica Power Center Server Manager, Repository Manager and Designer.

Created and managed the global and local repositories and permissions using Repository Manager in Oracle Database.

Responsibilities included source system analysis, data transformation, loading, validation for data marts, operational data store and data warehouse.

Used heterogeneous sources from Oracle, flat files and SQL Server, and imported stored procedures from Oracle for transformations.

Designed and coded mappings that extracted data from existing source systems into the data warehouse.

Used Dimensional Modeling Techniques to create Dimensions, Cubes and Fact tables.

Wrote PL/SQL procedures for processing business logic in the database; tuned SQL queries for better performance.

Scheduled Sessions and Batch Process based on demand, run on time, run only once using Informatica Server Manager.

Generated completion messages and status reports using Informatica Server manager.

Tuned ETL procedures and STAR schemas to optimize load and query Performance.

Started sessions and batches and set up event-based scheduling.

Managed migration in a multi-vendor supported Server and Database environments.

Environment: Informatica Power center, Oracle, Teradata, SFTP, Windows XP


