Personal E-Mail : *************.****@*****.***
Company E-mail: ***********.****@*******.***
Mobile No: +91-950*******
Certifications:
1) Informatica Cloud Data Warehouse & Cloud Data Lake
2) Informatica Cloud Data Quality & Data Governance
3) Informatica Cloud Data Catalog & Data Integration & Mass Ingestion
Professional Summary
Total Experience: 9.7 Years
9.1 Years: As an ETL developer in data warehousing using
a) Informatica Power Center
b) Informatica Power Exchange
c) Informatica Intelligent Cloud Services (IICS) / Informatica Intelligent Data Management Cloud (IDMC)
d) Ascend.io
6 Months: Production support experience with Informatica Power Center.
Skills Profile
Technical
Core Skills
Data warehousing:
a) Informatica Power Center
b) Informatica Power Exchange
c) Informatica Intelligent Cloud Services (IICS) / Informatica Intelligent Data Management Cloud (IDMC)
d) Ascend.io
Operating Systems: UNIX/Windows
Environment: Client/Server
Databases: Oracle with SQL and PL/SQL, Teradata, Hadoop with Hive
Other: Control-M; Toad for Oracle, for browsing schemas and running queries
Functional
Management and Leadership Skills
Ability to drive new product development.
Training & Development
Trained freshers in technologies such as Informatica and Teradata.
Education
1) M.Tech (Master of Technology) in Communication Engineering from Vellore Institute of Technology, Vellore, in 2004, with an aggregate CGPA of 7.2.
2) B.Tech (Bachelor of Technology) in Electrical & Electronics Engineering from Jawaharlal Nehru Technological University, Hyderabad, in 2002.
Professional Experience
Project 1: Company : https://www.hcltech.com/
Project : Data Integration
Client : Zimvie
Role : Data Integration Lead
Duration : March 2023 – Present
Team Size : 10
Description :
ZimVie is a global life sciences leader in the dental and spine markets that develops, manufactures, and delivers a comprehensive portfolio of products and solutions designed to treat a wide range of spine pathologies and to support dental tooth replacement and restoration.
Responsibilities:
Using the Ascend.io tool, extracted data from Google Cloud Storage (GCS) and loaded it into the data lake (raw layer).
Using the IICS tool, extracted all workflow names, mapping names, parameters, and complete source qualifier queries via the Assurance Package, and worked on data lineage activities.
Designed ETL solutions using the Informatica Power Center tool based on an understanding of client requirements.
Created the business requirements document (BRD) and worked closely with the team to identify and validate data requirements from various source systems.
Mapped the source data to the target data structures designed for the data warehouse, using a standard staging area.
Identified and implemented the transformations and conversions required for consistency and usability of the data.
Created Informatica ETL mappings using transformations such as Source Qualifier, Lookup, Union, Joiner, Aggregator, Expression, Router, and Update Strategy.
Created workflows, sessions, and command tasks.
Created source-to-stage and stage-to-dimension batch jobs.
Created shell scripts and file-validation scripts in UNIX (a minimal validation sketch follows this list).
Prepared unit test cases and validated the data.
Worked closely with the team to gather requirements.
Worked closely with the UAT team to monitor whether the business rules met SLA requirements.
Using ServiceNow, created incident/problem tickets and provided resolutions to clients.
Populated target tables using Basic Teradata Query (BTEQ) and the Teradata MultiLoad and FastLoad external loaders.
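The file-validation scripts referenced above were UNIX shell scripts; the Python sketch below illustrates the same kind of pre-load checks (file exists, is non-empty, has the expected column count). The file name and column count are hypothetical examples, not values from the project.

```python
# Minimal sketch of the pre-load file checks described above, assuming a
# hypothetical feed file and layout (the real checks were UNIX shell scripts).
import csv
import os
import sys

FEED_FILE = "customer_feed.csv"   # hypothetical file name
EXPECTED_COLUMNS = 12             # hypothetical column count

def validate_feed(path: str, expected_cols: int) -> None:
    """Fail fast if the file is missing, empty, or has a bad layout."""
    if not os.path.isfile(path):
        sys.exit(f"ERROR: {path} not found")
    if os.path.getsize(path) == 0:
        sys.exit(f"ERROR: {path} is empty")
    with open(path, newline="") as f:
        for line_no, row in enumerate(csv.reader(f), start=1):
            if len(row) != expected_cols:
                sys.exit(f"ERROR: line {line_no} has {len(row)} columns, "
                         f"expected {expected_cols}")
    print(f"OK: {path} passed validation")

if __name__ == "__main__":
    validate_feed(FEED_FILE, EXPECTED_COLUMNS)
```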
Environment: IICS/IDMC, Informatica Power Center 10, UNIX with PuTTY, Toad for Oracle, and Teradata.
Project 2: Company : https://www.mphasis.com/
Project : PBNS
Client : Schwab Bank
Role : Lead
Duration : March 2021 – March 2023
Team Size : 10
Description :
In 2020, Charles Schwab and TD Ameritrade became one combined company dedicated to serving investors in every phase of their financial journey, and Schwab announced plans to integrate the award-winning thinkorswim and thinkpipes trading platforms, education, and tools into its trader offerings for retail and independent advisor clients.
Responsibilities:
Using the Ascend.io tool, extracted data from Google Cloud Storage (GCS) and loaded it into the data lake (raw layer).
Using the IICS tool, extracted all workflow names, mapping names, parameters, and complete source qualifier queries via the Assurance Package, and worked on data lineage activities.
Designed ETL solutions using the Informatica Power Center tool based on an understanding of client requirements.
Created the business requirements document (BRD) and worked closely with the team to identify and validate data requirements from various source systems.
Mapped the source data to the target data structures designed for the data warehouse, using a standard staging area.
Identified and implemented the transformations and conversions required for consistency and usability of the data.
Created Informatica ETL mappings using transformations such as Source Qualifier, Lookup, Union, Joiner, Aggregator, Expression, Router, and Update Strategy.
Created workflows, sessions, and command tasks.
Created source-to-stage and stage-to-dimension batch jobs (an ordering sketch follows this list).
Created shell scripts and file-validation scripts in UNIX.
Prepared unit test cases and validated the data.
Worked closely with the team to gather requirements.
Worked closely with the UAT team to monitor whether the business rules met SLA requirements.
Using ServiceNow, created incident/problem tickets and provided resolutions to clients.
Populated target tables using Basic Teradata Query (BTEQ) and the Teradata MultiLoad and FastLoad external loaders.
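As a rough illustration of the batch ordering mentioned above, every source-to-stage job completes before any stage-to-dimension job starts, because dimension loads read from the populated staging tables. The job names below are hypothetical; in the project these were Informatica workflows, not Python functions.

```python
# Illustrative sketch of the source-to-stage / stage-to-dimension ordering
# described above. Job names are hypothetical stand-ins for workflows.
SOURCE_TO_STAGE = ["stg_load_accounts", "stg_load_transactions"]
STAGE_TO_DIMENSION = ["dim_load_account", "dim_load_date"]

def run_job(name: str) -> None:
    print(f"running {name} ...")  # stand-in for launching a workflow

def run_batch() -> None:
    # Dimensions read from populated staging tables, hence the strict ordering.
    for job in SOURCE_TO_STAGE + STAGE_TO_DIMENSION:
        run_job(job)

if __name__ == "__main__":
    run_batch()
```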
Environment: IICS/IDMC, Informatica Power Center 10, UNIX with PuTTY, Toad for Oracle, and Teradata.
Project 3: Company : https://www.capgemini.com/
Client : MUFG Bank
Role : Senior Consultant
Duration : January 2019 – February 2020
Team Size : 10
Description :
MUFG Bank, Ltd. is the largest bank in Japan. It was established on January 1, 2006, following the merger of the Bank of Tokyo-Mitsubishi, Ltd. and UFJ Bank Ltd. The bank serves as the core retail, corporate, and investment banking arm of the Mitsubishi UFJ Financial Group.
Responsibilities:
Created source-to-stage and stage-to-dimension batch jobs.
Worked with the team to identify and validate data requirements from various source systems.
Mapped the source data to the target data structures designed for the data warehouse, using a standard staging area.
Identified and implemented the transformations and conversions required for consistency and usability of the data.
Created Informatica ETL mappings using transformations such as Source Qualifier, Lookup, Union, Joiner, Aggregator, Expression, Router, and Update Strategy.
Created workflows, sessions, and command tasks.
Resolved tickets with the help of the ITSO team.
Created shell scripts and file-validation scripts in UNIX.
Prepared unit test cases and validated the data.
Using Control-M, created and scheduled all jobs according to their dependencies (a dependency-ordering sketch follows this list).
Worked closely with the team to gather requirements.
Populated target tables using Basic Teradata Query (BTEQ) and the Teradata MultiLoad and FastLoad external loaders.
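A Control-M schedule like the one mentioned above runs each job only after its predecessors finish. The sketch below shows that dependency ordering using Python's standard-library topological sort; the job names and dependencies are hypothetical, and Control-M itself is configured through its own job definitions, not through Python.

```python
# Sketch of the dependency ordering behind a Control-M style schedule:
# each job runs only after its predecessors complete. Job names and
# dependencies below are hypothetical examples.
from graphlib import TopologicalSorter  # Python 3.9+

DEPENDENCIES = {          # job -> set of jobs it depends on
    "stg_load": set(),
    "dim_load": {"stg_load"},
    "fact_load": {"stg_load", "dim_load"},
    "report_refresh": {"fact_load"},
}

def execution_order() -> list:
    """Return one valid run order that respects every dependency."""
    return list(TopologicalSorter(DEPENDENCIES).static_order())

if __name__ == "__main__":
    print(execution_order())
    # e.g. ['stg_load', 'dim_load', 'fact_load', 'report_refresh']
```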
Environment: Informatica Power Center 9.6, UNIX with PuTTY, Toad for Oracle, and Teradata.
Project 4: Company : https://www.pentaconsulting.com/middle-east/ (Riyadh)
Client : Alrajhi Bank
Role : Senior Consultant
Duration : August 2017 – November 2018
Team Size : 15
Description :
Al Rajhi Bank is a major player in Saudi Arabia's banking business and one of the largest joint stock companies in the Kingdom of Saudi Arabia, with over SAR 330.5 billion ($88 billion) in AUM and more than 600 branches. Its head office is located in Riyadh, with six regional offices. Al Rajhi Bank also has branches in Kuwait and Jordan, and a subsidiary in Malaysia.
Responsibilities:
Worked closely with the team to identify and validate data requirements from various source systems.
Mapped the source data to the target data structures designed for the data warehouse, using a standard staging area.
Identified and implemented the transformations and conversions required for consistency and usability of the data.
Created Informatica ETL mappings using transformations such as Source Qualifier, Lookup, Union, Joiner, Aggregator, Expression, Router, and Update Strategy.
Created workflows, sessions, and command tasks.
Created source-to-stage and stage-to-dimension batch jobs.
Created shell scripts and file-validation scripts in UNIX.
Prepared unit test cases and validated the data.
Using Control-M, created and scheduled all jobs according to their dependencies.
Using Informatica Power Exchange, imported mainframe files and loaded the data per client requirements (a fixed-width record-parsing sketch follows this list).
Worked closely with the team to gather requirements.
Resolved tickets with the help of the ITSO team.
Populated target tables using Basic Teradata Query (BTEQ) and the Teradata MultiLoad and FastLoad external loaders.
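Mainframe extracts imported via Power Exchange are typically fixed-width records described by a copybook. The Python sketch below parses one such record by position; the field layout and sample record are hypothetical, not the project's actual copybook.

```python
# Parses a fixed-width mainframe-style record by position. The layout and
# sample record below are hypothetical examples, not a real copybook.
FIELDS = [                 # (name, start, length) -- assumed layout
    ("account_no", 0, 10),
    ("branch", 10, 4),
    ("balance", 14, 12),
]

def parse_record(line: str) -> dict:
    """Slice a fixed-width record into named, trimmed fields."""
    return {name: line[start:start + length].strip()
            for name, start, length in FIELDS}

if __name__ == "__main__":
    sample = "0001234567RYD1000000012345"
    print(parse_record(sample))
    # {'account_no': '0001234567', 'branch': 'RYD1', 'balance': '000000012345'}
```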
Environment: Informatica Power Center 9.6, UNIX with PuTTY, Toad for Oracle, and Teradata.
Project 5: Company : https://www.ibm.com/in-en
Client : Barclays Bank (IBM)
Role : Data Specialist
Duration : June 2016 – July 2017
Team Size : 34
Description :
Barclays Bank has more than 325 years of history and expertise in banking. Barclays Group operates in 50 countries across Europe, Asia, Africa, the Middle East and America, and employs more than 130,000 people worldwide. Barclays has been active in Asia and the Pacific region since 1968. The bank announced in May 2017 that it would sell 1.5 billion pounds worth of shares of its Barclays Africa Group subsidiary as part of its strategy to refocus its business from Africa to the UK and US.
Responsibilities:
Created Informatica ETL mappings using transformations such as Source Qualifier, Joiner, Aggregator, Lookup, Expression, Router, Update Strategy, Filter, Union, Sequence Generator, Stored Procedure, SQL, Java, XML Generator, and Transaction Control (an insert-vs-update routing sketch follows this list).
As a lead, created ETL solutions using the Informatica tool based on an understanding of client requirements.
Worked closely with the onsite team to identify and validate the data required from various source systems.
Mapped the source data to the target data structures designed for the data warehouse, using a standard staging area.
Identified and implemented the transformations and conversions required for consistency and usability of the data, using SQL and PL/SQL.
Created workflows, sessions, and command tasks.
Created source-to-stage, stage-to-dimension, and fact batch jobs.
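The Update Strategy transformation listed above routes each incoming row to an insert or an update depending on whether its business key already exists in the dimension. A minimal Python sketch of that routing decision follows; the keys and rows are hypothetical.

```python
# Sketch of the insert-vs-update decision an Update Strategy transformation
# makes when loading a dimension: rows whose business key already exists
# become updates, new keys become inserts. Keys and rows are hypothetical.
existing_keys = {"CUST001", "CUST002"}  # keys already in the dimension

def route_row(row: dict) -> str:
    """Return the DML action for one incoming row, keyed on customer_id."""
    return "UPDATE" if row["customer_id"] in existing_keys else "INSERT"

if __name__ == "__main__":
    incoming = [
        {"customer_id": "CUST002", "name": "Acme"},
        {"customer_id": "CUST003", "name": "Globex"},
    ]
    for row in incoming:
        print(row["customer_id"], "->", route_row(row))
    # CUST002 -> UPDATE
    # CUST003 -> INSERT
```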
Environment: Informatica Power Center 9, UNIX with PuTTY, Toad for Oracle, and Teradata.
Project 6: Company : https://www.capgemini.com/
Client : CNA
Role : Associate Consultant
Duration : January 2014 – June 2016
Team Size : 3
Description :
Serving businesses and professionals since 1897, CNA is the country's eighth largest commercial insurance writer and the 14th largest property and casualty company. CNA's insurance products include standard commercial lines, specialty lines, surety, marine, and other property and casualty coverages. CNA's services include risk management, information services, underwriting, risk control, and claims administration. "CNA" is a service mark registered by CNA Financial Corporation with the United States Patent and Trademark Office. Certain CNA Financial Corporation subsidiaries use the "CNA" service mark in connection with insurance underwriting and claims activities.
Responsibilities:
Identified and implemented the transformations and conversions required for consistency and stability of the data.
Worked closely with the onsite team to identify and validate data requirements from various source systems.
Mapped the source data to the target data structures designed for the data warehouse, using a standard staging area.
Created Informatica ETL mappings using transformations such as Source Qualifier, Lookup, Union, Joiner, Aggregator, Expression, Router, and Update Strategy.
Created workflows, sessions, and command tasks.
Created source-to-stage and stage-to-dimension batch jobs.
Created shell scripts and file-validation scripts in UNIX.
Prepared unit test cases and validated the data.
Using Control-M, created and scheduled all jobs according to their dependencies.
Using Informatica Power Exchange, imported mainframe files and loaded the data per client requirements.
Worked closely with the team to gather requirements.
Resolved tickets with the help of the ITSO team.
Populated target tables using Basic Teradata Query (BTEQ) and the Teradata MultiLoad and FastLoad external loaders.
Environment: Informatica Power Center 9.6, UNIX with PuTTY, Toad for Oracle, and Teradata.
Worked as a Trainer at Infitech Global from August 2004 to December 2013.
LinkedIn: www.linkedin.com/in/syed-zaheer-basha-55522b156