
RAKESH CHADA

Phone: 859-***-**** Email: ad0xfu@r.postjobfree.com

PROFESSIONAL SUMMARY:

7+ years of professional experience in the IT industry, with wide-ranging, progressive experience in design, analysis, development, documentation, coding, and implementation, including but not limited to databases, reporting, data warehousing, ETL design, and BI applications across a wide range of industry verticals.

Good working experience with Software Development Life Cycle (SDLC) methodologies such as Waterfall and Agile.

Experienced in using ETL tools including Informatica Intelligent Cloud Service (IICS), PowerCenter 10.x/9.x/8.x, PowerExchange, and Cloud Data Quality.

Used various Informatica Cloud Service, PowerCenter, and Data Quality transformations such as Aggregator, Source Qualifier, Update Strategy, Expression, Joiner, Lookup, Router, Sorter, Filter, XML Parser, Labeler, Parser, Address Validator, Match, Merge, Comparison, and Standardizer to perform data loading and cleansing activities.

Experience using transformations and creating Informatica mappings, Mapplets, sessions, Worklets, workflows, and processing tasks using Informatica Designer/Workflow Manager.

Extensively used Application Integration (CAI) for REST API and SOAP API integrations.

Designed, developed, and implemented ETL processes using Data Integration (CDI).

Extensively worked on Application Integration (CAI) and Data Integration (CDI) in IICS.

Developed PowerShell scripts that support the smooth flow of files through processes in Cloud Application Integration (CAI) and Cloud Data Integration (CDI).

Extensively used IICS Schedules, Autosys and Tidal for scheduling the UNIX shell scripts and Informatica workflows.

Made use of Post-Session success and Post-Session failure commands in the Session task to execute scripts needed for cleanup and update purposes.

Worked on the REST V2 Connector and Swagger files.

Extensive experience with delimited and fixed-width flat files.

Experience with relational and dimensional models using Facts and Dimensions tables.

Experience in Dimensional Modeling using Star schema and Snowflake schema.

Experienced in using advanced concepts of Informatica like Push Down Optimization (PDO), constraint-based load ordering and target load plan.

Designed and developed Informatica mappings including Type-I, Type-II and Type-III slowly changing dimensions (SCD).
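
The following is a minimal sketch of the Type-II pattern described above, written in Oracle-style SQL; the table, column, and sequence names (STG_CUSTOMER, DIM_CUSTOMER, DIM_CUSTOMER_SEQ, CURRENT_FLAG, EFF_START_DT, EFF_END_DT) are illustrative assumptions, not taken from any specific project.

-- Illustrative Type-II SCD load; all object names are hypothetical.
-- 1) Expire the current dimension row when a tracked attribute has changed.
UPDATE dim_customer d
   SET d.current_flag = 'N',
       d.eff_end_dt   = TRUNC(SYSDATE) - 1
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_customer s
                WHERE s.customer_id = d.customer_id
                  AND (s.address <> d.address OR s.status <> d.status));

-- 2) Insert a new current version for changed and brand-new customers.
INSERT INTO dim_customer
       (customer_key, customer_id, address, status,
        eff_start_dt, eff_end_dt, current_flag)
SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address, s.status,
       TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
  FROM stg_customer s
 WHERE NOT EXISTS (SELECT 1
                     FROM dim_customer d
                    WHERE d.customer_id = s.customer_id
                      AND d.current_flag = 'Y');

In Informatica, this logic typically lives in Lookup/Update Strategy mappings rather than hand-written SQL; the statements above only illustrate the end result of a Type-II load.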

Extensive experience in Salesforce, Salesforce Marketing Cloud (SFMC), Oracle (Eloqua), Netezza, SQL, SQL Server database.

Experienced in SQL and PL/SQL programming, including stored procedures, functions, triggers, views, and materialized views.

Very good understanding of 'Versioning' and 'Deployment Groups' concepts in Informatica. Worked extensively with versioned objects and deployment groups.

Excellent communication and presentation skills; works well as an integral part of a team as well as independently; intellectually flexible and adaptive to change.

TECHNICAL SKILLS:

Data Warehousing/ETL: Informatica Intelligent Cloud Service (IICS), Informatica PowerCenter 10.x/9.x/8.x, Informatica Data Quality, Power Exchange.

Data Modeling: Star Schema Modeling, Snowflake Modeling, Fact and Dimension Tables, Physical and Logical Data Modeling, Data Marts, OLAP, OLTP, Erwin, and Oracle Designer.

Databases & Tools: Oracle, Teradata, DB2, SQL Server, Netezza 7.1, Snowflake, Salesforce, Salesforce Marketing Cloud (SFMC).

Scheduling Tools: Autosys, Tidal and Control-M.

Reporting Tools: Sisense, Power BI.

Programming Languages: UNIX Shell Scripting, Windows scripting, SQL and PL/SQL, C#.

Cloud Platforms: Informatica Intelligent Cloud Service (IICS), AWS, Azure, GCP BigQuery.

Methodology: Agile and Waterfall.

Environment: Windows, UNIX and LINUX.

PROFESSIONAL EXPERIENCE:

Client: American Equity Jun 2021-Till Date

Role: Informatica Cloud Developer (IICS)

Responsibilities:

Created IICS jobs to extract data from flat files and databases, transform the data per business logic, and load the data into target systems such as flat files, Salesforce, Salesforce Marketing Cloud (SFMC), and Oracle (Eloqua).

Developed the mappings using transformations in Informatica according to technical specifications.

Created complex mappings that involved implementation of Business Logic to load data into staging area.

Designed and developed a custom data warehouse to support cross-platform analytics.

Implemented middleware IICS applications for cloud data services.

Designed, developed, and implemented ETL processes using Data Integration (CDI).

Developed complex Informatica Cloud (CDI) parallel taskflows with multiple mapping tasks and nested taskflows.

Designed and developed process jobs in Informatica Cloud Application Integration (CAI) to extract XMLs from an AWS S3 bucket.

Integrated Cloud Data Integration (CDI) taskflows into Application Integration (CAI) process jobs to invoke API calls against the AWS S3 bucket, retrieve real-time XMLs, and parse them into fixed-width files.

Exposed Informatica APIs for real-time integrations and designed service connectors, process objects, and IICS CAI processes.

Developed Application integration services using REST, SOAP, and WSDL.

Created multiple taskflows for loading data from different sources into Dynamics 365 CRM using the Microsoft Dynamics 365 connector with Data Synchronization tasks and Mapping tasks, using the Bulk API and Standard API as required.

Created BPEL processes for database integration; Swagger files were used as part of API integration.

Created processes, sub-processes, reusable processes, reusable service connectors, and reusable app connections to connect to DB2 web services, read files from local and SFTP folders using file parsers, and write to targets.

Created and used reusable transformations to load data from the operational data store (ODS) to the data warehouse, and was involved in capacity planning and data storage.

Worked on APIs/web services using the Application Integration (CAI) REST endpoint connector.

Performed loads into a Snowflake instance using the Snowflake connector in IICS for a separate project to support data analytics and insight use cases for the sales team.

Created file mass ingestion tasks for moving data files between FTP, SFTP, and local folders, and database mass ingestion tasks to load data from on-premises databases into Snowflake.
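
As a rough illustration only (not the code the IICS connector or mass ingestion task actually generates), the kind of bulk load these Snowflake integrations amount to can be expressed in Snowflake SQL; the stage, schema, and table names below are made up for the example.

-- Illustrative Snowflake bulk load; stage and table names are hypothetical.
COPY INTO analytics.sales_orders
  FROM @analytics.etl_stage/orders/
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1 FIELD_OPTIONALLY_ENCLOSED_BY = '"')
  ON_ERROR = 'CONTINUE';  -- skip bad rows instead of failing the whole load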

Implemented exception handling mappings in IICS to remove error and invalid records.

Created various PL/SQL stored procedures, functions, views, cursors, and indexes on target tables.
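
For instance, a minimal PL/SQL sketch of the kind of housekeeping procedure this involved; the table and column names are hypothetical.

-- Illustrative cleanup procedure; object names are hypothetical.
CREATE OR REPLACE PROCEDURE purge_error_records (p_batch_id IN NUMBER) AS
BEGIN
  -- Remove records flagged as invalid during the load for a given batch.
  DELETE FROM stg_orders
   WHERE batch_id   = p_batch_id
     AND error_flag = 'Y';
  COMMIT;
END purge_error_records;
/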

Extracted data from various sources such as Oracle, flat files, and XML.

Created various UNIX Scripts for pre/post session commands for automation of IICS jobs.

Implemented slowly changing dimension methodology for accessing the full history of accounts.

Implemented performance tuning logic on targets, sources, mappings, sessions to provide maximum efficiency and performance.

Environment: Informatica Intelligent Cloud Service (IICS), Informatica PowerCenter 10.x, AWS S3 Bucket, Salesforce, Salesforce Marketing Cloud, Oracle (Eloqua), SQL Server 2016, PL/SQL, Agile, SQL, Erwin 4.5, Business Objects, Windows script, Flat files (Fixed width, Delimited).

Client: BlueShield of California, San Francisco, CA Jul 2019 – Jun 2021

Role: ETL/ Informatica Developer

Responsibilities:

Experience integrating data to/from on-premises databases and cloud-based database solutions using Informatica Intelligent Cloud Services (IICS).

Designed and developed process jobs in Informatica Cloud Application Integration to extract data from SQL Server.

Exposed Informatica APIs for real-time integrations and designed service connectors, process objects, and IICS processes.

Developed new mapping designs using various tools in Informatica like Source Analyzer, Warehouse Designer, Mapplet Designer and Mapping Designer.

Developed the mappings using transformations in Informatica according to technical specifications.

Created, optimized, reviewed, and executed Teradata SQL test queries to validate transformation rules used in source to target mappings/source views, and to verify data in target tables.
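
A simplified example of this kind of source-to-target validation in Teradata SQL; the database, view, and table names are illustrative only.

-- Keys present in the source view but missing from the target table.
SELECT src.customer_id
  FROM stg_db.v_customer_src src
EXCEPT
SELECT tgt.customer_id
  FROM edw_db.customer_dim tgt;

-- Quick row-count reconciliation between source and target.
SELECT s.src_cnt, t.tgt_cnt
  FROM (SELECT COUNT(*) AS src_cnt FROM stg_db.v_customer_src) s
 CROSS JOIN
       (SELECT COUNT(*) AS tgt_cnt FROM edw_db.customer_dim) t;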

Performed data manipulations using various Informatica Transformations like Filter, Expression, Lookup (Connected and Un-Connected), Aggregate, Update Strategy, Normalizer, Joiner, Router, Sorter and Union.

Created mappings and Mapplets according to business requirements using the Informatica big data edition, deployed them as applications, and exported them to PowerCenter for scheduling.

Designed, developed, and implemented detailed ETL test plans and procedures.

Designed Audit, Balance, and Control (ABC) processes for ETL.

Experienced with workflow performance tuning using pushdown optimization on both the source and target sides.

Provided estimates for ETL deliverables and oversaw progress to ensure quality ETL deliverables.

Designed and developed Informatica mappings and sessions based on business user requirements and business rules to load data from source to target tables.

Worked on Informatica pushdown optimization, as well as mapping-, session-, and table-level optimization.

Created various UNIX Shell Scripts for pre/post session commands for automation of loads using Tidal.

Implemented slowly changing dimension methodology for accessing the full history of accounts.

Scheduled Informatica jobs and implemented dependencies where necessary using Autosys.

Environment: Informatica Cloud (IICS), Informatica Power Center 10.2.0, Oracle Xe, PL/SQL, Teradata, Flat files, Erwin, UNIX, Oracle SQL developer, Tidal.

Client: American Express, Phoenix, AZ June 2018 – May 2019

Role: ETL/ Informatica Developer

Responsibilities:

Designed various mappings and Mapplets using different Transformations Techniques such as Key Generator, Match, Labeler, Case Converter, Standardizer and Address Validator.

Developed new mapping designs using various tools in Informatica like Source Analyzer, Warehouse Designer, Mapplet Designer and Mapping Designer.

Used Facets application to open, add generations, enter and save information.

Extensively involved in Scheduling of Informatica workflows using Tidal Scheduler.

Developed modules to extract, process & transfer the customer data using Teradata utilities.

Developed the mappings using transformations in Informatica according to technical specifications.

Created complex mappings that involved implementation of business logic to load data into the staging area.

Wrote Python scripts to parse XML documents and load the data into the database.

Worked on optimizing and tuning Teradata views and SQL to improve batch performance and data response times for users.
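
An illustrative slice of that tuning workflow in Teradata SQL (table and column names are hypothetical): inspect the optimizer's plan for an expensive query, then refresh statistics on the join column it relies on.

-- Inspect the optimizer plan for a slow reporting query (hypothetical tables).
EXPLAIN
SELECT m.member_id, SUM(c.paid_amt) AS total_paid
  FROM claims c
  JOIN members m
    ON m.member_id = c.member_id
 GROUP BY m.member_id;

-- Refresh statistics so the optimizer can choose a better join plan.
COLLECT STATISTICS ON claims COLUMN member_id;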

Involved in database migrations from legacy systems and SQL Server to Oracle and Netezza.

Involved in reviewing business requirements and analyzing data sources from Excel, Oracle, and SQL Server for the design, development, testing, and production rollout of reporting and analysis projects within Tableau Desktop.

Performed data manipulations using various Informatica Transformations like Filter, Expression, Lookup (Connected and Un-Connected), Aggregate, Update Strategy, Normalizer, Joiner, Router, Sorter and Union.

Created mappings and Mapplets according to business requirements using the Informatica big data edition, deployed them as applications, and exported them to PowerCenter for scheduling.

Created various UNIX Shell Scripts for pre/post session commands for automation of loads using Tidal.

Implemented slowly changing dimension methodology for accessing the full history of accounts.

Environment: Informatica Power Center 10.1, Oracle, Teradata V2R5, TOAD for Oracle, SQL Server 2016, PL/SQL, DB2, Netezza 7.1, Agile, Ruby, Python, SQL, Erwin 4.5, Business Objects, Unix Shell Scripting (PERL), Windows, Tidal and Autosys.

Client: Central California Alliance for Health (CCAH), Scotts Valley, CA June 2017 –May 2018

Role: ETL/ Informatica Developer

Responsibilities:

Interacted with business analysts to understand the business requirements and implement them in a functional data warehouse design.

Responsible for Developing Informatica development life cycle process documents.

Created various PL/SQL stored procedures, functions, views, cursors, and indexes on target tables.

Extracted data from various sources like Oracle, flat files and XML.

Developed mappings per the technical specifications approved by the client.

Developed mappings using Informatica PowerCenter Designer to transform and load the data from source systems to target database.

Created Mapplets to reduce development time and mapping complexity and to improve maintainability; worked with different sources such as Oracle and flat files.

Implemented performance tuning logic on targets, sources, mappings, sessions to provide maximum efficiency and performance.

Involved in migration projects to move data from data warehouses on Oracle and DB2 to Teradata.

Involved in enhancements and maintenance activities of the data warehouse including performance tuning.

Used mapping variables to enable flexible workflow runs based on changing values.

Involved in debugging mappings by creating breakpoints to gather troubleshooting information about data and error conditions.

Responsible for testing and validating the Informatica mappings against the pre-defined ETL design standards.

Created sessions and workflows to run with the logic embedded in the mappings using PowerCenter Designer.

Environment: Informatica Power Center 10.1.1, Oracle 11g, PL/SQL, Teradata, Flat files, SQL Server 2016, Erwin, UNIX, Toad 9.0, Oracle SQL developer, Tableau 10.2, Tidal.

Client: Conduent, Ridgeland, MS Jun 2016 – May 2017

Role: ETL/ Informatica Developer

Responsibilities:

Gathered user Requirements and designed Source to Target data load specifications based on Business rules.

Used Informatica Power Center 10.1.1 for extraction, loading and transformation (ETL) of data in the datamart.

Designed and developed ETL Mappings to extract data from flat files, MS Excel and Oracle to load the data into the target database.

Developed several complex mappings in Informatica using a variety of PowerCenter transformations, mapping parameters, mapping variables, Mapplets, and parameter files in Mapping Designer.

Created jobs to automate Informatica workflows and to copy and move files via DOS commands using the Tidal Scheduler.

Created complex mappings to load the data mart and monitored them; the mappings involved extensive transformation logic for inserting and updating records during loads.

Extensively used ETL processes to load data from various source systems such as DB2, SQL Server, flat files, and XML files into the target Teradata system, applying business logic with Aggregator, Filter, Router, Expression, Joiner, Union, Normalizer, and Sequence Generator transformations.

Expertise in conversions from SQL Server to Teradata.

Performed operational support and maintenance of ETL bug fixes and defects.

Worked with Teradata utilities such as MultiLoad (MLOAD), FastLoad (FLOAD), and FastExport to load or extract tables.

Supported migration of ETL code from development to QA and QA to production environments.

Created various UNIX Shell Scripts for pre/post session commands for automation of loads using Tidal.

Designed and developed UNIX shell scripts for FTP, sending files to the source directory, and managing session files.

Environment: Informatica Power Center 10.1.1, Oracle 11g, PL/SQL, Teradata, Flat files, SQL Server 2016, Erwin, UNIX, Toad 9.0, Oracle SQL developer, Tableau 10.2, Tidal.

Client: Birla Sun Life Insurance, India Jul 2015 – May 2016

Role: ETL Informatica Developer

Responsibilities:

Worked in all phases of SDLC from requirement gathering, design, development, testing, training and rollout to the field user and support for production environment.

Worked in an environment that followed the Agile methodology.

Prepared the required application design documents based on the required functionality.

Designed the ETL processes using Informatica to load data from Oracle, Flat Files (Fixed Width), and Excel files to staging database and from staging to the target Oracle Data Warehouse database.

Created mappings using transformations like Source Qualifier, Joiner, Aggregator, Expression, Filter, Router, Lookup, Update Strategy, and Sequence Generator.

Involved in performance tuning and optimization of Informatica mappings and sessions using features like partitions and data/index cache to manage very large volume of data.

Performed performance tuning of Informatica sessions for large data files by increasing the buffer block size, data cache size, and sequence buffer length.

Created stored procedures to transform the data and worked extensively in PL/SQL to meet various transformation needs while loading the data.

Performed unit testing and user acceptance testing to verify that data extracted from different source systems was loaded into the target according to user requirements.

Involved in production support, working on various mitigation tickets created while users were retrieving data from the database.

Environment: Informatica Power Center 9.0.1, Oracle 9i, TOAD, UNIX Shell Scripting, OBIEE, Oracle DAC

Education:

Bachelor’s in Electronics and Communication Engineering

JNTU, Hyderabad, India, GPA: 3.3.

Master’s in Information Technology

Campbellsville University, KY, GPA: 3.7.


