
Data Integration Warehousing

Location: Plano, TX
Salary: $55 per hour
Posted: January 11, 2024


SIRISHA NALABOTHU

Mobile: 314-***-**** | Email: ad2oal@r.postjobfree.com

Summary:

•8+ years of strong experience in all phases of design, development, implementation, and support of data warehousing applications using Informatica PowerCenter 10.x/9.x/8.x, Informatica PowerExchange 10.x, IICS Data Integration (CDI), and IICS real-time Application Integration (CAI).

•Full life-cycle project delivery experience from product definition to implementation, including requirements and specification writing, design documentation, unit and system testing, optimization and performance analysis, and release into a production environment.

•Domain expertise in Retail (Crop Science), Healthcare, and Pharma systems.

•Experience with IICS Data Integration components such as mappings, transformations, tasks and taskflows, and Data Synchronization, Replication, and Dynamic Data tasks, integrating data to and from sources such as databases (SQL Server, Oracle, MySQL, HANA DB), flat files, Salesforce, and SAP using Informatica Intelligent Cloud Services.

•Experience with IICS Application Integration components such as processes, service connectors, process objects, app connectors, email configuration, and event-based processes.

•Worked with standard and application connectors such as Salesforce, Hive, SAP Table, flat file, database, email, REST V2, and Salesforce OAuth 2.0 connectors to integrate data from source systems into applications and from applications into databases.

•Experience invoking APIs and creating jobs to perform operations (GET, POST, PUT, and DELETE) through CDI and CAI, based on the authentication provided.

•Worked extensively on data extraction, integration, and loading from different data sources using Snowflake, Informatica, and Teradata.

•Experience using Azure Boards, storing Informatica code in a GitHub repository, and building and releasing pipelines to migrate code across all project phases (DEV, QA, and PROD).

•Experience implementing business rules by creating transformations (Expression, Aggregator, Lookup, Router, Rank, Update Strategy) and developing mapplets and mappings. Worked with data sources such as flat files and relational databases. Good knowledge of data warehousing concepts.

•Strong in using workflow tasks such as Session, Control, Command, Decision, Event Wait, and Email tasks, as well as pre-session, post-session, and pre/post commands.

•Extracted data from multiple operational sources and loaded the staging area, data warehouse, and data marts using SCD (Type 1/Type 2/Type 3) loads; a minimal SQL sketch follows this summary.

•Well acquainted with session partitioning and performance tuning of sources, targets, mappings, and sessions to overcome bottlenecks; troubleshooting and problem-solving with the Informatica Debugger.

•Excellent experience in understanding and following coding standards and business norms, and in handover and takeover of responsibilities.

•Experienced in data modeling (dimensional and relational): star schema modeling, snowflake schema modeling, and fact and dimension tables.

•Experienced in data integration using Informatica PowerCenter, PowerExchange, and Informatica Cloud (IICS) with Salesforce, RMQ, SAP S/4, relational databases (SQL Server, Oracle, MySQL, HANA DB), flat files, Service Central, and Jira.

•Extensively worked with Teradata utilities such as BTEQ, FastExport, FastLoad, and MultiLoad to export and load data to/from different source systems, including flat files.

•Experience writing shell and batch scripts using UNIX and Windows commands.
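
A minimal SQL sketch of the SCD Type 2 pattern referenced in the summary above, written in generic Teradata-style SQL; the table and column names (DIM_CUSTOMER, STG_CUSTOMER, CUST_ADDR, etc.) are illustrative, not taken from any project below.

-- Step 1: expire the current dimension row when a tracked attribute changed.
UPDATE DIM_CUSTOMER
SET    EFF_END_DT = CURRENT_DATE - 1,
       CURR_FLAG  = 'N'
WHERE  CURR_FLAG = 'Y'
AND    EXISTS (SELECT 1
               FROM   STG_CUSTOMER s
               WHERE  s.CUST_ID   = DIM_CUSTOMER.CUST_ID
               AND    s.CUST_ADDR <> DIM_CUSTOMER.CUST_ADDR);

-- Step 2: insert a current row for every staging customer without one
-- (new arrivals plus the rows just expired in step 1).
INSERT INTO DIM_CUSTOMER (CUST_ID, CUST_ADDR, EFF_START_DT, EFF_END_DT, CURR_FLAG)
SELECT s.CUST_ID, s.CUST_ADDR, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   STG_CUSTOMER s
WHERE  NOT EXISTS (SELECT 1
                   FROM   DIM_CUSTOMER d
                   WHERE  d.CUST_ID   = s.CUST_ID
                   AND    d.CURR_FLAG = 'Y');

A Type 1 load would instead overwrite the changed attribute in place, and Type 3 would keep the prior value in a dedicated previous-value column.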

Technical Expertise:

Data Warehousing/ETL: Informatica PowerCenter 9.x/10.x, Informatica PowerExchange, IICS (Data Integration (CDI), Application Integration (CAI))

Databases: Teradata, Microsoft SQL Server, Oracle 9i/10g/11g, MySQL, Snowflake, Hive.

Programming Languages: SQL, Batch scripts

Tools/IDEs: SQL*Plus, PL/SQL Developer, Toad, SQL*Loader, SSMS, WinSCP, Postman, Salesforce.

Platforms: Windows XP/Vista/7, UNIX, MS-DOS, and Linux.

Scheduling Tools: Redwood, Control-M, UC4, Autosys.

Version Control: GitHub, Informatica PowerCenter.

Application Packages: Familiarity with the application and data models of SAP (ECC, CRM, HCM, APO, BPO, BW); SQL*Loader; database import and export.

Quality Management: JIRA

Incident/Log Tracking: ServiceNow.

Qualifications:

MBA from JNTU Kakinada, India

BSc in Computers from Acharya Nagarjuna University, India

Project Summary:

Project # 1:

Client: BCBS, Detroit, MI Nov 2021 – Present

Role: IICS ETL Developer

Roles and Responsibilities:

Understood business rules from the functional specifications and converted them into technical documents to implement the ETL packages.

Created Standard OAuth and Application OAuth connections for various Salesforce organizations.

Extracted raw data from Microsoft Dynamics CRM to staging tables and loaded the data into Salesforce objects using Standard and Bulk API techniques in Informatica Cloud.

Fed data to multiple Salesforce application objects (e.g., Customers, Plans) in hierarchical order using SCD Type 1 techniques.

Created Snowflake pipelines to load data into the Snowflake cloud database; a Snowpipe sketch follows this project.

Developed audit activity for all cloud mappings, automated daily validation reports, and scheduled the cloud jobs to run daily with email notifications for any failures.

Implemented a restart mechanism for connection-timeout and deadlock failures in tasks.

Created pipelines to migrate code across the DEV, QA, and PROD environments.

Created Swagger files for the REST V2 connection to post and get data between APIs and the database.

Created processes, sub-processes, process objects, event-based processes, and reusable service connectors and app connectors to integrate the Salesforce application.

Created taskflows to post data in batches using a looping concept in Cloud Data Integration and to retrieve data back from SFDC objects.
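
A minimal sketch of the Snowflake pipeline (Snowpipe) pattern mentioned above, assuming an external stage (@RAW_STAGE) already exists; all object names here are illustrative rather than taken from the project.

-- Landing table for the raw extract.
CREATE TABLE IF NOT EXISTS CUSTOMER_RAW (
    CUST_ID   NUMBER,
    CUST_NAME VARCHAR,
    LOAD_TS   TIMESTAMP_NTZ DEFAULT CURRENT_TIMESTAMP()
);

-- Pipe that auto-ingests files as they land on the stage.
CREATE PIPE IF NOT EXISTS CUSTOMER_PIPE
  AUTO_INGEST = TRUE
AS
  COPY INTO CUSTOMER_RAW (CUST_ID, CUST_NAME)
  FROM (SELECT $1, $2 FROM @RAW_STAGE/customer/)
  FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);

With AUTO_INGEST, the cloud storage event notification triggers the COPY; the same COPY statement can also be run manually for backfills.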

Project # 2:

Client: Bayer, Saint Louis, MO Apr 2020 – Oct 2021

Role: Sr. ETL Developer

Participated in requirements grooming sessions with Product Owners, Project Managers, and other stakeholders.

Involved in application enhancements (estimation, design, coding, debugging, and documentation).

Participated in ETL dataflow reviews and designed solutions to meet customer requirements.

Created Informatica workflows and mappings to load data from heterogeneous sources such as flat files, Oracle 11g, and Teradata into the data warehouse.

Developed ETL scripts using Linux, Teradata procedures, and utilities (BTEQ, MLOAD, FASTEXPORT, and FASTLOAD); a sample BTEQ script follows this project.

Scheduled the scripts using Autosys.

Wrote SQL queries to perform ETL testing, data comparison, and report testing.

Performed SIT, UAT, and production data loads.

Performed code migration and provided support to the UAT team.

Worked on production issues and user requests.

Analyzed recurring production issues and provided permanent fixes.

Provided technical and functional guidance to support and development teams.

Involved in transition and SOP document creation and provided KT to the production support team.

Provided reference manuals and training to business users on application functionality.
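
A minimal BTEQ sketch of the kind of script described above; the logon, paths, and table names are placeholders, and production scripts would pull credentials from a secured logon file rather than embedding them.

-- Placeholder logon; real jobs .RUN a secured logon file instead.
.LOGON tdprod/etl_user,password;

INSERT INTO EDW.SALES_FACT (SALE_ID, SALE_DT, AMOUNT)
SELECT SALE_ID, SALE_DT, AMOUNT
FROM   STG.SALES_DAILY;

-- Abort with a non-zero return code so the scheduler flags the failure.
.IF ERRORCODE <> 0 THEN .QUIT 8;

-- Export the loaded-row count for the day's audit log.
.EXPORT REPORT FILE = /data/logs/sales_fact_count.txt;
SELECT COUNT(*) AS LOADED_ROWS
FROM   EDW.SALES_FACT
WHERE  SALE_DT = CURRENT_DATE;
.EXPORT RESET;

.LOGOFF;
.QUIT 0;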

Project # 3:

Client: Humana HealthCare, Louisville, KY Feb 2018 – Mar 2020

Role: ETL Informatica Developer

Roles and Responsibilities:

Understood the business requirements.

Developed mappings and workflows using Informatica.

Created UNIX, BTEQ, MLOAD, and FASTLOAD scripts.

Performed end-to-end testing with data reconciliation for the developed mappings.

Prepared technical design documents, HLD and LLD documents, and test cases.

Involved in CR enhancements (design, coding, debugging, and documentation).

Handled Informatica admin activities (backup, restore, user creation, assigning permissions to users, etc.).

Provided Informatica 8.x to 9.x version upgrade support.

Supported SIT, UAT & Production data loads

Involved in transition and SOP document creation and provided KT to the production support team.

Involved in production support activities such as batch monitoring, bug fixes, and user requests.

Performed code migration and provided support to the UAT team.

Implemented Type 1, Type 2, Type 6, and incremental load strategies.

Followed ETL standards: audit activity, job control tables, and session validations.

Used exceptional mappings to generate dynamic parameter files after the staging loads.

Used shared folders to create reusable components (sources, targets, email tasks).

Developed a single dynamic ETL mapping to load more than 30 reference tables.

Used SQL queries to validate the data after loading; a sample reconciliation query follows this project.

Experienced in integrating Confidential with external applications using RESTful APIs.

Maintained and automated the production database to retain history backup tables for 30 days.

Developed and automated PowerCenter, IDQ, Cloud, and Salesforce jobs through Redwood.

Environment: Informatica Cloud, Informatica PowerCenter 10.2, SQL Server, Teradata, Redwood.
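
A sketch of the post-load validation referenced above: compare staged vs. loaded row counts and record the outcome in a job-control/audit table. ETL_AUDIT, STG_MEMBER, and MEMBER_DIM are assumed, illustrative names.

-- Record source/target counts and a PASS/FAIL status for one batch.
INSERT INTO ETL_AUDIT (JOB_NAME, BATCH_DT, SRC_COUNT, TGT_COUNT, STATUS)
SELECT 'LOAD_MEMBER_DIM',
       CURRENT_DATE,
       s.CNT,
       t.CNT,
       CASE WHEN s.CNT = t.CNT THEN 'PASS' ELSE 'FAIL' END
FROM   (SELECT COUNT(*) AS CNT FROM STG_MEMBER) s
CROSS JOIN
       (SELECT COUNT(*) AS CNT
        FROM   MEMBER_DIM
        WHERE  LOAD_DT = CURRENT_DATE) t;

A FAIL row can then drive the daily validation report and the failure email notification described in these projects.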

Project # 4:

Client: Merck, Rahway, NJ Dec 2012 – Feb 2015

Role: ETL Developer

Roles and Responsibilities:

Involved in understanding the client's requirements.

Involved in design, development, implementation, acceptance testing, and other quality-related activities.

Involved in the design and development of ETL mappings, sessions, and workflows.

Involved in reviewing and tuning complex mappings to increase overall mapping performance.

Involved in preparing UNIX shell scripts and scheduling jobs in Autosys.

Interacted with the onsite coordinator on various ETL-related issues.

Prepared various documents, including unit test case specifications, design documents, and other project-related documents.

Tools/Environment: Informatica PowerCenter 9.1, UNIX, Oracle 10g, Autosys


