
Data Management Integration

Location:
Bridgewater, NJ
Posted:
September 10, 2024


Resume:

Prasad Thoudoju

***********@*****.***

+1-908-***-****

Professional Summary

Seasoned Data Integration Specialist with nearly 10 years of experience designing, developing, supporting, and maintaining data integration solutions using Informatica services and associated technologies. Adept at leveraging Informatica PowerCenter (9.x/10.x), Intelligent Data Management Cloud (IDMC), and various data management tools and platforms to create efficient ETL processes and manage complex data flows.

Key Skills and Expertise

Data Integration and ETL Development: Demonstrated expertise in designing and developing Extract, Transform, Load (ETL) processes from diverse data sources to multiple target systems, including Oracle, Salesforce.com (SFDC), SQL Server, SAP, AWS S3, Snowflake, and various flat files, REST APIs, JSON, and XML files.

Informatica Services Proficiency: Advanced proficiency in Informatica Data Management Cloud (IDMC) services such as Data Synchronization Services (DSS), Data Replication Services (DRS), Mappings, Mapping Tasks, Task Flows, Service Connectors, and Business Process management. Skilled in creating Swagger files and configuring Mass Ingestion.

Data Warehouse and SCD Design: Expertise in designing and implementing Slowly Changing Dimensions (SCD) types 1, 2, and 3. Solid understanding of data warehousing concepts and practices.

Optimization and Standardization: Enhanced over 15 complex data flows using pushdown optimization techniques and established standardized processes for managing data integration systems to ensure consistency and efficiency.

Documentation and Artifacts: Proficient in authoring key project artifacts including Technical Design Documents (TDD), Source to Target Mapping Documents (STTM), Migration Documents, Test Case Scenarios, User Acceptance Testing (UAT) plans, and Operational Support Documents compliant with GDPR.

Visualization and Architecture: Skilled in creating detailed architecture diagrams using tools such as Microsoft Visio and Lucidchart.

Domain Expertise: Extensive experience working in Life Sciences, Pharma, Research & Development, and Clinical Data domains, adhering to GxP guidelines and Computer System Validation (CSV) processes.

Salesforce Integration and Master Data Management: Integrated Salesforce CRM objects with Veeva Vault applications and managed Master Data Management (MDM) objects within data systems.

Migration and Programming: Successfully migrated Informatica Power Center 10.x code to IDMC, including thorough execution planning, testing, and workflow redesign. Developed dynamic parameterization for incremental data loading using Python programming.

Cloud and BI Tools: Experience with Python programming, AWS EC2, AWS S3, Salesforce Administration, PowerBI, and Tableau. Managed data loads in Snowflake using dynamic mappings and IICS with Data Cloud connectors, and queried data from AWS S3 to Snowflake.

Proof of Concept and Agile Methodologies: Presented proof of concept for new business requirements and integration services. Experienced in both Waterfall and Agile SDLC models, using tools like Jira and ServiceNow for project management and collaboration.

Version Control and Collaboration: Proficient in version control systems like Git for effective code collaboration, versioning, and continuous integration.

Problem Solving and Communication: Excellent problem-solving and analytical skills with the ability to handle high-pressure, fast-paced work environments. Strong communicator, effectively collaborating with teams to deliver high-quality applications and meet project deadlines.

Continuous Learning: Proactive in staying updated with the latest Informatica releases and industry trends, continuously enhancing technical skills to adapt to evolving data industry requirements.

Technical Skills:

ETL Tools: Informatica PowerCenter 9.x / 10.x, IDMC Services (Data Integration, Application Integration, Mass Ingestion), Dell Boomi

Databases: Oracle 11g, SAP BW on HANA, Hadoop (Hive), Snowflake

Scripting Languages: PL/SQL, Shell Scripting (Unix/Linux), Python

Job Scheduling Tools: Autosys, Crontab, Control-M

DevOps Tools: Jenkins CI/CD Pipeline, GitHub

Cloud platforms: AWS, Azure

Education:

June 2010 – May 2014 Bachelor of Technology in Electronics and Communication Engineering (ECE) from Rajiv Gandhi University of Knowledge Technologies, Basara, Nirmal, Telangana State, India

Certifications & Trainings:

IDMC Data Integration & Application Integration Partner Training and certification

Workato Automation Pro-I and Automation Pro-II

Snowflake Essentials, Data Sharing Certification from Snowflake Academy.

Dell Boomi Professional Developer, Associate Dell Boomi Developer, Dell Boomi B2B/EDI Fundamentals, Dell Boomi B2B/EDI Management

Salesforce Lightning Reports, Dashboards & List views by Udemy

AWS, Azure Cloud foundation courses on Udemy

AWS Solution Architect Certification (Udemy course in-progress)

Tableau Data Analyst Certification (Udemy course in-progress)

Professional Experience:

Client: Alkermes

Role: Senior Technical Lead

Duration: Nov 2021 – Jun 2024

Tools Used: IICS CDI, CAI, Mass Ingestion, Snowflake, AWS, ServiceNow, Salesforce, Anaplan

Responsibilities:

Authored all project artifacts, such as the Integration Design Document, Source to Target Mapping Document, User Acceptance Test Scripts, Migration Document, and Operational Support Document.

Designed ETL Processes according to business requirements, ensuring efficient data flow from source to destination.

Developed ETL pipelines using IDMC Data Integration to pull data from SAP, AWS S3 files, MS SQL Server, and REST API sources into the Snowflake data lake.

Oversaw scheduled jobs in the production environment and provided support for any failed jobs.

Collaborated with clients to gather business requirements, estimate timelines and effort, and perform risk analysis on existing data flows.

Prepared all project documents and managed data integration activities by following GDP regulations.

Stayed up to date with new IDMC releases and explored new features and services.

Implemented dynamic parameterization using Python for incremental data loads from OSI-PI systems to Snowflake in the Manufacturing data domain (sketched below).
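
A minimal sketch of this watermark-based parameterization approach, assuming a local state file and an IDMC-style parameter file; all file names, placeholders, and timestamps are illustrative, not the client's actual implementation:

    # Compute an incremental extraction window and write it to a parameter file
    # that a mapping task can reference in its source filter.
    import json
    from datetime import datetime, timezone

    WATERMARK_FILE = "osipi_watermark.json"   # hypothetical state file
    PARAM_FILE = "incremental_params.txt"     # hypothetical parameter file

    def read_last_watermark():
        try:
            with open(WATERMARK_FILE) as f:
                return json.load(f)["last_loaded_ts"]
        except FileNotFoundError:
            return "1900-01-01 00:00:00"       # first run: full load

    def write_params(last_ts, current_ts):
        # The mapping's source filter would reference these placeholders.
        with open(PARAM_FILE, "w") as f:
            f.write(f"$$LAST_LOAD_TS={last_ts}\n")
            f.write(f"$$CURRENT_LOAD_TS={current_ts}\n")

    def save_watermark(current_ts):
        with open(WATERMARK_FILE, "w") as f:
            json.dump({"last_loaded_ts": current_ts}, f)

    if __name__ == "__main__":
        last_ts = read_last_watermark()
        current_ts = datetime.now(timezone.utc).strftime("%Y-%m-%d %H:%M:%S")
        write_params(last_ts, current_ts)
        save_watermark(current_ts)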

Built a data pipeline from AWS S3 bucket flat files to Snowflake (sketched below).
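
A minimal sketch of an S3-to-Snowflake load of this kind, assuming the snowflake-connector-python package and a pre-created external stage; the account, credentials, stage, and table names are placeholders:

    # Load staged S3 flat files into a Snowflake table with COPY INTO.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account",        # placeholder account
        user="etl_user",             # placeholder user
        password="********",
        warehouse="ETL_WH",
        database="RAW_DB",
        schema="MFG",
    )

    try:
        cur = conn.cursor()
        # @S3_STAGE is a hypothetical external stage pointing at the S3 bucket.
        cur.execute("""
            COPY INTO MFG.OSIPI_READINGS
            FROM @S3_STAGE/osipi/
            FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
            ON_ERROR = 'ABORT_STATEMENT'
        """)
        print(cur.fetchall())        # load status and row counts per file
    finally:
        conn.close()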

Ideated a data pipeline from Azure Blob Storage to Snowflake.

Configured the Secure Agent on virtual machines to provide a running runtime environment for jobs deployed in Informatica Cloud.

Migrated Informatica PowerCenter code to Informatica Cloud through a thorough process of risk analysis, test planning, job execution, and rebuilding of mappings, tasks, and workflows.

Developed user-defined functions, established source and target system connections, and ran Informatica jobs successfully by redefining the data flow architecture.

Worked extensively across the Quality, Manufacturing, Clinical, Research & Development, Medical Affairs, Operations, and Finance data domains and their source and target systems.

Handled the Operations data domain (SAP HANA to Snowflake), the Finance data domain (AWS S3 flat files to Anaplan), the Manufacturing data domain (OSI-PI SQL Server to Snowflake), the Quality data domain (Salesforce TrackWise Digital to Snowflake), and the Commercial data domain (various data sources to Snowflake).

Integrated Quality data files into a SharePoint directory using the Data Integration SharePoint Online connection.

Client: Harvard Business Publisher

Role: Senior Data Consultant

Duration: Sept 2019 – April 2021

Tools Used: IICS CDI REST API, Oracle 11g, Unix Shell Script, Postman

Responsibilities:

Developed integrations that source CSV files and load them into the Higher Education website using six API calls.

Applied file validation rules to flag invalid data files and sent notification email alerts with the invalid data file attached along with the error reasons (sketched below).
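
A minimal sketch of the alerting step, assuming Python's standard smtplib and email libraries; the SMTP host, addresses, file names, and error messages are placeholders:

    # Email an invalid data file back to the data team with the error reasons.
    import smtplib
    from email.message import EmailMessage

    def send_validation_alert(file_path, errors):
        msg = EmailMessage()
        msg["Subject"] = f"File validation failed: {file_path}"
        msg["From"] = "etl-alerts@example.com"      # placeholder sender
        msg["To"] = "data-team@example.com"         # placeholder recipient
        msg.set_content("Validation errors:\n" + "\n".join(errors))

        # Attach the rejected file so reviewers can inspect it directly.
        with open(file_path, "rb") as f:
            msg.add_attachment(f.read(), maintype="text",
                               subtype="csv", filename=file_path)

        with smtplib.SMTP("smtp.example.com") as smtp:
            smtp.send_message(msg)

    # Example: send_validation_alert("students.csv", ["Row 12: missing email"])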

Used the Postman tool extensively to test REST API responses by passing the request body in JSON format with authorization (an equivalent request is sketched below).
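
An illustration of the kind of authorized JSON request exercised in Postman, expressed with Python's requests library; the endpoint, token, and payload fields are placeholders, not the actual API:

    # POST a JSON body with an authorization header and inspect the response.
    import requests

    url = "https://api.example.com/v1/courses"      # placeholder endpoint
    headers = {
        "Authorization": "Bearer <access_token>",   # placeholder credential
        "Content-Type": "application/json",
    }
    payload = {"course_id": "HBP-101", "title": "Strategy Fundamentals"}

    response = requests.post(url, json=payload, headers=headers, timeout=30)
    response.raise_for_status()                     # surface HTTP errors early
    print(response.status_code, response.json())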

Developed Swagger files, RestV2 Connections, Business Services and Web Service Transformations for REST APIs to invoke from CDI

Utilized hierarchy builder transformation in mappings to send the request payload into the Web Service transformation.

Executed custom source queries in mappings to move source records from the staging table to the REST APIs.

Wrote Shell Scripts for custom email alerts and job dependency checks.

Client: Athenahealth

Role: Senior Data Consultant

Duration: May 2019 – Aug 2019

Tools Used: IICS CDI REST API, CAI, Oracle 11g, Unix Shell Script

Responsibilities:

Developed processes for business REST APIs to integrate Salesforce CRM with ServiceNow, and applied fault-handling scenarios in all internal sub-processes.

Configured email alerts to fire automatically on any process failure, and handled exceptions using the "Throw" step type in Application Integration.

Used Query Objects to join multiple objects in SFDC for required output fields.

Created and modified Swagger files, based on incident-number responses, to pass input parameters such as authentication, input fields, and the subdomain URL to the host URL.

Based on the Swagger files, created RestV2 connections and business services that were used for the Web Service transformation in Informatica Cloud Data Integration.

Documented all the project artifacts by following GxP guidelines and regulations

Coordinated a team of four across development, testing, and support activities.

Client: PayPal

Role: Senior Data Consultant

Duration: April 2018 – April 2019

Tools Used: Informatica PC 10.x, SAP HANA, Unix Shell Scripting, DevOps Migration Tools, JIRA

Responsibilities:

Analyzed source data of all payment-related jobs to fix errors and ensure data availability before the business cut-over time.

Coordinated with the Command Center team to re-run, hold, or kill jobs based on source data arrival, long-running jobs, outages, or maintenance.

Performed statistical validation in PSA and DSO layers of SAP BW on HANA

Remediated data from archived tables in Hadoop to the DSL layer of HANA.

Generated reports and statistics at the end of the data load

Reviewed the status of jobs in Control-M to confirm downstream processing.

Produced standard operating documents detailing the challenges with solutions and enhancements.

Updated the Confluence page with the details required by the client along with learnings

Deployed code in pre-production & production environments using Olympus tool and tracked it in Jira

Client: Merck Co.

Role: Data Integration Developer

Duration: Feb 2015 – April 2018

Tools Used: Informatica PC 9.x/10.x, Unix Shell Scripting, SFDC, Oracle 11g Database

Responsibilities:

Coordinated with the onsite team to create functional specifications from business requirements gathered from the client, following GxP and GDP compliance.

Developed Informatica mappings using various complex transformations to extract, transform, and load data from multiple sources into the data warehouse.

Defined ETL processes to load data from various systems such as Oracle, Flat Files, SFDC, SAP, and XML

Extensively used mapping parameters and variables to provide flexibility.

Fine-tuned the performance of Informatica components for daily incremental loading

Scheduled all Informatica jobs in Autosys/Crontab using Shell scripts

Created a notification system in UNIX shell scripting, providing the client with details of the interface run

Processed CRM data into multiple Salesforce objects using Informatica mappings and, manually, the Data Loader, and verified that the data was reflected in Veeva CRM applications.

Created Users, Territories, Accounts, Contacts, and Products in the Veeva application by processing flat files provided by the sales representatives of each country configured in the CRM application.

Implemented a generalized method of scripting and mapping for around 30 countries

Reproduced interface behaviors to identify the root-cause of problems and developed workarounds/solutions

Conducted knowledge transfer sessions for end-users, created comprehensive documentation outlining the design, development, implementation, daily loads and process flow of the mappings

Logged user (client) support cases through Salesforce tickets to analyze and provide solutions

Worked on Hadoop connectivity from Informatica

Imported HDFS tables as a target for populating data

Created Shell scripts for pushing files into HDFS from Linux servers

Configured the Kerberos network authentication protocol for secure file transfers from Linux servers to the Hadoop system (sketched below).
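
A minimal sketch of the transfer step, shown here in Python with subprocess for illustration (the original implementation used shell scripts); the keytab, principal, and paths are placeholders:

    # Obtain a Kerberos ticket, then copy a local file into HDFS.
    import subprocess

    KEYTAB = "/etc/security/keytabs/etl.keytab"     # placeholder keytab
    PRINCIPAL = "etl_user@EXAMPLE.COM"              # placeholder principal

    def push_to_hdfs(local_path, hdfs_dir):
        # Authenticate against the KDC using the keytab.
        subprocess.run(["kinit", "-kt", KEYTAB, PRINCIPAL], check=True)
        # Push the file, overwriting any existing copy in the target directory.
        subprocess.run(["hdfs", "dfs", "-put", "-f", local_path, hdfs_dir], check=True)

    # Example: push_to_hdfs("/data/out/sales_2018.csv", "/landing/sales/")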

Developed proofs of concept for a broad range of scenarios in IDMC with Data Replication Services (DRS) and Data Synchronization Services (DSS).

Elicited and analyzed business requirements, created and delivered proof-of-concept to the client well ahead of the deadlines

Extensively worked on Web Service transformation in Informatica 9.6

Developed mappings in response to client’s requirements

Extracted data from the source (WSDL) and loaded it into the target (WSDL) system, which in turn was connected to SAP systems.

Developed proof-of-concept on Informatica Cloud to load data from Salesforce into Oracle tables and from WSDL to Oracle


