Moulika Singa Reddy
Sr. Data Engineer / Mule ESB Integration Developer
Email: ****************@*****.***
Professional Summary:
7+ years of experience in IT, specializing in MuleSoft (Mule 3.x & 4.x) for API-led integration, Service-Oriented Architecture (SOA), and Enterprise Application Integration (EAI), with a strong background in Java/J2EE technologies and legacy system upgrades.
Implemented APIs using API-led connectivity and a design-first approach, leveraging the MuleSoft product stack including API Designer, Exchange, and API Gateway.
Successfully deployed and managed scalable integrations across CloudHub, Runtime Fabric (RTF), and on-premises environments, with zero-downtime deployments in high-availability architectures.
Successfully migrated applications from CloudHub 1.0 to CloudHub 2.0, including runtime upgrades from Mule 4.4.2 to 4.9.3 and Java 8 to Java 17, ensuring improved performance, security, and compatibility with next-gen MuleSoft infrastructure.
Strong experience with Snowflake – data modeling, transformation, ingestion via COPY, Snowpipe, and external/internal stages from AWS S3.
Used Snowpipe for continuous data ingestion from S3 buckets.
Proficient with DBT for transforming data in Snowflake, building modular, testable pipelines.
Extensively worked with Mule DataWeave components/functions to create mappings and to transform, group, and split data (see the illustrative sketch at the end of this summary); also worked with Mule API Manager, creating and managing RAML API specifications.
Experience integrating various systems such as Salesforce, SAP, databases (Oracle, MySQL, DynamoDB), and message queues (Anypoint MQ, IBM MQ, Azure Service Bus).
Integrated Salesforce with MuleSoft using various components including Salesforce Query All, Bulk API for high-volume data operations, and Platform Events for real-time event-driven integrations; enabled seamless two-way sync between systems with optimized performance and reliability.
Strong background in SOAP/REST services, XML technologies (XSD, XSLT, XPath), and API gateway configuration and governance.
Widely experienced with Mule ESB in designing and implementing core platform components for the API/Services Gateway as well as other cross-cutting technical capabilities; contributor to the Mule ESB open-source project.
Knowledgeable in AWS and Azure platforms and used connectors for integrations in cloud environments.
Experienced in DevOps practices using Jenkins, Azure DevOps, and source control tools such as GitHub and Bitbucket.
Strong debugging and problem-solving skills with excellent understanding of system integration methodologies, techniques, and tools.
Involved in complete SDLC including Analysis, Design, Implementation, Testing and Maintenance.
Effective collaborator with offshore teams, SMEs, and BAs across full SDLC; recognized for strong debugging, troubleshooting, and performance tuning skills.
Experienced in working with business analysts and development groups to analyze business specifications, and with other testers to resolve technical and end-user issues.
Effective communication and interpersonal skills, ability to work in a team/independently and adapt quickly to new and emerging technologies.
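The DataWeave work summarized above generally takes the shape of the sketch below, which groups records, reshapes each one, and splits a delimited field. This is a hypothetical illustration only, not code from any client project; the payload structure and field names (orders, customerId, id, qty, unitPrice, orderDate, tags) are assumed for the example.

%dw 2.0
import splitBy from dw::core::Strings
output application/json
---
// Hypothetical mapping: group order line items by customer, reshape each record,
// and split a comma-separated tags field into an array.
payload.orders groupBy ($.customerId) mapObject ((orders, customerId) -> {
    (customerId): orders map ((order) -> {
        orderId: order.id,
        total: order.qty * order.unitPrice,
        placedOn: order.orderDate as Date {format: "yyyy-MM-dd"},
        tags: (order.tags default "") splitBy ","
    })
})

In a Mule flow, a script like this would typically sit in a Transform Message component, with its output feeding the next processor in the flow.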
Technical Skills:
Integration & API Platforms: MuleSoft (Mule ESB 4.4.0, 4.3.0, 4.2.2, 4.2.1, 4.2.0, 4.1.5, 3.9, 3.8, 3.7, 3.6, 3.5, 3.3, 3.2), Anypoint Studio, CloudHub (1.0 & 2.0), Runtime Fabric (RTF), API Manager, MMC, RAML, DataWeave, ActiveMQ, JMS API
Web & Integration Technologies: Java, J2EE, JSP, JavaScript, C, C++, PL/SQL, REST, SOAP, JAX-WS, JAX-RS, JAXB, JAXP, HTML, CSS, XML, XSD, JSON, AJAX
Version Control & DevOps Tools: Git, Bitbucket, SVN, Azure DevOps, Jenkins, Maven
Languages: C, C++, J2EE, JavaScript, JSP, PL/SQL
Database & Storage: MySQL, Oracle, MS Access
Operating Systems: Windows 7/8/10, Linux/RHEL, Unix
Other Tools: Shell Scripting, Maven, Jenkins, JIRA

Project Experience:
Client: Holt Cat Date: Oct 2023 – Present
Location: Dallas, TX.
Role: Sr. Software Engineer / MuleSoft Developer
Responsibilities:
Developed, evaluated, and promoted effective and consistent productivity and teamwork to ensure the delivery of Legendary Customer Service (LCS).
Designed and developed end-to-end MuleSoft APIs and integrations using Anypoint Platform, following API-led connectivity principles with strong use of DataWeave, MUnit, reusable flows, and exception handling.
Ensured adherence to enterprise standards in project execution methodology, technical delivery, and standards compliance; collaborated on project estimation and quoting to ensure successful delivery.
Worked collaboratively with stakeholders to identify and define business requirements; analyzed and translated those requirements, using frameworks, into components of a modernized solution and an integration architecture to automate business processes.
Defined and documented integration strategies, data flows, and technical specifications to support cross-functional, multi-system solutions, aligned with enterprise architecture and standards.
Designed and helped build real-time and asynchronous enterprise automation solutions that increase business value, using industry-leading integration tools/platforms.
Developed efficient, well-structured, reusable, and scalable automation processes and integrations; authored and maintained solution design documentation.
Built detailed test plans and automation scripts for integration validation and supported QA/UAT cycles to guarantee high-quality releases.
Conducted impact assessments, root cause analysis, and provided remediation strategies for integration risks and failures; ensured strong governance and operational readiness.
Worked with internal teams to ensure proper monitoring, error handling, and reporting are in place before integrations go live.
Designed and implemented real-time and asynchronous enterprise automation solutions, integrating systems like Salesforce, RentalMan ERP, and Microsoft Dynamics 365 F&O, enhancing data consistency and business value.
Integrated enterprise systems with Salesforce using MuleSoft’s Salesforce Connector, leveraging Query All, Bulk API, and Platform Events for scalable data exchange and real-time event handling.
Extensively worked with Mule DataWeave 2.0 components/functions to create mappings and to transform, group, and split data.
Led the migration of MuleSoft applications from CloudHub 1.0 to CloudHub 2.0, including runtime upgrades (Mule 4.4.2 to 4.9.3) and Java version upgrades (Java 8 to Java 17), ensuring seamless deployments and improved scalability and security.
Analyzed data transformation and translation requirements and determined which Snowflake tools to leverage to meet them.
Built scalable data pipelines in Snowflake, using DBT for model development and transformations; leveraged COPY, Snowpipe, and S3 staging for continuous data ingestion and ELT processing.
Analyzed business requirements, performed impact analysis, and assessed technical challenges and feasibility in achieving the required business outcomes.
Implemented monitoring, alerting, and logging for integrations using Smart Monitoring, custom alerts, and flow metrics to proactively identify issues and optimize performance.
Provided ongoing technical leadership and mentorship to developers, promoting best practices in integration, data engineering, and SOA, while collaborating with business and IT stakeholders.
Demonstrated strong knowledge in middleware performance tuning, runtime server management, and secure integration architectures, maintaining high system reliability and compliance.
Environment: Anypoint Studio 7.11.1/6.2.3, Mule Runtime 4.x/3.x, CloudHub (1.0, 2.0), Java, Windows 10, On-Premises, RAML, SOAP, REST, WSDL, IBM MQ, JMS queues, Jenkins, GitHub, Bitbucket, HTTP, XML, XSL, XSD, XPath, XQuery, JSON, Azure, Oracle DB/SQL, Postman/SOAP UI, Snowflake, DBT, Salesforce, D365.

Client: Bank of America Date: Oct 2022 – Oct 2023
Location: Charlotte, NC.
Role: Sr. MuleSoft Developer
Description: Designed a framework integrating downstream systems, upstream systems, IBM MQ, Java, and Oracle DB systems.
Responsibilities:
Designed and implemented Mule APIs on Mule 3.9.0 and 4.3.0.
Worked under an Agile development process; experienced with stand-up, retrospective, demo, planning, and code review meetings.
Designed services over HTTP REST and SOAP web service protocols, working with data formats including JSON, XML, XSD, and XSLT.
Designed REST-based APIs using Java and APIkit Router.
Implemented the project under the guidelines of agile methodologies of SDLC.
Developed and deployed Mule projects to on-premises environments using XLR deployments, CI/CD, and the VanGuard tool.
Co-authored technical specification documents for the developed modules.
Developed Mule ESB projects for services with synchronous and asynchronous flows.
Designed and developed enterprise services using RAML and REST-based APIs.
Published RAML APIs as endpoint proxies via the API Gateway and deployed the Mule deployable archives in the Mule Management Console (MMC).
Implemented security mechanisms such as security certificates, key exchange, encryption/decryption, and OAuth authentication and authorization using access tokens and SM session cookies.
Implemented Mule batch processing to process records from databases and downstream systems.
Utilized various connectors such as FTP, HTTP, File, SMTP, SFTP, and Database in different workflows.
Implemented choice, global, and custom exception handling mechanisms in Mule ESB based on business requirements, working with transformers, scopes, testing, and security of Mule ESB endpoints.
Utilized Mule components extensively for development, including the SAP connector, Choice router, Scatter-Gather router, the Expressions connector for plugging in custom Java expressions, DataWeave transformers, and the Azure connector; used MEL (Mule Expression Language) to manipulate payloads.
Experienced in using Java to create classes and JavaBeans, and in the Spring Boot framework.
Built flows for the Oracle database that other teams invoke via URL by supplying values for the exposed parameters.
Strong experience in parsing and generating data/files and standards using DataWeave and other MuleSoft connectors.
Experienced with monitoring tools such as Dynatrace and Introscope.
Designed and developed CI/CD architecture using Jenkins, Bitbucket, and Ant/Maven scripts; used Azure DevOps repos to promote code to UAT and production.
Coordinated across all testing phases and worked closely with the performance testing team to root-cause and resolve issues; involved in bug fixes and production support.
Experienced with development toolsets and platforms in MuleSoft ESB, working with SOAP/OData web services.
Utilized SOAP UI/Postman for testing endpoints with GET and POST methods and basic authentication.
Independent worker with strong troubleshooting skills and strong communication skills to document and communicate issues and status.
Worked with offshore and onshore teams and coordinated tasks between them.
Environment: Anypoint Studio 7.11.1/6.2.3, Mule Runtime 4.3.0/3.9.0, Java, Windows 10, On-Premises, RAML, SOAP, REST, WSDL, IBM MQ, JMS queues, Jenkins, GitHub, Bitbucket, HTTP, XML, XSL, XSD, XPath, XQuery, JSON, Azure, Oracle DB/SQL, Postman/SOAP UI, Dynatrace.

Client: Wells Fargo Date: Nov 2020 – Sep 2022
Location: Charlotte, NC.
Role: Sr. MuleSoft Developer
Description: Designed a framework integrating Salesforce, downstream systems, IBM MQ, Oracle DB systems, and OData.
Responsibilities:
Designed and implemented Mule APIs on Mule 4.2.2, 4.3.0, and 4.4.0.
Worked under an Agile development process; experienced with stand-up, retrospective, demo, planning, and code review meetings.
Developed flows/orchestrations for integrating components written on top of different internal platforms using Mule ESB and IBM MQ.
Designed and implemented RESTful web services using various data formats (JSON, XML) to provide an interface to various third-party applications.
Developed REST-based APIs using APIkit Router.
Created credentials vault and encryption process for the payload.
Created batch processing for performing ETL operations from SFDC to downstream systems.
Developed and deployed Mule projects to on-premises environments, using snapshot versions where necessary as maintenance builds.
Worked on IIB flows using XSLT and XML for RESTful and SOAP web services.
Developed Mule ESB projects for services with synchronous and asynchronous flows.
Designed and developed enterprise services using RAML and REST-based APIs.
Worked on the Mule API Gateway to apply policies to APIs and manage security; also configured proxy settings for the APIs using the API Gateway.
Worked with transformers, exception handling, testing, and security of Mule ESB endpoints.
Utilized Mule components extensively for development, including the Salesforce connector, SAP connector, Choice router, Scatter-Gather router, the Expressions connector for plugging in custom Java expressions, and DataWeave transformers; used MEL (Mule Expression Language) to manipulate payloads.
Developed flows/orchestrations integrating components such as connectors, transformers, and scopes built on top of different internal platforms, using Mule ESB for XML-to-CSV conversion (see the sketch after this list).
Experienced in using Terraform to create SQL files and up and down DB scripts.
Designed and developed CI/CD architecture on Azure with Jenkins, Bitbucket, etc.
Coordinated in all testing phases and worked closely with Performance testing team to create a baseline for the new application.
Experienced with development toolsets and platforms in MuleSoft ESB, working with SOAP/OData web services.
Utilized Postman for testing endpoints with GET and POST methods and basic authentication.
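As a rough illustration of the XML-to-CSV conversion mentioned above, a DataWeave 2.0 transform of the following shape could be used. This is a hypothetical sketch only; the element names (records, record, accountId, name, amount) are assumed, not taken from the actual interfaces.

%dw 2.0
output application/csv header=true
---
// Hypothetical mapping: flatten repeating <record> elements from an XML payload into CSV rows.
payload.records.*record map ((rec) -> {
    accountId: rec.accountId,
    name: trim(rec.name default ""),
    amount: (rec.amount default 0) as Number
})

Because every row is an object with the same keys, the CSV writer emits a single header row followed by one line per record.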
Environment: Anypoint Studio 7.11.1, Mule Runtime 4.3.0/4.2.2, Windows 10, RAML, SOAP, REST, WSDL, ActiveMQ, IIB, On-Premises, Jenkins, GitHub, Bitbucket, CloudHub, HTTP, XML, XSL, XSD, XSLT, XPath, XQuery, Azure, JSON, Oracle SOA.

Client: McDonald's Date: Nov 2018 – Nov 2020
Location: Chicago, IL.
Role: Senior Mule ESB Developer
Description: Designed a framework for bi-directional data integration.
Responsibilities:
Involved in requirements gathering, analysis of existing Design Documents, planning, proposing changes by analyzing new Wire Frames, Development and Testing of the application.
Implemented the project under the guidelines of agile methodologies of SDLC.
Co-authored technical specification documents for the developed modules.
Worked with an offshore Mule team, collaborating with them on deliverables.
Participated in daily scrum calls and bi-weekly sprints.
Designed and developed integration applications using Mule to integrate SAP Ariba and Salesforce applications using Mule ESB 4.1 runtime, Anypoint Studio 7.5 and Cloudhub.
Implemented security mechanisms such as security certificates, key exchange, encryption/decryption, and OAuth authentication and authorization using access tokens and SM session cookies.
Used Mule batch processing to process records from the database.
Utilized various connectors such as FTP, HTTP, File, SMTP, SFTP, Servlet, Quartz, and Database in different workflows.
Implemented choice, global, and custom exception handling mechanisms in Mule ESB based on business requirements.
Created new and updated existing APIs using RAML 1.0 and APIkit for defining integration schematics.
Built flows for the Oracle database that other teams invoke via URL by supplying values for the exposed parameters.
Implemented synchronous and asynchronous messaging scopes using ActiveMQ in Mule.
Experience in deployment of Mule ESB applications using MMC and CloudHub.
Used Database Connectors to connect with respective systems using Mule ESB.
Configured the Mule process to fetch data from a topic and make web service calls to the middle-tier Mule ESB for processing.
Troubleshot issues detected in the development and testing phases by logging into Salesforce and reviewing data such as job IDs, custom objects, and the status of bulk Insert/Upsert batch jobs.
Created views and some records on Salesforce side to test the flows from Salesforce to SAP end.
Worked with build teams on Mule administration, configuration, and tuning; deployed Mule applications on Mule 4.1 on-premises through MMC and managed the server deployments.
Used SourceTree to commit to Git and migrated the developed services from Git to Bitbucket.
Utilized GitHub as the version control system and maintained the code repository with changes made in parallel by onshore and offshore teams.
Used Maven for deployment and Jenkins for continuous integration and continuous delivery.
Environment: Java 1.8, Anypoint Studio 7.2, MS Access, Mule ESB 4.1.0, ActiveMQ 5.3, Apache Maven 3.5.4, Nexus, RAML, CloudHub, Log4j, Git, JIRA, API Gateway 2.1, Bitbucket, Postman, Confluence, Jenkins.

Client: AIG Date: Oct 2017 – Nov 2018
Location: Houston, TX.
Role: Mule ESB Developer
Description: Integrated customer-related policy information coming from different channels with downstream systems at AIG.
Responsibilities:
Used Agile methodology to emphasize face-to-face communication over written documents and to ensure each iteration passed through a full software development cycle.
Designed UML diagrams such as use case, class, and activity diagrams based on the requirements gathered.
Deployed Mule ESB applications to CloudHub.
Extensively used Mule components that include File, SMTP, FTP, SFTP, JDBC Connector and Transaction Manager.
Installed and configured the development environment using Anypoint Studio with the Mule application server on cloud and on-premises.
Created Mule ESB artifacts, configured the Mule configuration files, and deployed them.
Configured the Mule process to fetch data from a topic and make web service calls to the middle-tier Mule ESB for processing; worked on Mule API Manager and RAML (RESTful API Modeling Language).
Created REST APIs using RAML and developed flows using APIkit Router.
Integrated with other sub-systems through JMS, XML, and XSLT.
Developed an integration component with a third-party application using Mule ESB.
Used Git as the version control tool and Maven for builds, and deployed applications to CloudHub.
Integrated Salesforce with MuleSoft to connect applications, data sources, and APIs with the cloud.
Experience in integrating cloud applications such as SFDC and cloud databases using Mule ESB.
Utilized a custom logging framework for Mule ESB applications.
Strong application integration experience using Mule ESB with connectors, transformations, Routing, ActiveMQ, JMS.
Integrated the Mule ESB system utilizing MQ Series, HTTP, file system, and SFTP transports.
Integrated data using Salesforce, JMS, HTTP, and web service connectors.
Involved in transformations using XSLT, DataWeave, and custom Java transformers to convert data from one format to another in Mule ESB (a sketch follows this list).
Used Facets for improving the member experience.
Developed Mule flows to integrate data from various sources into the database and from ActiveMQ topics and queues; some transformations were also done at the integration layer.
Used Log4j to capture logs, including runtime exceptions and info messages, which are helpful in debugging issues.
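To illustrate the kind of format-to-format transformation described in the bullet above, the sketch below maps an incoming JSON record to an XML structure; it is shown in DataWeave 2.0 syntax for readability and is purely hypothetical, with the field and element names (policyId, firstName, lastName, effectiveDate, premium, PolicyNotification) assumed for the example.

%dw 2.0
output application/xml
---
// Hypothetical mapping: wrap a JSON policy record in the XML shape a downstream system expects.
{
    PolicyNotification: {
        PolicyNumber: payload.policyId,
        HolderName: (payload.firstName default "") ++ " " ++ (payload.lastName default ""),
        EffectiveDate: payload.effectiveDate as Date {format: "yyyy-MM-dd"},
        Premium: (payload.premium default 0) as Number
    }
}

XML output requires a single root element, which is why the whole payload is nested under PolicyNotification.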
Environment: Mule ESB, Mule Management Console, Anypoint Studio, Apache Maven, ActiveMQ, CloudHub, RAML, Nexus, PuTTY, XML, JSON, PL/SQL, SQL, Log4j, CVS.

Client: Govind Development, L.L.C. Date: Jan 2017 – Oct 2017
Location: Corpus Christi, TX.
Role: Java Developer
Responsibilities:
Involved in Design, Development, Testing, and Integration of the application.
Used Eclipse IDE to develop the application.
Developed customized reports and Unit Testing using JUnit.
Developed SQL queries to store and retrieve data from the database and used PL/SQL.
Used HTML, DHTML, JavaScript, and AJAX for implementing dynamic media playouts.
Used simple Struts Validation for validation of user input as per the business logic and initial data loading.
Involved in managing the Business Delegate to maintain decoupling between the presentation and business layers.
Used JMS for Asynchronous messaging.
Involved in fixing defects tracked using QC, and provided support, maintenance, and customization.
Involved in writing JUnit test cases and code version control using SVN.
Involved in building the code using Ant and deploying it on the server.
Ability to quickly adjust priorities and take on projects with limited specifications.
Effective team player with excellent logical and analytical abilities.
Involved in fixing IST, QA, UAT & Production defects.
Followed coding guidelines and maintained code quality.
Supported the applications through production and maintenance releases.
Environment: Core Java, J2EE, JSP, Eclipse, Windows, JUnit, JMS, XML, HTML, DHTML, DB2, JavaScript, Shell, CSS, AJAX, SVN, Struts Validation Framework, QC.

Education:
Master's: Texas A&M University, Computer Science (2017)
Bachelor's: Electronics and Communications, Vidya Jyothi Institute of Technology (2015)

Certification:
MuleSoft Certified Developer (Level 1 and Level 2)