
Software Development MuleSoft Developer

Location:
Fuquay-Varina, NC
Posted:
January 03, 2024

Sai Nikhil Reddy Sareddy

Senior MuleSoft Developer

984-***-****

ad2e7t@r.postjobfree.com

LinkedIn

CAREER SUMMARY

6+ years of professional IT experience, with 5+ years in all phases of the software development life cycle (SDLC), including analysis, design, implementation, deployment, and support of enterprise, EAI, and web-based applications using Mule ESB.

Extensively worked on Mule architecture, including Mule ESB, Anypoint Studio, APIkit, API Gateway, flows, and various connectors.

Experienced with data stores such as Salesforce, DynamoDB, PostgreSQL, and Amazon Neptune for managing tables.

Worked with Mule connectors such as FTP, File, SFTP, Database, Salesforce, Kafka, Amazon SQS, Amazon S3, and AWS Lambda.

Extensive experience in designing, developing, and managing APIs using the Mule API Gateway.

Expertise in performing data migration from a mainframe system to OpenText using Mule ESB.

Experienced troubleshooting Mule ESB, including working with debuggers, flow analyzers, and configuration tools.

Expert in using DataWeave expressions for transformations.
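As an illustrative sketch only (the field names here are hypothetical, not drawn from the projects below), a typical DataWeave 2.0 transformation of this kind might look like:

```dataweave
%dw 2.0
output application/json
---
// Map each source record to a trimmed-down JSON object;
// "default" guards against fields that are absent in the source
payload map (item) -> {
    id: item.accountId,
    fullName: (item.firstName default "") ++ " " ++ (item.lastName default "")
}
```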

Enhanced API performance by running performance tests and optimizing the code.

Involved in data-management tasks such as bulk data inserts and bulk data updates using Enterprise Integration Manager.

Well-versed in building integration solutions using MuleSoft Anypoint Platform, OAS, RAML, Anypoint Studio, and CloudHub among other tools in the MuleSoft portfolio.

Expertise in implementation of API-Led architecture.

Experienced with various Mule processing strategies and batch processing concepts.

Implemented API proxies as a layer of separation, and applied SLA tiers and policies to govern APIs using Anypoint API Manager and Anypoint Runtime Manager.

Implemented policies such as OAuth, Basic Authentication, and Client ID Enforcement.

Implemented fault-tolerant APIs that remain available through network and functional disruptions.

Designed OAS and RAML specifications using Design Center and implemented Mule APIs in both Anypoint Platform and Anypoint Studio.
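For illustration, a minimal RAML 1.0 specification of the kind authored in Design Center could be shaped as follows (the API title, base URI, and resources are hypothetical examples, not an actual project spec):

```raml
#%RAML 1.0
title: Accounts System API
version: v1
baseUri: https://example.com/api/{version}

/accounts:
  get:
    description: Retrieve all accounts
    responses:
      200:
        body:
          application/json:
  /{accountId}:
    get:
      description: Retrieve a single account by ID
      responses:
        200:
          body:
            application/json:
```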

Expertise in error handling at various levels: application, flow, and processor.

Secured Applications using Keystores and Truststores.

Expertise in creating MUnit test cases for Mule flows to achieve optimal code coverage.
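A hedged sketch of such an MUnit test case is shown below; the flow name and asserted status code are illustrative assumptions, and the element names follow the standard MUnit 2.x schema:

```xml
<munit:test name="get-accounts-flow-test"
            description="Verify the flow returns the expected status code">
  <munit:execution>
    <flow-ref name="get-accounts-flow"/>
  </munit:execution>
  <munit:validation>
    <!-- Assert on the status code attribute produced by the flow -->
    <munit-tools:assert-that expression="#[attributes.statusCode]"
                             is="#[MunitTools::equalTo(200)]"/>
  </munit:validation>
</munit:test>
```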

Expert in documenting endpoint implementations.

Experienced in designing reusable assets, components, templates, and standards to support and facilitate API integration projects.

Played a key role during the production run.

Built CI/CD workflows with GitHub and Jenkins to automate Mule app deployments to CloudHub.

Participated in Agile/Scrum ceremonies, including daily stand-ups, weekly planning games, and retrospectives.

Experience using version control and configuration management tools such as TFS, Bitbucket, and Git.

Expert in providing knowledge transfer to peer team members.

TECHNICAL EXPERTISE

Integration Tools

Mule ESB, Anypoint Studio, REST, Postman, CloudHub, Kibana, Splunk, Jira, Confluence, GitHub, Bitbucket.

Database

PostgreSQL, DynamoDB, Amazon Neptune

Languages

DataWeave, RAML, OAS, .NET

Internet

JSON, HTML, XML

Runtime

Mule 4.x, 3.x

Education:

Master’s in Information Systems from Trine University, Detroit [Dec 2023].

Bachelor of Electrical and Electronics Engineering from Sreenidhi Institute of Science and Technology, Hyderabad [May 2017].

Project Details

Cognizant Technology Solutions, India January 2021- August 2022

Client: Resolution Life

MuleSoft Developer

Responsibilities:

•Actively involved in all phases of the SDLC (analysis, design, implementation, and deployment) for the project.

•Implemented three-layer API-led architecture and worked extensively on the system and process layers of different APIs. Created Mule ESB artifacts, configured the Mule configuration files, and deployed them.

•Mentored junior team members through knowledge transfer sessions and MuleSoft training.

•Helped the MuleSoft architect create the architecture for the projects.

•Worked closely with customers and Business Analysts for requirements gathering and transforming the requirements into technical design.

•Worked closely with CRL architects and platform team daily to go over the development status and discuss any technical issues.

•Had a bird’s eye view of the entire architecture, forecasted any impediments, resolved those promptly, and created Technical Design Documents for multiple APIs.

•As a Developer actively participated in the Agile meetings and was involved in major discussions.

•Created multiple DataWeave functions and implemented complex transformation logic.

•Created major POCs for using the Neptune database, moving API policies to property files, modifying truststores, and using Kafka connectors.

•Created a global exception strategy, applied across all packages including complex APIs, that supports all integrations with meaningful error messages containing root-cause identification and recommended solutions.
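A global error handler of the kind described might be sketched in Mule 4 XML roughly as follows (the handler name and error payload fields are illustrative assumptions, not the project's actual configuration):

```xml
<error-handler name="global-error-handler">
  <!-- Propagate any error as a structured JSON payload with the root cause -->
  <on-error-propagate type="ANY">
    <ee:transform>
      <ee:message>
        <ee:set-payload><![CDATA[%dw 2.0
output application/json
---
{
  errorType: error.errorType.identifier,
  message: error.description,
  correlationId: correlationId
}]]></ee:set-payload>
      </ee:message>
    </ee:transform>
  </on-error-propagate>
</error-handler>
```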

•Collaborated with other technical and functional teams such as DevOps, front-end teams, architects, and MuleSoft Support for release and production support.

•Collaborated with Applications Development and QA teams to establish best practices.

•Performed MUnit testing of all scenarios.

•Designed and implemented RESTful web services using various data formats (JSON, XML) to provide interfaces to third-party applications.

•Utilized Bitbucket as the version control system and maintained the code repository, with changes made in parallel by different teams.

Environment: MuleSoft ESB 4.x, Anypoint Studio, OAS, JSON, XML, JDBC, DataWeave, CloudHub, Kafka, MUnit, Maven, Jenkins.

Accenture Solutions Private Ltd, India July 2018 – January 2021

Client: Schlumberger

Application Development Analyst

Responsibilities:

Involved in Requirement Gathering, analysis, design, and development of the project.

Executed the development process using Agile methodology, which included iterative application development, monthly Sprints, stand-up meetings, poker planning sessions, elaboration sessions, and customer reporting.

Designed, developed, and mocked APIs using RAML in Design Center.

Used AWS Glue jobs for running AWS Lambda functions.

Worked with API Manager to apply policies to APIs and manage security; also configured API proxies using the API Gateway.

Designed and implemented integration with Salesforce using MuleSoft, including Change Data Capture (CDC) events and platform events for real-time data synchronization and event-driven processes.

Used DataWeave for data transformations and data validations within flows and sub-flows.

Developed REST APIs leveraging the SQS connector so applications can push messages into Amazon SQS.

Developed listener flows leveraging SQS listeners to read messages from Amazon SQS.
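The publish side of this SQS pattern could be sketched in Mule 4 XML roughly as below; the flow name, config refs, and property placeholders are hypothetical, and the `sqs:send-message` operation name should be verified against the Anypoint SQS connector version in use:

```xml
<!-- Hypothetical flow: accept an HTTP request and push its payload to SQS -->
<flow name="publish-order-flow">
  <http:listener config-ref="HTTP_Listener_config" path="/orders"/>
  <sqs:send-message config-ref="Amazon_SQS_Configuration"
                    queueUrl="${sqs.orders.queue.url}"/>
</flow>
```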

Involved in creating HTTP inbound and outbound flows, transformers, filtering, and security for Mule flows.

Extensively used Mule components, including the FTP/SFTP transports and the Database connector.

Developed RESTful/SOAP web services in MuleSoft based on SOA architecture.

Developed REST APIs using Postgres databases and created multiple composite indexes to improve application performance.
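To illustrate the composite-index idea (the actual Postgres schema isn't given here, so the table and column names below are hypothetical, and SQLite via Python's standard library stands in for Postgres):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (customer_id INTEGER, status TEXT, total REAL)")
# Composite index on the two columns that are filtered together,
# so a combined WHERE clause can use a single index lookup
cur.execute("CREATE INDEX idx_orders_customer_status "
            "ON orders (customer_id, status)")
cur.execute("INSERT INTO orders VALUES (1, 'OPEN', 10.0), (1, 'CLOSED', 5.0)")
conn.commit()
# The query plan should show a SEARCH using the composite index
plan = cur.execute(
    "EXPLAIN QUERY PLAN "
    "SELECT total FROM orders WHERE customer_id = 1 AND status = 'OPEN'"
).fetchall()
print(plan)
```

The same principle applies in Postgres (`CREATE INDEX ... ON orders (customer_id, status)`); column order matters, since the index serves queries that filter on the leading column(s).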

Created test cases using MUnit to unit test the flows.

Established seamless connectivity with AWS S3, enabling secure and efficient storage and retrieval of data.

Integrated SAP systems to facilitate data exchange between NYL's core business processes and SAP, ensuring consistent and real-time data updates.

Orchestrated data integration with the Data Warehouse, optimizing data reporting and analytics capabilities for better decision-making.

Conducted business-requirements gathering sessions, questionnaires, focus groups, and brainstorming with end users, subject matter experts (SMEs), and developers.

Analyzed the Business Requirement Document (BRD), documented meeting minutes and questionnaire responses, and obtained client approval of the documents in the form of sign-off.

Created a Software Requirement Specification (SRS) document describing all data, functional, non-functional, and behavioral requirements of the system per the Business Requirement Document.

Leveraged SharePoint to gather and translate the requirements.

Used SharePoint to provide specialized consulting services to businesses regarding best practices, design ideas, and information architecture to Management.

Attended defect triage meetings and reported findings, with a strong emphasis on root-cause analysis to determine where and why defects were introduced in the development process.

Performed queue-based and file-based application testing by validating transactions through Splunk logs.

Developed an automation framework implementing the Page Object Model using Java and Selenium WebDriver.

Participated in planning and retrospective sessions, requirements-ambiguity reviews, use-case reviews, and product backlog grooming.

Coordinated with the offshore Quality Assurance team to achieve sprint deliverables and raised concerns or issues to project management and clients as required.

Provided demos of test artifacts to stakeholders at the end of every sprint.

Supported production checkouts for all Digital and Enterprise Integration applications and provided QA sign-off.

Environment: MuleSoft 4.x, Anypoint Studio, REST, SOAP, Salesforce, AWS S3, SQS, DataWeave, MUnit, GitHub, CloudHub, VM, Postgres, RAML, JSON

Accenture Solutions Private Ltd, India June 2017 – July 2018

Client: Hess

Application Development Analyst

Responsibilities:

Followed the guidelines of agile methodologies of SDLC for project management.

Worked across the end-to-end project lifecycle, covering requirement analysis, design, build, unit testing, and deployment.

Played a major role during the project's Go-Live in the initial releases and improved Mule app performance when using the S3 connectors.

Implemented three-layer API-led architecture and worked extensively on the system and process layers of different APIs. Created Mule ESB artifacts, configured the Mule configuration files, and deployed them.

Handled message exceptions in flows and used global exception handlers.

Implemented a standard-guidelines approach for integration development using flows, sub-flows, global configurations, global exception-handling strategies, and property placeholders.

Created major POCs for using the Postgres Database.

Good knowledge of Mule connectors such as AWS S3, HTTP, FTP, File, and Database.

Performed unit testing of all services.

Refactored the Mule flows to use RAML.

Utilized GitHub as the version control system and maintained code repository with changes being done parallel by different teams.

Environment: MuleSoft ESB 4.x, Anypoint Studio, Kafka, Jenkins, RAML, JSON, XML, JDBC, DataWeave, CloudHub, MUnit, Maven.

BHEL, India July 2016 – April 2017

Responsibilities:

•Responsible for designing, prototyping, building, testing, and maintaining complex electrical components and networks used across industries from aerospace and telecommunications to artificial intelligence (AI) and robotics.

•Used AutoCAD, a computer-aided design tool, to create and test equipment designs.
