
React.js / Spring Boot Web Development

Location:
Rancho Cucamonga, CA
Posted:
February 11, 2025


Resume:

Harsh Dholakia *****.*@*********.** 361-***-****

Summary:

Over 7 years of experience in full-stack development, Java application development, and automation testing, delivering high-performance software solutions.

Proven expertise in designing and deploying scalable web applications using React.js, Spring Boot, Next.js, and Strapi CMS, enhancing customer engagement and operational efficiency.

Extensive experience in developing and maintaining RESTful APIs and implementing Microservices architecture for seamless third-party integrations.

Skilled in relational database design, SQL optimization, and improving query performance, achieving up to 40% reductions in data processing times.

Developed and maintained websites and CRM solutions using various tools built on the Salesforce platform.

Demonstrated success in CI/CD pipeline automation using Docker, Jenkins, and Azure DevOps, reducing deployment times by up to 40%.

Proficient in frontend technologies such as React.js, JavaScript, and Material-UI, ensuring user-centric and responsive web interfaces.

Expertise in test automation using Selenium, JUnit, and Postman, achieving 95% test coverage and reducing production defects.

Adept at working in Agile workflows, delivering innovative, compliant solutions while fostering collaboration across teams.

Skills:

Languages: Java, Python, C#, JavaScript, TypeScript, SQL, PHP

Frontend Frameworks: React.js, Next.js, AngularJS

Backend Frameworks: Spring Boot, Node.js, ASP.NET MVC

CMS: Strapi, Sitefinity

Tools: Docker, Gradle, Git, GitHub, GitLab, Selenium, JMeter, Figma

Cloud & Containers: AWS, Azure, Kubernetes

Databases: MySQL, PostgreSQL, MongoDB

Testing Tools: JUnit, Postman, BrowserStack

Other: Multithreaded Programming, Microservice Architecture

Professional Experience:

NewDay USA – West Palm Beach, FL Jan 2022 – Present

Software Developer

Project Summary: Led the development of dynamic, user-centric web portals integrating React.js and Strapi CMS, focusing on improving customer experience and operational efficiency. Designed scalable APIs enabling seamless integration with HubSpot and other external platforms. Enhanced marketing analytics through robust Google Tag Manager implementations, delivering actionable insights. Spearheaded compliance initiatives using tools like CookieYes, ensuring adherence to CCPA and SOC2. Successfully restructured website architecture for better scalability, reducing system downtime significantly.

Responsibilities:

Spearheaded full-stack development of websites and portals using React.js, Strapi CMS, and Sitefinity CMS, increasing user engagement by 30%.

Developed custom Sitefinity widgets and modules using ASP.NET MVC framework to enable dynamic and reusable website components.

Built and maintained RESTful APIs to enable seamless backend communication and integration with external systems.

Optimized website tracking with Google Tag Manager (GTM), creating tags and triggers that improved actionable insights by 20%.

Integrated CRM systems like HubSpot for lead submissions, improving data accuracy and tracking by 15%.

Assisted in the development and maintenance of a website and custom CRM solution built on the Salesforce platform using Invisr Polystack, providing support for no-code/low-code solutions.

Ensured compliance with regulatory standards like CCPA and SOC2 through tools like CookieYes and Google Consent Mode.

Developed automated workflows for deployment using Azure DevOps and Docker, reducing deployment times by 25%.

Enhanced frontend performance by implementing dynamic components with Next.js and React Hooks.

Collaborated with cross-functional teams to analyze requirements and deliver solutions within Agile sprints.

Created and maintained comprehensive technical documentation to support onboarding, project handovers, and team collaboration.

Redesigned the website architecture to improve scalability and maintainability, reducing system downtime by 20%.

Infosys Limited – Remote, CA Nov 2021 – Jan 2022

Senior Systems Engineer

Project Summary: Contributed to the modernization of legacy systems by implementing microservices architecture, improving scalability and fault tolerance. Played a key role in designing and optimizing RESTful APIs, enhancing system performance and interoperability with third-party tools. Streamlined database operations with MySQL optimizations, significantly reducing data processing times. Collaborated in Agile workflows to ensure timely and high-quality project deliveries. Leveraged Selenium and Postman to automate testing pipelines, reducing bugs and improving deployment readiness. Completed specialized training in Data Warehouse Management Systems, focusing on ETL pipelines and data visualization using Tableau and Power BI.

Responsibilities:

Designed and developed web applications using Java, Spring Boot, and SQL, improving data processing speeds by 40%.

Conducted extensive testing using Postman and Selenium, reducing bugs by 20% before production release.

Built RESTful APIs to ensure seamless data exchange and interoperability with third-party systems.

Streamlined MySQL database operations, optimizing queries and reducing data processing times by 30%.

Created and managed ETL pipelines, ensuring efficient data transformation for enterprise-level analytics (a minimal pipeline sketch follows this list).

Developed interactive data dashboards using Tableau and Power BI, enabling real-time business insights.

Collaborated in Agile workflows, ensuring timely sprint deliveries and maintaining high-quality standards.

Improved system fault tolerance by designing and implementing microservices architecture.

Automated end-to-end testing pipelines with Selenium and Jenkins, enhancing deployment readiness.

Delivered training on Data Warehouse Management Systems, emphasizing ETL pipelines and data visualization tools.

Participated in object-oriented analysis and database design, creating robust and scalable applications.

Optimized legacy system modernization efforts by integrating cloud-based solutions and APIs.
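
For context on the ETL work referenced above, a minimal sketch of an extract-transform-load step is shown below. The source file, column names, and target table are hypothetical, and SQLite stands in for the MySQL and warehouse targets used on the actual engagement.

```python
import csv
import sqlite3

def run_pipeline(source_csv: str, db_path: str) -> int:
    """Extract rows from a CSV export, normalize them, and load them into a table."""
    # Extract: read the raw export produced by the upstream system.
    with open(source_csv, newline="", encoding="utf-8") as fh:
        rows = list(csv.DictReader(fh))

    # Transform: trim whitespace, normalize case, and drop incomplete rows.
    cleaned = [
        {"customer_id": r["customer_id"].strip(),
         "region": r["region"].strip().upper(),
         "amount": float(r["amount"])}
        for r in rows
        if r.get("customer_id", "").strip() and r.get("amount", "").strip()
    ]

    # Load: bulk-insert the cleaned rows into the analytics table.
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS sales_facts (customer_id TEXT, region TEXT, amount REAL)"
    )
    conn.executemany(
        "INSERT INTO sales_facts VALUES (:customer_id, :region, :amount)", cleaned
    )
    conn.commit()
    conn.close()
    return len(cleaned)
```

The actual pipelines loaded into MySQL and fed Tableau and Power BI dashboards; the shape of the step (extract, validate/normalize, bulk load) is the same.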

Tata Consultancy Services (Client: USAA) – Plano, TX Nov 2019 – Oct 2021

Software Engineer

Project Summary: Managed and optimized a robust CI/CD pipeline leveraging Docker, Jenkins, and OpenShift to automate deployment workflows. Designed API gateways to streamline backend integrations, improving communication across distributed systems. Implemented Kafka messaging systems for asynchronous processing, increasing scalability. Developed user-friendly frontend interfaces using React.js and Material-UI, enhancing customer interaction and satisfaction. Contributed to PostgreSQL database optimization, leading to better query performance and overall system reliability.

Responsibilities:

Built and deployed RESTful API services using Spring Boot, improving data retrieval times by 30%.

Automated CI/CD pipelines with Docker, OpenShift, and Jenkins, increasing deployment efficiency by 40%.

Implemented Kafka messaging systems for asynchronous processing, enhancing scalability and fault tolerance.

Enhanced frontend features using React.js, Redux, and Material-UI, improving user interface responsiveness.

Optimized PostgreSQL databases, reducing query response times and improving overall performance.

Designed and maintained API gateways for seamless backend integration and improved communication.

Developed microservices architecture, ensuring modularity, fault isolation, and scalability.

Wrote comprehensive unit tests and integration tests using JUnit and Selenium, achieving 95% test coverage.

Improved monitoring and alerting capabilities with tools like Grafana, Prometheus, and the ELK stack.

Implemented asynchronous messaging systems using RabbitMQ and Kafka, enhancing system resilience.

Designed caching mechanisms with Redis, reducing database load and improving application performance (the cache-aside pattern is sketched after this list).

Participated in Agile Scrum workflows, ensuring timely delivery of high-quality features.

Assisted in cloud migration by deploying containerized applications to AWS ECS and Azure Kubernetes Service (AKS).

Conducted root cause analysis for production issues, implementing fixes to improve system uptime.

Leveraged Swagger/OpenAPI for API documentation, facilitating seamless third-party integrations.
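
The caching described above was built on the Java/Spring stack; purely as an illustration of the cache-aside pattern, a minimal Python sketch follows. The redis-py client is assumed to be available, and the key prefix, TTL, and loader callback are hypothetical.

```python
import json
import redis  # assumes the redis-py client library is installed

cache = redis.Redis(host="localhost", port=6379, decode_responses=True)
CACHE_TTL_SECONDS = 300  # hypothetical time-to-live

def get_customer_profile(customer_id: str, load_from_db) -> dict:
    """Cache-aside lookup: serve from Redis when possible, fall back to the database."""
    key = f"customer:{customer_id}"

    cached = cache.get(key)
    if cached is not None:
        # Cache hit: avoid a database round trip entirely.
        return json.loads(cached)

    # Cache miss: load from the system of record, then populate the cache
    # with a TTL so stale entries expire on their own.
    profile = load_from_db(customer_id)
    cache.setex(key, CACHE_TTL_SECONDS, json.dumps(profile))
    return profile
```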

Advanced Micro Devices (AMD) – Austin, TX Jan 2019 – Oct 2019

Software Engineer Co-Op

Project Summary: Played a pivotal role in streamlining manufacturing workflows by automating backend processes with Python, reducing manual effort by 40%. Built interactive dashboards using JavaScript and Chart.js for real-time performance monitoring. Designed Docker-based containerized environments to replicate production settings, enhancing testing accuracy and deployment consistency. Migrated PHP legacy applications to modernized Docker platforms, improving scalability and maintainability. Documented development processes extensively to support team onboarding and cross-functional collaborations.

Responsibilities:

Developed a PHP-based application to streamline manufacturing workflows, reducing process times by 30%.

Set up and managed GitLab repositories to improve team collaboration and version control.

Built interactive dashboards with JavaScript and Chart.js, enabling real-time performance tracking.

Automated backend processes with Python scripts, reducing manual intervention by 40%.

Designed and optimized database schemas in MySQL to handle large-scale data processing efficiently.

Conducted performance testing using JMeter, ensuring system reliability under peak loads.

Supported the setup of CI/CD pipelines using Jenkins for automated builds and deployments.

Assisted in the migration of legacy systems to Dockerized environments, improving scalability and maintainability.

Set up a multi-container development environment using Docker Compose, replicating production-like conditions for testing and deployment.

Used Linux and Docker to establish a persistent development environment with a static IP to support work on new JIRA tickets.

Automated the environment setup process using Docker Compose, reducing manual setup times by 50% (a minimal sketch of this kind of automation follows this list).

Integrated MySQL and PostgreSQL databases within the containerized environment, enabling seamless multi-database testing.

Created comprehensive documentation for the development environment, streamlining onboarding for new team members.
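
As a rough sketch of the environment-setup automation referenced in the list above, the script below brings up a Docker Compose stack and waits for its databases to accept connections. The service names and host ports are hypothetical, and it assumes the Docker CLI with the Compose plugin is on the PATH.

```python
import socket
import subprocess
import time

# Hypothetical host ports exposed by the compose file for the two databases.
SERVICES = {"mysql": 3306, "postgres": 5432}

def wait_for_port(port: int, timeout: float = 60.0) -> bool:
    """Poll a local TCP port until it accepts connections or the timeout expires."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with socket.create_connection(("127.0.0.1", port), timeout=2):
                return True
        except OSError:
            time.sleep(2)
    return False

def bring_up_environment() -> None:
    # Start the multi-container environment defined in docker-compose.yml.
    subprocess.run(["docker", "compose", "up", "-d"], check=True)

    # Block until both databases are reachable, so tests can run immediately.
    for name, port in SERVICES.items():
        if not wait_for_port(port):
            raise RuntimeError(f"{name} did not become ready on port {port}")
        print(f"{name} is accepting connections on port {port}")

if __name__ == "__main__":
    bring_up_environment()
```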

South Texas Archives – Kingsville, TX Jan 2018 – Dec 2018

Digital Archives Assistant

Project Summary: Spearheaded the digitization and preservation of archival records, leveraging Python scripting and database tools to streamline workflows and improve metadata accuracy. Developed and maintained a web-based archival system, integrating metadata standards and optimizing digital resources for public access.

Responsibilities:

Developed custom Python scripts to automate the digitization of physical records and streamline metadata generation, reducing manual effort by 50%.

Utilized FileMaker Pro for database management, creating efficient workflows for organizing, managing, and preserving digital collections.

Designed and implemented ETL pipelines for migrating data from legacy systems to modern digital platforms, ensuring data consistency and reliability.

Enhanced the digital preservation process by developing scripts for metadata extraction, validation, and compliance with national standards (Dublin Core, MODS).

Built and maintained a web-based archival management system using HTML, CSS, and JavaScript, improving accessibility and user experience.

Created automated backup and recovery workflows using Python and shell scripting to safeguard digital records and ensure system reliability.

Optimized metadata quality by designing automated metadata tagging scripts, ensuring consistency and standardization across digital assets.

Developed scripts for batch image processing (e.g., resizing, format conversion) using libraries such as Pillow and OpenCV to prepare archival records for online access.

Implemented version control for digitized assets using Git, improving traceability and collaboration on archival projects.

Monitored and maintained data integrity with periodic validation scripts, leveraging checksum algorithms to ensure digital assets remain unaltered (sketched after this list).

Collaborated on the creation of interactive dashboards using tools like Tableau and Power BI to visualize digitization progress and metadata quality metrics.

Provided technical training and documentation for staff on Python scripting, metadata standards, and database management workflows.
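
A minimal sketch of the checksum-based integrity validation referenced above follows. The JSON manifest layout and directory structure are hypothetical; SHA-256 is assumed as the checksum algorithm.

```python
import hashlib
import json
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def validate_assets(asset_dir: str, manifest_path: str) -> list[str]:
    """Compare each archived file against the checksum recorded in the manifest."""
    manifest = json.loads(Path(manifest_path).read_text())  # {"relative/path": "sha256", ...}
    problems = []
    for rel_path, expected in manifest.items():
        file_path = Path(asset_dir) / rel_path
        if not file_path.exists():
            problems.append(f"missing: {rel_path}")
        elif sha256_of(file_path) != expected:
            problems.append(f"checksum mismatch: {rel_path}")
    return problems
```

Run on a schedule, a report of missing files and mismatches is enough to flag assets that were altered or corrupted since digitization.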

Education:

Texas A&M University-Kingsville, Kingsville, TX, USA – Master’s in Computer Science (Software Engineering)

Punjab Technical University, Amritsar, Punjab, India – Bachelor’s in Computer Science & Engineering


