
Java Developer

Location:
Jersey City, NJ
Posted:
November 30, 2023

Resume:

Name: Harshini Manthena

Full stack Java Developer

Email: ad1lk4@r.postjobfree.com

Phone: 732-***-****

Visa: GC

PROFESSIONAL SUMMARY:

Over 10 years of professional experience in the IT industry, with a focus on developing, implementing, and maintaining a variety of applications using Java, J2EE technologies, and Object-oriented methodologies. Skilled in enterprise technologies, frameworks, and design patterns.

Certified Scrum Master with expertise in delivering projects through Agile and Test-Driven Development (TDD).

Experienced with J2SE technologies like APIs, Threads, the Executor framework, CompletableFuture, Futures, Collections, and Exception Handling, and J2EE technologies like Servlets, Listeners, JSP, and the Java Security API.
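
For illustration, a minimal CompletableFuture/Executor sketch of the kind of J2SE concurrency usage listed above (the class name and values are invented for the example):

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class AsyncLookupExample {
    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(4);

        // Run two independent lookups in parallel and combine the results.
        CompletableFuture<String> user = CompletableFuture.supplyAsync(() -> "user-42", pool);
        CompletableFuture<String> account = CompletableFuture.supplyAsync(() -> "account-7", pool);

        String combined = user.thenCombine(account, (u, a) -> u + "/" + a)
                              .exceptionally(ex -> "fallback") // basic exception handling
                              .get();

        System.out.println(combined);
        pool.shutdown();
    }
}
```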

Strong development skills in Java, J2EE, JDBC, JSP, Servlets, EJB, JNDI, RMI, HTML, XML, XSL, JavaScript, Rational Rose, DB2, Oracle, and SQL Server.

Experience writing backend services using Node.js with frameworks like Express and the MongoDB database.

Hands-on experience with Cassandra database, including data modeling, query optimization, and ensuring high availability of data.

Expertise in the implementation of Core concepts of Java, J2EE Technologies, JSP, Servlets, JSF, JSTL, EJB transaction implementation, Spring, Hibernate, Java Beans, and JDBC.

Built streaming applications using Kafka APIs and the Kafka Streams API. Wrote Producer and Consumer APIs to publish and consume data from topics respectively.
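
A minimal sketch of the Producer/Consumer usage described above with the standard Kafka client API; the broker address, topic name, and group id are placeholders, not details from the actual projects:

```java
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;

public class KafkaPubSubSketch {
    public static void main(String[] args) {
        Properties producerProps = new Properties();
        producerProps.put("bootstrap.servers", "localhost:9092");       // placeholder broker
        producerProps.put("key.serializer", StringSerializer.class.getName());
        producerProps.put("value.serializer", StringSerializer.class.getName());

        // Publish one record to the topic.
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps)) {
            producer.send(new ProducerRecord<>("device-events", "key-1", "payload"));
        }

        Properties consumerProps = new Properties();
        consumerProps.put("bootstrap.servers", "localhost:9092");
        consumerProps.put("group.id", "device-events-reader");          // placeholder group id
        consumerProps.put("key.deserializer", StringDeserializer.class.getName());
        consumerProps.put("value.deserializer", StringDeserializer.class.getName());

        // Poll the same topic and print whatever arrives.
        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerProps)) {
            consumer.subscribe(Collections.singletonList("device-events"));
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
            records.forEach(r -> System.out.println(r.key() + " -> " + r.value()));
        }
    }
}
```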

Ample experience with package bundling and familiarity with tools like NPM and Bower as task runners. Used Karma, Jasmine, and Protractor for UI testing of Backbone JS and React JS applications.

Implemented a retry mechanism before sending messages to an error topic. Implemented multithreaded consumption for slow consumers. Implemented exactly-once semantics using Kafka.
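
One possible shape of the retry-then-error-topic flow mentioned above, sketched with the plain Kafka producer API; the topic names, retry limit, and processing step are assumptions for illustration only:

```java
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class RetryThenErrorTopic {
    private static final int MAX_ATTEMPTS = 3;            // assumed retry limit

    /** Try to process a payload; after repeated failures, forward it to an error topic. */
    static void handle(String payload, Producer<String, String> producer) {
        for (int attempt = 1; attempt <= MAX_ATTEMPTS; attempt++) {
            try {
                process(payload);                          // hypothetical business logic
                return;                                    // success: stop retrying
            } catch (Exception e) {
                if (attempt == MAX_ATTEMPTS) {
                    // Exhausted retries: publish to a dead-letter/error topic for later inspection.
                    producer.send(new ProducerRecord<>("orders-error", payload));
                }
            }
        }
    }

    private static void process(String payload) {
        // Placeholder for the real processing step.
    }
}
```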

Expertise in JDBC Connection Pooling, Persistence, Caching, EJB Server, HTTP, HTTP Tunneling

Good experience and knowledge in various development methodologies like Test Driven Development (TDD), Extreme Programming (XP), Scrum, and Agile.

Proficiency in front-end application development using Angular 2.0/4.0, React JS, and Ember JS to build dynamic user interfaces following the MVC architectural pattern.

Designed, deployed, and managed applications on Google Cloud Platform (GCP).

Utilized GCP services such as Compute Engine, Cloud Storage, and Cloud Functions to build scalable and resilient solutions.

Used Kotlin for implementing new modules in the application.

Have knowledge of Kotlin Android Extensions framework.

Over 3 years of hands-on experience in implementing Master Data Management (MDM) using TIBCO EBX, with a strong focus on building custom REST APIs within the EBX framework.

Proven proficiency in MDM foundational and data modeling concepts, translating complex business requirements into effective technical solutions.

Experience in working with containerization technologies like Docker, Kubernetes, and OpenShift.

Worked with containerization technologies, such as Docker, to package and deploy applications consistently across different environments.

Orchestrated containers using Kubernetes for efficient deployment, scaling, and management.

Experience in designing and architecting using UML-based diagrams through tools like Plant UML and Lucid charts.

Worked on customized front-end application development using jQuery, React JS, and Handlebar JS and implemented React JS using the Redux library and Flux pattern.

Extremely good in Spring Boot, Spring Framework, Hibernate, Angular 8, React JS, TypeScript, and JUnit frameworks.

Proficient in using Enterprise integration patterns (EIP) with Apache Camel such as Multicast, dynamic router, content-based router, splitter, and recipient list.
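
As an illustration of the Camel EIP usage above, a small content-based router in Camel's Java DSL; the endpoint URIs and header name are placeholders:

```java
import org.apache.camel.builder.RouteBuilder;

public class OrderRoutingRoute extends RouteBuilder {
    @Override
    public void configure() {
        // Content-based router: inspect a header and send the message to a matching queue.
        from("activemq:queue:orders")                      // placeholder input endpoint
            .choice()
                .when(header("orderType").isEqualTo("priority"))
                    .to("activemq:queue:orders.priority")
                .when(header("orderType").isEqualTo("bulk"))
                    .to("activemq:queue:orders.bulk")
                .otherwise()
                    .to("activemq:queue:orders.standard");
    }
}
```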

Extensively worked on Collections, Generics, Enumerations, Annotations.

Conducted in-depth analysis of system performance, identifying bottlenecks and areas for improvement.

Proficient in using performance monitoring and profiling tools to assess and optimize system performance.

Demonstrated ability to optimize code for improved efficiency, reducing response times and resource utilization

Have knowledge of Spring Cloud to develop Spring Boot-based Microservices interacting through REST.
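
For context, a stripped-down Spring Boot REST endpoint of the kind such microservices typically expose; the class, path, and payload are illustrative only:

```java
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

@SpringBootApplication
@RestController
public class DeviceServiceApplication {

    // Simple GET endpoint another microservice could call over REST.
    @GetMapping("/devices/{id}")
    public String deviceStatus(@PathVariable String id) {
        return "{\"id\":\"" + id + "\",\"status\":\"ONLINE\"}"; // placeholder payload
    }

    public static void main(String[] args) {
        SpringApplication.run(DeviceServiceApplication.class, args);
    }
}
```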

Expertise in Object Oriented Analysis and Design (OOAD), OOPS using Unified Modeling Language (UML), Design Patterns, and MVC Frameworks.

In-depth knowledge of Apache Subversion (SVN), Git & Bit Bucket, and Jenkins Continuous Integration Server – Installation, Configuration, Design and Administration, and integrating these tools with other systems.

Experienced in working with Amazon Web Services like EC2, S3, AWS CloudWatch, Dynamo, SQS, Lambda, and SNS.

Familiarity with big data processing frameworks like Apache Spark and Apache Flink.

Integrated Azure Cosmos DB seamlessly with other Azure services, such as Azure Functions, Azure Logic Apps, and Azure App Service, to create end-to-end solutions.

Familiar with data architecture including data ingestion pipeline design, Hadoop information architecture, data modeling and data mining, machine learning, and advanced data processing. Experience optimizing ETL workflows.

Proficient in Service Oriented Architecture (SOA), Experienced in the development and use of Web Services.

Proficient in using caching technologies like Hazelcast and Redis and integration of those into applications.

Worked on various J2EE applications on app servers such as WebLogic 10.3, WebSphere, JBoss Fuse 6.1, and Tomcat.

Experience in writing JUnit tests using Mockito, Power Mockito, and behavior-based tests using Spock and Cucumber.

Experience in front-end UI technologies like HTML5, CSS3, JQuery, JSON, AJAX, Node JS, AngularJS, BackboneJS, Bootstrap, Tag Libraries, and JSTL.

Expertise in creating single-page applications (SPA) and reusable components in Angular 6/Angular 10.

Experience in implementing e-commerce/distributed applications using HTML, HTML5, CSS, JavaScript, Java, J2EE, Servlets, JSP, Java Beans, JDBC, EJB, XML, XPATH, JAXB, JAXP, SQL, jQuery, Unix, Linux and Windows.

Designed and implemented XML schemas, Java APIs, business logic, and XML/JavaScript user interfaces.

Extensive experience with developing web and enterprise applications with development tools like Eclipse, IntelliJ, and WebLogic.

Extensive experience in developing unit testing frameworks using Junit and test-driven methodology.

Experience in building projects through Maven and ANT building systems.

Proficient in Core Java concepts like Multi-threading, Collections, and Exception Handling concepts.

Experience in version control tools like SVN, GitHub, and BitBucket.

TECHNICAL SKILLS:

Programming Languages: Java, J2EE, JDBC, Shell Scripting, Python, JavaScript, TypeScript, C, C++, jQuery, HTML5, DHTML

Frameworks/Libraries: Apache Camel, Spring, Spring Boot, Angular 4+, React JS, Apache Spark, Flask, Django, Bootstrap, Dozer, YARN, Express

Java Enterprise APIs: Servlets, JSP, JUnit, EJB, JNDI, JSON, JMS, JDBC, JavaMail, RMI, Web Services

Messaging Technologies: Apache Kafka, IBM MQ, RabbitMQ, ActiveMQ, IBM WebSphere MQ, JMS

System Design: Docker, Kubernetes, OpenShift, MVC, Spring, Spring Boot, Hibernate, CSS3, Microservices, Node.js, Reactive and Event-driven systems

Databases & Programming: MySQL, SQL, MongoDB, NoSQL, Oracle, SQL Server, IBM DB2, Cassandra, Stored Procedures, PostgreSQL, AWS DynamoDB, AWS Aurora

Software Engineering: UML, Design Patterns, Object-Oriented Methodologies, Service-Oriented Architecture, Test-Driven Development, Scrum, and Agile methodologies

XML Technologies: XML, DOM, SOAP, WSDL

Application Servers: Apache Tomcat, GlassFish, Jenkins, JBoss, WebLogic, IBM WebSphere, Apache Karaf, Eclipse, Maven

IDEs & Tools: Eclipse, IntelliJ, VS Code, WinSCP, PuTTY, Jenkins, ANT, Maven, Log4j, Splunk, Datadog, Grafana, WebSphere Studio Application Developer

PROFESSIONAL EXPERIENCE:

Client: Honeywell, Boston, MA Jan 2023 – Present

Job Title: Full stack Java Developer

Responsibilities:

Designed the decomposition of the S3 monolith by separating the encryption process of the storage system into a microservice handling more than 98% of traffic coming to S3.

Designed the requirements and implementation strategy using PlantUML and Gliffy and presented it for review with senior engineers and principal engineers.

Designed and implemented microservices APIs for mobile/web front-end and back-end edge points. Good knowledge of Microsoft Azure Cloud.

Demonstrated expertise in working with Azure Cosmos DB, Microsoft's globally distributed, multi-model database service.

Designed and implemented effective database models within Azure Cosmos DB, considering scalability, performance, and data distribution requirements.

Utilized Azure Cosmos DB's multi-model capabilities, including document, graph, and key-value data models, to address diverse data storage needs.

Implemented GraphQL APIs for efficient and flexible data querying.

Utilized SPARQL to query and manipulate data in RDF (Resource Description Framework) format for graph database systems.

Implemented and optimized global distribution features of Azure Cosmos DB to ensure low-latency access and high availability across multiple regions.

Designed multiple interfaces and adapters to integrate various APIs like GET, PUT, COPY, LIST, and DELETE with a new encryption module.

Developed robust RESTful web services in TypeScript, Node.js, and Angular in an agile environment with continuous integration using GitHub Actions.

Developed publisher and consumer services to pull messages from Kafka Topics.

Visual Studio Code was used as an IDE for development with PostgreSQL as the Database.

Wrote algorithms to serialize and deserialize encryption module output refactored out of S3.

Wrote checksum logic to increase durability guarantees of operations for all APIs using thread-safe logic without impacting performance of operations.

Extensively used Spring JDBC and Sequelize ORM template for performing Database Transactions.

Refactored various implementations of encryption using factory design patterns and used them to process object bytes coming to S3.
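
A hedged sketch of how a factory over encryption implementations might be shaped; the interface and class names are hypothetical and not the actual S3-internal types:

```java
/** Hypothetical abstraction over the different encryption strategies. */
interface Encryptor {
    byte[] encrypt(byte[] plainBytes);
}

class AesEncryptor implements Encryptor {
    public byte[] encrypt(byte[] plainBytes) { return plainBytes; /* placeholder */ }
}

class KmsEncryptor implements Encryptor {
    public byte[] encrypt(byte[] plainBytes) { return plainBytes; /* placeholder */ }
}

/** Factory: callers ask for an Encryptor by type and never touch the concrete classes. */
final class EncryptorFactory {
    static Encryptor forType(String type) {
        switch (type) {
            case "AES": return new AesEncryptor();
            case "KMS": return new KmsEncryptor();
            default: throw new IllegalArgumentException("Unknown encryption type: " + type);
        }
    }
}
```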

Implemented DevOps practices to streamline development workflows and improve collaboration between development and operations teams.

Proficient in using Terraform for infrastructure as code, automating the provisioning and management of resources on cloud platforms.

Experience in RDBMS such as Oracle and SQL Server, including writing stored procedures, triggers, and cursors, and optimizing queries using SQL.

Utilized decorator design pattern to wrap encryption module responses from encryption microservice.

Wrote database interaction code and used JDBC API to connect MySQL.

Wrote a blob processing module to process object blobs which included writing a new Iterator calculating range and part for GET API.

Led the development of architecture, design, and implementation strategies for various master data domains using TIBCO EBX.

Documented and promoted best practices for TIBCO EBX implementation, ensuring adherence across the engineering team.

Contributed to cross-team collaboration with other technology teams and architects to define and develop data architecture.

Analyzed CPU utilization, memory usage, network performance, garbage collection, and database parameters to assess and enhance application performance.

Provided technical assistance to improve system performance, capacity, reliability, and scalability, ensuring optimal functionality.

Conducted root cause analysis of performance issues, identifying key factors and recommending corrective actions for sustained improvements.

Oversaw the system performance lifecycle, identifying and establishing key metrics to measure and enhance performance improvements.

Evaluated system performances, providing insightful recommendations for enhancements and optimizations to meet performance goals.

Proficient in APM (Application Performance Management) tools such as Dynatrace and AppDynamics to monitor and optimize application performance

Involved in writing JSP custom tags and JSF components. Used the JSTL Core, Logic, Nested, Bean, and HTML tag libraries to create standard dynamic web pages.

Performed integration of the above two refactored modules, enabling them to take 2% of customer traffic. Used the Command and Strategy patterns to implement that integration.

Used log4j logger to log errors and info across the application with proper exception handling.

Wrote new unit test cases using Mockito and Power Mockito to improve coverage of classes to 97%.
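
A representative Mockito unit test in the style described above; the service and collaborator are invented purely for illustration:

```java
import static org.junit.Assert.assertEquals;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.verify;
import static org.mockito.Mockito.when;

import org.junit.Test;

public class EncryptionServiceTest {

    interface KeyProvider { String currentKeyId(); }           // hypothetical collaborator

    static class EncryptionService {                           // hypothetical class under test
        private final KeyProvider keys;
        EncryptionService(KeyProvider keys) { this.keys = keys; }
        String describe() { return "key=" + keys.currentKeyId(); }
    }

    @Test
    public void usesTheCurrentKeyId() {
        KeyProvider keys = mock(KeyProvider.class);
        when(keys.currentKeyId()).thenReturn("key-123");       // stub the dependency

        EncryptionService service = new EncryptionService(keys);

        assertEquals("key=key-123", service.describe());
        verify(keys).currentKeyId();                           // interaction check
    }
}
```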

Wrote new behavior-based end-to-end test cases in Spock and Cucumber to improve testing coverage from 69% to 93%.

Successfully utilized Agile methodology to deliver features resolving dependencies and using CI/CD pipelines.

Set up infrastructure to simulate customer traffic and workflows to perform performance testing for new implementation.

Automated frequent log dives by writing shell scripts based on log patterns.

Worked collaboratively with other developers in a team-based environment, utilizing version control tools to manage changes to the code base.

Environment: Java 1.8, TypeScript, Node.js, JDK, Log4j, J2EE, JDBC, Lombok, Spock, Cucumber, Mockito, PowerMockito, functional programming, design patterns, ANT build, IntelliJ, Git, SOA, JMS, SOAP, XML, Eclipse, RESTful Web Services, WebSphere, microservice architecture, awk, and shell scripts.

Client: Lincoln Financial Group, NC Jan 2022 – Dec 2022

Job Title: Full stack Java Developer

Responsibilities:

Worked on a Spring Boot application using Java 8 that aggregates data and metrics for 16 types of Echo devices.

Developed REST APIs with spring-based transactions to use the Oracle database to fetch device info and process those requests on EMR clusters.

Implemented MVVM architecture using Redux Architecture with React JS.

Developed a data ingestion application to bring data from the source system into HBase using Spark Streaming and Kafka.

Used multithreading concepts and Executor framework to manage thread pools to run 200-230 Amazon Athena queries to collect performance percentiles.
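
A simplified sketch of fanning queries out over a fixed thread pool with the Executor framework, in the spirit of the bullet above; the pool size and query runner are stand-ins, not the actual Athena client code:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ParallelQueryRunner {
    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(16);   // assumed pool size

        List<Callable<String>> tasks = new ArrayList<>();
        for (int i = 0; i < 200; i++) {
            final int queryId = i;
            tasks.add(() -> runQuery(queryId));                    // one task per query
        }

        // Submit everything, then collect results as the pool finishes them.
        List<Future<String>> results = pool.invokeAll(tasks);
        for (Future<String> f : results) {
            System.out.println(f.get());
        }
        pool.shutdown();
    }

    private static String runQuery(int id) {
        return "result-of-query-" + id;                            // placeholder for the real call
    }
}
```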

Created JUnit test cases using Mockito to provide 98% test coverage and used Sonar to identify bugs and check style issues.

Configured Windows Failover Cluster by creating a quorum for file sharing in Azure Cloud.

Configured and managed consistency models in Azure Cosmos DB based on application requirements, balancing between strong and eventual consistency.

Created comprehensive documentation for Azure Cosmos DB configurations, best practices, and guidelines to facilitate collaboration and knowledge transfer within the team.

Involved in the design and implementation of JSP, Servlets, and web development.

Involved in the Parsing of internal XML format documents to retrieve the information and pass them to the Struts Action class for further processing.

Developed various screens for the front end using React JS and used various predefined components from NPM and Redux.

Developed single-page applications using a React Redux architecture, ES6, webpack, and Grunt. Performed real-time event processing of data from multiple servers in the organization using Apache Storm.

Worked closely with applications using React JS and Node.js libraries (NPM, gulp) to generate the desired views, and Flux to route the URLs properly.

Expertise in Microsoft Azure Cloud Services (PaaS & IaaS ), Application Insights, Document DB, Internet of Things (IoT), Azure Monitoring, Key Vault, Visual Studio Online (VSO) and SQL Azure.

Worked on some legacy web services built on Apache-CXF running on Apache Tomcat.

Used JAXB to process JSON-based responses from RESTful Web Services external to application to collect driver metrics.

Used Spring JDBC template and hibernate for performing Database Transactions.

Worked on building the front end using React JS, HTML, and CSS that helps parse device logs and generates insights for memory and CPU logs that are uploaded through a web portal.

Used AWS Lambda, AWS CloudWatch, and AWS SQS to create an email notification system that sends an email with log analysis details whenever a certain CloudWatch event alarms.

Used Node.JS to develop comments repository which automatically triggers test runs interacting with MongoDB.

Used the Ant tool to build the application and WebSphere Application Server (WAS 6.0) to deploy it.

Maintained schedules for Data warehouse storage. Read and interpreted UNIX logs.

Experience with Performance Tuning for Oracle RDBMS using Explain Plan and HINTS.

Identified bottlenecks across various layers of application stacks, implementing solutions to optimize and streamline system performance.

Applied experience in using code profilers and analyzing profiling snapshots to pinpoint areas for improvement in application code.

Demonstrated knowledge of application server and database middleware, including tuning strategies to enhance overall system performance.

Experience dealing directly with equity products and developing in Java or Scala.

Created AWS IAM roles and policies to create permissions for AWS resources.

Used AWS S3 to store compressed log files and insights pdf generated for future reference. Generated pre-signed URLs to access those.
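
Generating a pre-signed GET URL with the AWS SDK for Java (v1) looks roughly like this; the bucket name, key, and expiration window are placeholders:

```java
import java.net.URL;
import java.util.Date;
import com.amazonaws.HttpMethod;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.GeneratePresignedUrlRequest;

public class PresignedUrlExample {
    public static void main(String[] args) {
        AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

        // URL granting temporary GET access to one object, expiring in 15 minutes.
        Date expiration = new Date(System.currentTimeMillis() + 15 * 60 * 1000);
        GeneratePresignedUrlRequest request =
                new GeneratePresignedUrlRequest("device-insights-bucket", "reports/insights.pdf")
                        .withMethod(HttpMethod.GET)
                        .withExpiration(expiration);

        URL url = s3.generatePresignedUrl(request);
        System.out.println(url);
    }
}
```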

Created SQL queries, indexes, triggers, and sequences for AWS Athena to fetch results.

Created Apache Spark jobs that run on cluster of Linux machines to stream performance logs into AWS Athena.

Used CI/CD pipelines to build and release features and update libraries to minimize security risks.

Maintained and optimized the performance of web applications, identifying and addressing bottlenecks as necessary.

Environment: Java, Spring Boot, J2EE (JEE), Apache Spark, AWS Athena, AWS Lambda, Web Services, AWS SQS, React, JavaScript, Git, REST, JAXB, AWS S3, JMS, VS Code, AWS IAM, WebSphere, Eclipse, AWS CloudWatch, Sonar, Mockito, PowerMockito, Node.js, Linux

Client: Prime Therapeutics, MN Sep 2018 – Dec 2021

Job Title: Java/J2EE Developer

Responsibilities:

Developed a data ingestion application to load real-time data coming from locomotives into Oracle 10g database using Apache Camel framework with Spring Boot Integration.

Developed a web application reporting locomotive status using JSF, Hibernate, and J2EE technologies to access an Oracle RDBMS 13g RAC database in a multi-developer, configuration-controlled environment.

Developed a complex web application (Confidential) utilizing a wide range of open-source and Oracle technologies: Oracle RDBMS, Spring MVC, J2EE/JEE, Servlets, JSP, JSF, JavaScript, Hibernate, WebLogic, WebCenter, and ADF.

Developed three microservices using Spring boot to categorize three sets of request type processing based on data source and request structure. All microservices exposed REST endpoints.

The middleware interaction used the JMS/IBM WebSphere MQ series for transferring messages between different components using the JMS/Mail API framework.

The data rate was 4 million messages per day serving 50-52 requests per sec using IBM MQ and event-driven architecture.

Communicated with external applications via JMS messages using IBM WebSphere MQ.

Utilized JBOSS IDEs for application server environments that included JBOSS AS > 5.0 and JBOSS EAP.

Used Service-Oriented architecture (SOA) principle and OSGI to expose services for decoding, transforming, and storing according to data source, type, and use cases.

Used Scala collection framework to store and process the complex consumer information

Used Enterprise Integration Patterns like multicast, content-based-router, dynamic router, recipient list, and splitter to transform messages.

Extensive experience as a Production Support personnel in various multiple Data warehouse Projects and extensively worked with offshore teams.

Set up projects using MyEclipse and IntelliJ, and servers like Tomcat and JBoss.

Deployed all OSGI web services using FUSE platform 6.1 (Apache Camel, Blueprint, IBM MQ, Karaf/OSGi container)

Experience in utilizing and implementing Confluent Schema Registry with Kafka.

Implemented Spring Boot microservices to process messages into the Kafka cluster setup.

Used caching mechanisms like Hazelcast to cache intermediate state, store results, and reduce reads on the database. Explored differences between Hazelcast and Redis before designing the application.
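
A minimal Hazelcast sketch of the intermediate-state caching described above, assuming a recent (4.x+) Hazelcast client; the map name, key, and TTL are illustrative:

```java
import java.util.concurrent.TimeUnit;
import com.hazelcast.core.Hazelcast;
import com.hazelcast.core.HazelcastInstance;
import com.hazelcast.map.IMap;

public class IntermediateStateCache {
    public static void main(String[] args) {
        HazelcastInstance hz = Hazelcast.newHazelcastInstance();

        // Distributed map used as a cache for intermediate decode results.
        IMap<String, String> cache = hz.getMap("decoded-messages");      // illustrative map name
        cache.put("loco-1042", "decoded-payload", 10, TimeUnit.MINUTES); // entry with a TTL

        // Later readers hit the cache first and only fall back to the database on a miss.
        String cached = cache.get("loco-1042");
        System.out.println(cached);

        hz.shutdown();
    }
}
```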

Developed a Spring Boot application to collect a stream of real-time panel images from locomotives and display them on the web portal with a refresh rate of 3 sec. The system supported around 2k locomotives at a time.

Eclipse was used as an IDE for development with Apache Tomcat as the server

XML was used to create the db schema-mapping file for Hibernate.

Developed JSPs including AJAX that call different APIs that process messages using XML.

Reviewed code and documented designs to drive ETL processes using RedHat technologies for real-time data ingestion.

Used Log4j for logging, JUnit 4 for unit tests, and JDK 8 for development.

Used JDBC to retrieve data from the Oracle database.

Educated both engineering teams and business stakeholders on TIBCO EBX software capabilities and limitations, fostering a collaborative understanding.

Collaborated closely with the EBX Architect to define technical architecture and data integration patterns for seamless EBX implementation.

Wrote multiple data ingestion queries and used indexes, and stored procedures to facilitate the data ingestion process as a configuration.

Published Assortments and Products to an Apache Kafka Topic by using custom Serializers. Exposed the endpoint for Swagger and developed APIs for documenting RESTful Web services.

Maintained SQL queries in configuration files to perform database transactions during the data ingestion process.

Utilized big data technologies such as Hadoop, MapReduce frameworks, MongoDB, Hive, Oozie, Flume, Sqoop, and Talend.

Used HAWTIO dashboards to monitor logs and Karaf container's health.

Used Sonar, and Jenkins CI/CD to drive agility and quality in the development process.

Used Splunk dashboards to write queries and detect anomalies from logs to drive operation excellence.

Designed and implemented business logic for tiered applications, incorporating JSF (1.2 – 3.0), EJB (3.0 - 3.2), and JSP (2.0 - 2.3).

Environment: Java EE 1.8/1.7, Spring, Spring Boot, JBoss Fuse 6.1, Karaf 2.2, Apache Camel, ActiveMQ, SQL, Oracle 11g, Maven, Jenkins, Sonar, Git, Blueprint, Hazelcast, Redis, JBoss Developer Studio.

Client: Doordash, San Francisco, CA Aug 2016 – Aug 2018

Job Title: Java Developer

Responsibilities:

Wrote complex SQL queries, Stored Procedures, and Functions in PL/SQL for manipulating the data.

Worked on Lambda Expressions, Functional Interfaces, Stream APIs, Time APIs, and improvements to Collections, Concurrency, and IO introduced in Java 8.

Used Subversion for configuration Management and Jira for task management and bug tracking.

Used SOAPUI to test for sending and receiving XML data and worked with JMS Queues for sending messages in point-to-point mode communication.

Used multithreading for writing the collector, parser, and distributor processes, which received real-time data from the Zacks API in JSON format; multithreading significantly improved performance, and using the collections from the concurrency package made it thread-safe.
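
A rough sketch of the collector/parser hand-off using a thread pool and a thread-safe queue from java.util.concurrent; the thread counts and fabricated payloads are assumptions for illustration:

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;

public class CollectorParserPipeline {
    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<String> rawJson = new LinkedBlockingQueue<>();   // thread-safe hand-off
        ExecutorService workers = Executors.newFixedThreadPool(4);

        // Collector task: pulls JSON payloads (here, fabricated ones) and queues them.
        workers.submit(() -> {
            for (int i = 0; i < 10; i++) {
                rawJson.offer("{\"tick\":" + i + "}");
            }
        });

        // Parser tasks: drain the queue concurrently and "distribute" the result.
        for (int p = 0; p < 3; p++) {
            workers.submit(() -> {
                try {
                    String payload;
                    while ((payload = rawJson.poll(1, TimeUnit.SECONDS)) != null) {
                        System.out.println(Thread.currentThread().getName() + " parsed " + payload);
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });
        }

        workers.shutdown();
        workers.awaitTermination(5, TimeUnit.SECONDS);
    }
}
```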

Used MessageBodyWriter for converting Java types to streams.

Used Maven for compiling and building the code.

Used JavaScript, HTML, and JSP pages for developing front-end UI and wrote application-level code to perform client-side validation.

Used HTML5 wireframes with CSS provided by the design team; JavaScript was used to make the pages dynamic.

Used AJAX and JavaScript for Client-side validations.

Provided 24x7 support to the application in the pilot and production phases. Support included being on conference calls, identifying and fixing bugs, and investigating reasons for specific application behavior.

Performed unit testing using the JUNIT framework and tested DAOs and Business Services

Migrated technology from Angular 1.0 to Angular 2.0 to use upgraded features such as Angular Components and Angular Routers as per the strategy requirement.

Developed Servlets for server-side transactions and made use of AJAX for server-side processing without refreshing the JSP page.

Experience in generating Reports and Dashboards on Dynatrace and Splunk

Experience in implementing MongoDB CRUD (Create, Read, Update, Delete) operations using the Mongoose library in Node.js, along with AngularJS.

Extensive professional experience in Developing and Deploying enterprise applications on web/application servers such as JBoss EAP 5.1, Tomcat 5.x/4.x, IBM WebSphere 6.x/7.x, Web Logic under Windows OS and UNIX

Extensively used Jenkins as a Continuous Integration tool to deploy the Spring Boot with Microservices to Pivotal Cloud Foundry (PCF) using build pack.

Implemented AngularJS Controllers to maintain each view data. Implemented Angular service calls using Angular Factory with Dependency Injection to prevent scope conflict commonly found with JavaScript.

Implemented lightweight WADL (Web application description Language) for better understanding of Rest-based web services and their configuration.

Implemented multi-threaded synchronization processes, with JMS queues for the consumption of Asynchronous requests.

Involved in bug fixing during the System testing, Joint System testing, and User acceptance testing. Deploying the applications and binding third-party services like AppDynamics on Pivotal Cloud Foundry (PCF).

Developed the application using Spring JPA and Angular 2.0 on the presentation layer; the business layer was built using Spring and the persistence layer uses Hibernate.

Developed and implemented Restful Web APIs, and exposed endpoints using HTTP methods like GET, PUT, POST, and DELETE.

Designed new classes and functionalities using various JQUERY components for CRM applications for customer service.

Deployed our application on Pivotal Cloud Foundry (PCF) which is used to reduce the development overhead by providing a ready-to-use platform.

Created Web User Interface (UI) using HTML5, DHTML, table-less XHTML, CSS3 and Java Script that follows W3C Web Standards and are browser compatible.

Configured Bamboo to handle application deployment on Cloud (PCF) and to integrate with GitHub version control.

Built security and authentication for the application using the Java Security API.

Environment: AngularJS, HTML5, CSS3, AJAX, Bootstrap, JSON, XML, ActiveMQ, JMS, Hibernate, DB2, SOAP-AXIS2, RESTful services, JAX-RS, SOA, Eclipse Java EE IDE Neon.3, Git, Log4j, Maven, TestNG, WADL, PCF.

Client: Sparsh Technologies, India June 2013 – Feb 2016

Job Title: Associate Software Engineer

Responsibilities:

Developed UI screens using JSP, HTML, CSS, and JavaScript.

Worked on JSP, Servlets, Struts framework, and production support issues of the existing applications.

Developed Action Forms, Action Servlets, and Actions, and validated Action Forms in the Struts framework.

Implemented Struts Dispatch Action class and form bean classes using struts framework.

Client & server validations were handled using JavaScript & Struts validate plug-in.

Worked on the JAVA Collections API for handling the data objects between the business layers and the front end.

Implemented Multithreading for handling multiple requests and for high performance.

Created many stored procedures and scheduled jobs to support our applications to create reports for customers.

Built and maintained SQL scripts, indexes, and complex queries for data analysis and extraction.

Developed a business continuity plan for the SQL Server databases using JDBC drivers.

Used Spring for dependency injection, plugging in the Hibernate DAO objects for the business layer.

Created an XML configuration file for Hibernate to map to SQL DB.

Developed web services for sending and getting data from different applications.

Used JDBC to access the Oracle database for accessing customer information.

Used the ANT build tool for compiling and generating WAR files.

Environment: Core Java, HTML, CSS, J2EE, JSP, JavaScript, Servlets, JMS, Hibernate, JDBC, SQL, DAO, Web Services, Oracle, ANT.


