Satyanarayana J
***************@*****.***
Sr. Java/J2EE/Big Data/NoSQL projects.
Objective:
To be associated with software development using the latest technologies, and to work in a challenging environment that enhances my professional skills and enables me to contribute to the growth of the organization.
Professional Summary:
12+ years of software life cycle experience in Application Software Design, Object-Oriented Design, Development, Documentation, Debugging and Implementation.
4+ years of experience as a Technical Lead for Java/J2EE projects, interacting with multiple teams and coordinating with project stakeholders.
4+ years of experience in Big Data technologies: Hadoop, HDFS, MapReduce, Hive, Sqoop, Cassandra and Elasticsearch.
Experience with the full software development life cycle (SDLC): collecting business specifications and user requirements, confirming design decisions regarding data, processes and interfaces, reviewing/auditing code, and documenting the entire life cycle.
Solid background in object-oriented analysis, design and design principles.
Good knowledge of data migration from lower to higher Cassandra versions (1.2.5 -> 2.0.9 -> 2.1.14).
Good knowledge of Cassandra dev environment setup with multiple two- or three-node clusters.
Specialized in J2SE, J2EE, Servlets, JSP, JSTL, custom tags, Struts, Spring, Spring Boot, EJB, JDBC, JNDI and Hibernate.
Experienced in developing web applications in various domains, including Telecom, Health Care, Banking, Retail and the Public Sector.
Strong background in developing software applications; experienced in building complex, sophisticated products on ambitious schedules.
Good knowledge of Oracle databases and excellent at writing SQL queries.
Developed web applications using IDEs such as WSAD 5.1, NetBeans 5.0, Eclipse 3.2 and MyEclipse 10.1.
Expert-level skills in designing and implementing web server solutions and deploying Java application servers such as WebSphere, WebLogic, Tomcat and JBoss on Unix/Windows platforms.
Good understanding of design patterns (Creational, Structural and Behavioral) such as Singleton, MVC, Session Façade, DAO and Factory.
Experience in optimization of SQL scripts used in stored procedures.
Very good understanding and working experience of Software Development Life Cycle (SDLC).
Strengths include being a good team player with excellent communication, interpersonal and analytical skills, flexibility in working with new technologies, and the ability to work effectively in a fast-paced, high-volume, deadline-driven environment.
Experience on various databases like Oracle and MySQL.
Working experience with version control tools: CVS, SVN, Git & SourceTree.
Proficient in XML technologies like XML, DTD, XSL, XSLT.
Experience on different operating systems like Windows 2000/XP/7 and UNIX.
Hands-on experience in the Agile-Scrum software development life cycle.
Expertise in defect tracking tools such as Bugzilla and Rally.
Good knowledge of the performance testing tool JMeter.
Good communication and writing skills.
Experience using messaging services such as JMS and TIBCO.
Developed system documents such as Functional Specifications, Technical Specifications and User Manuals.
Good knowledge of Java object model design, development and associations.
Certifications:
oSCJP 1.5 (Sun Certified Programmer for Java Platform).
oSCWCD (Oracle Certified Professional, Java EE 5 Web Component Developer).
Technical Skills:
Presentation Tier
Technologies
Client Scripting
HTML/HTML5, DHTML, CSS, JavaScript, AJAX-JSON.
Client JavaScript F/Ws
JQuery, JQueryUI.
Templating
Velocity.
Tools
Business Tier
Technologies
Software Processes
Waterfall, Agile, Test-Driven Development.
Server Side Scripting
JSP, Struts tag library, Spring tag library, JSTL tag library, custom tags
J2EE Web Frameworks
Struts, Struts-Tiles, Struts-Validator, Spring, Spring Boot
XML Technologies
XML, DTD, SCHEMA.
Java & XML
SAX, JAXP, DOM, JDOM.
Webservices Technologies
SOAP, RESTful, WSDL, UDDI, JSON, JAXB
Webservices Tools
SOAPUI, XML Spy, Postman
Webservices Implementations
Axis 1.0, Axis 2.1, JWSDP (Java Web Services Developer Pack) Jersey.
Webservices Specifications
JAX-RPC, JAX-WS, JAX-RS
AOP
AspectJ, Spring AOP
Design Patterns
Creational Patterns
Factory, Abstract Factory, Builder, Factory Method, Prototype, Singleton.
Structural Patterns
Adapter, Decorator, Façade, Composite, Bridge,
Behavioral Patterns
Iterator, Template, Chain of Responsibility, Strategy, Command, State
J2EE Patterns
MVC, Business Delegate, DAO, Front Controller, Service Locator, Transfer Object.
Loggers
Log4J
ORM Frameworks
Hibernate 2.0, 3.0, JDO, JPA, TopLink
Email API
JavaMail.
Batch Scripting
ANT, Maven & Gradle
Messaging service and Tools
JMS (TIBCO, ActiveMQ), GEMS, Kafka
UML Tools
MS-Visio, Rational Enterprise.
Tools
XML Tools
Altova XML Spy.
Bug Tracking Tools
Rational ClearQuest, JIRA & Rally
Database Tier
Technologies
Databases
Oracle 7/8i/9i/10g/11g, SQL Server 2000/2005, Sybase, MySQL 4.2, MS Access, DB2; Liquibase (schema migrations)
Tools
Query Browsers
MySQL Query Browser, Toad for Oracle, Toad for SQL Server, PL-SQL Developer.
Profiling Tools
SQL Server Profiler.
Platforms
Windows
Windows 98, Millennium, XP Professional, Windows Server 2000/2003.
Unix
HP-Unix 3000, Linux 7.2, 8.0
Environments
Programming Languages
J2SE 1.2, 1.3, 1.4, 1.5, 1.6,1.7 & 1.8
IDE’s
[Development, Comparison, Merge Tools]
STS 3.2, Eclipse 3.3, Jdeveloper 10.0, IntelliJ 4.0, 5.0, IBM WSAD, Netbeans, EditPlus.
Unix editors
vi.
Unix shells
Bourne, csh, ksh.
Webservers
Tomcat 5.0.28, 5.5, 7.0, 7.0.35 & 8.5.15
Application Servers
JBoss 4.2, WebLogic 7.1/8.1/9.1/10.0, WebSphere 5.0/6.0, Apache, Sun Application Server, OC4J.
Versioning Tools
SVN, CVS, Visual SourceSafe, Git & SourceTree
FTP Tools
FTP-32, WinSCP, Putty, Filezilla
Big Data/ Hadoop
Hadoop ecosystem
Hadoop (2.7), Hue (2.6), Hive (1.2), Cassandra (2.1.14), HBase (1.1), Beeline (1.2), Sqoop (1.4), Impala, HDFS & ZooKeeper
Distributions
Cloudera CDH, Apache Hadoop, HortonWorks(2.5.3.0-37)
VM
Oracle VirtualBox, Cloudera & DevOps
Testing
Technologies [WhiteBox Testing]
Testing APIs
JUnit, JMeter 2.10, Postman client, FitNesse & SoapUI
Code Coverage
Cobertura, EclEmma (JaCoCo) & SonarQube
Professional Experience
Verizon Wireless Aug 2017 – Till Date
Projects
EFHVA, DVT & File Compare
Client
Verizon Wireless
Role
Sr. Java/J2EE & Big Data Consultant
Duration
Aug 2017 – Till Date
Environment
Java 1.7 & 1.8, Code Cloud, HDP, HDFS, Hive, Sqoop, Ambari, Hue, DevOps, Hortonworks, Spring STS IDE, RESTful web services, Git repository, Jenkins, WinSCP, JSON scripts, Putty, Unix shell scripts, Maven build tool, JIRA, Agile methodology, JUnit & SonarQube, Hibernate, JPA, Spring MVC, Spring Boot, REST API & Swagger API for REST, microservices (Jenkins, Docker containers), NoSQL Cassandra cache
EFHVA, DVT & File Compare
For each failed project run, the microservice notifies the driver with a batch of all failures, and raises notifications when inputs arrive from other APIs, tracked through a JIRA ticketing mechanism. The Data Validation Tool is an in-house framework for automating manual validations: a dynamic data validation engine that objectively and efficiently measures the quality of data. The framework can be used in dev unit testing, QAT, UAT and production validations; its automated test case generation and execution significantly reduce testing time and capture potential business risk.
Responsibilities:
oAnalysis, design, implementation & maintenance of the Java applications for DVT, JIRA (EFHVA) and File Compare.
oDesign and development of middle layer using Spring framework
oDesign and development of rest API using Spring boot and Swagger UI for test APIs
oDesign and development of data access layer using Hibernate
oGenerated scripts for ingestion jobs to load data into HDP data lake systems.
oWrote a Java API to load data from HDFS to the local file system and validate data between files and Hive tables, matching the combination of row key and non-primary key columns against whole column values.
oWrote shell scripts to validate data between source files and Hive tables, and created external Hive tables to compare data between Hive tables and HDFS.
oCreated external Hive metadata tables holding source file information, used as input parameters to pull files from HDFS and Unix locations.
oSet up the Cassandra DEV environment for each project.
oInvolved in bulk uploads on Cassandra clusters to capture performance metrics.
oValidated application health using REST API endpoints provided by the Spring Boot Actuator.
oConfigured/created the Cassandra DEV servers and modified Cassandra configuration files such as cassandra.yaml to define JMX, Thrift and default ports, host names and cluster names.
oWrote CQL scripts to create Cassandra keyspaces and column families with different replication factors specific to each project.
oDesign and development of batch processing components using Spring Batch framework
oDesign and development of source code quality management rules using SONAR
oContinuous integration strategy implementation using Maven and Jenkins tools
oDesign and development of ORACLE Stored Procedures and functions
oTechnical specification document preparation and code review
oRequirements estimation using Agile poker planning technique
oDeveloped Java/J2EE applications that connect to Big Data sources, pulling data from source tables into HDP stage tables to store high volumes of data and support robust read/write operations without failure.
oDeveloped a centralized validation tool interface showing all comparison/validation metadata and generating reports for business users.
oConfigured the source tables to auto-populate comparison fallouts in the File Compare UI, so field-level and entire-row data differences can be validated.
oDeveloped a Java interface to pull data from different sources, such as SharePoint or Excel sheets available on the Unix box or elsewhere, and publish it into the user interface.
oDeveloped a Java interface to read data from HDFS using Kerberos authentication and Unix raw and mount paths, to validate data between flat files and Hive tables.
oCompared data between Hive tables by creating external Hive tables.
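The row-key comparison at the heart of the file-vs-Hive validation above can be sketched in plain Java. This is a minimal, illustrative sketch only: the class name, keys and values are invented, and the real tool reads its two sides from HDFS and Hive rather than from in-memory maps.

```java
import java.util.*;

// Compares a "source file" extract against a "Hive table" extract,
// both keyed by row key, and reports rows whose column values differ.
public class RowKeyDiff {

    public static List<String> diff(Map<String, List<String>> fileRows,
                                    Map<String, List<String>> hiveRows) {
        List<String> fallout = new ArrayList<>();
        Set<String> allKeys = new TreeSet<>(fileRows.keySet());
        allKeys.addAll(hiveRows.keySet());
        for (String key : allKeys) {
            List<String> left = fileRows.get(key);
            List<String> right = hiveRows.get(key);
            if (left == null || right == null || !left.equals(right)) {
                fallout.add(key); // missing on one side, or column mismatch
            }
        }
        return fallout;
    }

    public static void main(String[] args) {
        Map<String, List<String>> file = new HashMap<>();
        file.put("1001", Arrays.asList("ACTIVE", "2017-08-01"));
        file.put("1002", Arrays.asList("SUSPENDED", "2017-08-02"));
        Map<String, List<String>> hive = new HashMap<>();
        hive.put("1001", Arrays.asList("ACTIVE", "2017-08-01"));
        hive.put("1002", Arrays.asList("ACTIVE", "2017-08-02")); // mismatch
        System.out.println(diff(file, hive)); // [1002]
    }
}
```

The fallout list of row keys is what a File Compare UI would surface to users for field-level inspection.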
AT&T Aug 2014 – July 2017
Project
Device Status (IDS/DLC), EENPS, Profanity Filter, Black-flag, IEEUP & Myatt
Client
AT&T
Role
Sr. Java/J2EE & Big Data Consultant
Duration
Aug 2014 – July 2017
Environment
Java 1.7 & 1.8, Code Cloud, Apache Cassandra, HDFS, Hive, Sqoop & Odysseus API, DevCenter, CQLSH, DevOps, Oracle VirtualBox, Kepler Eclipse, Spring Tool Suite, RESTful web services, Git repository, Jenkins, SWM package, WinSCP, JSON scripts, Putty, Unix shell scripts, Maven build tool, JIRA, ActiveMQ, TIBCO server, SolAdmin, Agile methodology, JUnit & SonarQube, Kafka, Hibernate
Device Status (IDS/DLC), EENPS, Profanity Filter, IEEUP, Black-flag & Myatt
GRID API & Caches is a subset of the GRID Repository. This subset contains the APIs, In Memory and Persistent Caches that are delivered/exposed to the clients of the GRID Repository
Migrates all relational data to high-performance cache systems, providing a highly available, aggregated copy of AT&T data accessible 24x7x365. Intra- and inter-data-center redundancy, with replicated copies of data across multiple geographic locations, enables proximity access for clients. Data is available in multiple technologies so business functionality can use the best-suited option, with code/platform diversity to mitigate risk and a road map to cache data in memory for even faster retrieval through standardized data access interfaces. For example, InquireDeviceStatus retrieves wireless device status information: it currently indicates whether a wireless device has been blacklisted (i.e. reported stolen), and also reports if a previously blacklisted device has been unblacklisted. The IDS interface was re-implemented using the GRID L3 cache rather than the GRID L2 cache used by the previous IDS implementation. The Pricer work targets instant price quotes for products for wholesale customers: the basic pricing tables are stored by the Pricer system, imported into GRID from Pricer via API, and stored in GRID-AC in three column families as defined by the GRID-AC data model schema, with data populated into the Cassandra cache.
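The L2/L3 cache fallback described above can be illustrated with a simplified in-memory analogue. The real GRID caches are distributed systems; this sketch only shows the lookup-and-promote shape, and all names are invented.

```java
import java.util.HashMap;
import java.util.Map;

// Simplified two-level cache: check the fast in-memory (L2-style) cache
// first, fall back to the persistent (L3-style) store, and promote hits.
public class TwoLevelCache {
    private final Map<String, String> l2 = new HashMap<>();
    private final Map<String, String> l3 = new HashMap<>();

    public void putPersistent(String key, String value) {
        l3.put(key, value);
    }

    public String get(String key) {
        String v = l2.get(key);
        if (v != null) return v;        // L2 hit: fastest path
        v = l3.get(key);
        if (v != null) l2.put(key, v);  // promote into L2 for next time
        return v;
    }

    public boolean inMemory(String key) {
        return l2.containsKey(key);
    }

    public static void main(String[] args) {
        TwoLevelCache cache = new TwoLevelCache();
        cache.putPersistent("IMEI-357", "BLACKLISTED");
        System.out.println(cache.get("IMEI-357"));      // BLACKLISTED (from L3)
        System.out.println(cache.inMemory("IMEI-357")); // true (promoted to L2)
    }
}
```

Re-pointing a client from L2 to L3 changes which tier serves a miss, not the client-facing lookup contract.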
Responsibilities:
oAnalysis, design, implementation & maintenance of the Java applications for Device Status (IDS/DLC), EENPS, Profanity Filter, IEEUP, Black-flag & Myatt.
oDesign and development of middle layer using Spring framework
oDesign and development of data access layer using Hibernate
oDesign and development of batch processing components using Spring Batch framework
oDesign and development of source code quality management rules using SONAR
oContinuous integration strategy implementation using Maven and Jenkins tools
oDesign and development of ORACLE Stored Procedures and functions
oTechnical specification document preparation and code review
oRequirements estimation using Agile poker planning technique
oDeveloped Java/J2EE applications using Big Data and NoSQL technologies to store high volumes of data and support robust read/write operations without failure.
oDeveloped Java APIs to load data into multiple Cassandra nodes from source systems such as TLG and DTI feed data, DLC, Pricer-D, HDS-IMEI, etc.
oSet up the Cassandra DEV environment for each project/solution, including Cassandra version upgrades and data migration from lower to higher versions (1.2.5 -> 2.0.9 -> 2.1.14).
oInvolved in bulk uploads on Cassandra clusters to capture performance metrics.
oWrote code using Spring DAO and Hibernate to support several Myatt applications.
oConfigured/created the Cassandra DEV servers and modified Cassandra configuration files such as cassandra.yaml to define JMX, Thrift and default ports, host names and cluster names for each Myatt application.
oWrote CQL scripts to create Cassandra keyspaces and column families with different replication factors specific to each Myatt application.
oPerformed end-to-end code migration from the Kundera API to the Odysseus API, including development and testing.
oCreated a new project setup for EENPS using a Maven archetype and deployed it to Code Cloud as the central repository.
oWorked closely with the QC, ENV, PROD and testing teams to fix QC and production issues.
oWrote test APIs/clients for the testing team to validate all test scenarios across the DEV, QC and PROD environments. Wrote REST web services using Axis2 for the Myatt application.
oImplemented UI data-publishing calls for the Myatt application using the Spring MVC module, reading from the Cassandra DB.
oWrote shell scripts to start and stop the initial and delta listeners and to download ZooKeeper configuration files from SVN.
oCreated jar files through the SWM deployment process and deployed them to the dev server.
oPublished data model jar files to Maven central and shared them with client applications so they could reuse our schema in their APIs.
oInvolved in Maven build activities such as creating jars and including/excluding dependency jar versions.
oWorked on data migration.
oPrepared release notes and deployment instruction wiki pages for each QC and production release and deployment, and walked through them with different teams before production deployments.
oImporting and exporting data into HDFS and Hive using Sqoop
oUsed the Rally defect tracking tool exclusively to create and update all user stories and metrics before and after deployments, and updated the Myatt app station and interface tracker.
oInvolved in the code deployment process to the development server through the SWM deployment process.
oInvolved in the end-to-end code build and deployment process, and prepared QC instructions for code deployment to QC servers.
oInvolved in ZooKeeper configuration setup in SVN and ZooKeeper security; ran the zktool to refresh updated configuration against non-secure and secure Cassandra clusters to validate user-level authorization.
oInvolved in writing XML Schema Definitions (XSD), design and Java object mapping.
oUsed JAXB for marshalling and unmarshalling between Java and XML objects.
oImplemented code to start the initial and delta load listeners that consume Solace messages from the GDDN application and load data into Cassandra nodes.
oImplemented the Cassandra script for creating key-spaces and column families.
oWhite box testing was done by using JUnit API.
oWrote the code for bulk data loading into Cassandra nodes.
oDeveloped Java test clients for multiple applications, exercising all APIs with all kinds of CRUD operations on Cassandra nodes.
oWorked on bootstrap code to load configuration files for communication between the Java API and the database.
oWorked with the release team on hot fixes and updated release notes for every deployment.
oBuilt a POC for reading messages from Kafka using Spark Streaming.
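The CQL for creating a keyspace and column family with a per-project replication factor, as in the scripts mentioned above, has roughly the shape generated below. The keyspace, table and column names are invented for illustration and are not the actual schema.

```java
// Builds CQL of the kind used to create a Cassandra keyspace and a
// column family; names here are illustrative, not the real schema.
public class CqlScripts {

    public static String createKeyspace(String name, int replicationFactor) {
        return "CREATE KEYSPACE " + name
             + " WITH replication = {'class': 'SimpleStrategy',"
             + " 'replication_factor': " + replicationFactor + "};";
    }

    public static String createColumnFamily(String keyspace, String table) {
        return "CREATE TABLE " + keyspace + "." + table
             + " (row_key text PRIMARY KEY, status text, updated text);";
    }

    public static void main(String[] args) {
        System.out.println(createKeyspace("device_status", 3));
        System.out.println(createColumnFamily("device_status", "ids_cache"));
    }
}
```

In practice the replication factor (and SimpleStrategy versus NetworkTopologyStrategy) is chosen per application based on its redundancy needs.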
Epsilon Aug 2013 – July 2014
Project
Segmentation, CDS
Client
Epsilon
Role
Sr. Java & Big Data Consultant
Duration
Aug 2013 – July 2014
Environment
JDK 1.7, Tomcat 7.0, Spring, Hadoop, HBase, Cassandra, Cloudera CDH, Apache Hadoop, Hive, Hue, Impala, Oracle VirtualBox, Cloudera, STS 3.2, JPA, Hibernate 3.0, RESTful web services, Git, SourceTree, Jenkins, WinSCP, JSON scripts, Gradle build tool, JMeter 2.10, MySQL 5.6, Rally, ActiveMQ, TIBCO server, GEMS, Liquibase & Agile
CDS-Segmentation
Customer Data Store (CDS) is a platform intended to onboard Partners who will have capabilities to create, edit, upload Customer and Transaction related data. The proposed system will be an integrated enterprise system with capabilities to communicate with external Partner software systems as well as internal software systems.
CDS platform will be flexible for Partners to be able to choose to store their data in an RDBMS store like MYSQL or Big Data store like HBASE, Hive & Impala.
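The per-partner choice between an RDBMS store and a Big Data store reads naturally as a Strategy pattern. This is a hedged sketch of that idea only: the routing interface, store classes and partner names are invented, not the actual CDS design.

```java
import java.util.HashMap;
import java.util.Map;

// Strategy sketch: each partner is routed to the store type it chose.
public class StoreRouter {

    interface CustomerStore { String name(); }

    static class MySqlStore implements CustomerStore {
        public String name() { return "MYSQL"; }
    }

    static class HBaseStore implements CustomerStore {
        public String name() { return "HBASE"; }
    }

    private final Map<String, CustomerStore> byPartner = new HashMap<>();

    public void register(String partner, CustomerStore store) {
        byPartner.put(partner, store);
    }

    public String storeFor(String partner) {
        CustomerStore s = byPartner.get(partner);
        return s == null ? "UNKNOWN" : s.name();
    }

    public static void main(String[] args) {
        StoreRouter router = new StoreRouter();
        router.register("acme", new MySqlStore());
        router.register("bigretail", new HBaseStore());
        System.out.println(router.storeFor("acme"));      // MYSQL
        System.out.println(router.storeFor("bigretail")); // HBASE
    }
}
```

The point of the pattern is that onboarding a partner onto Hive or Impala instead of MySQL only requires registering a different store, not changing callers.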
Responsibilities:
oImplemented CDS API logic through Spring.
oImplemented Segmentation API logic through Spring.
oUsed RESTful web services for all CDS and Segmentation controllers.
oWhite-box testing was done using the JUnit API.
oInvolved in data setup in the HDFS file system for multiple clusters for file import and file export.
oSpring AOP was used to create a generic exception handling mechanism for CDS and Segmentation, as well as to handle annotation-based transactions for all APIs.
oMySQL calls were handled through Hibernate 3.0 over JPA.
oImplemented JSON scripts for testing the CDS & Segmentation APIs using the Postman REST client.
oWorked with the release team on hot fixes.
oUsed design patterns such as Singleton, DTO, DAO, Factory, Abstract Factory, Strategy and Builder.
oInvolved in configuration setup for non-secure and secure clusters to use Hive, HBase & Impala.
oImplemented JMeter scripts for testing all CDS & Segmentation APIs through JMeter and JSON scripts (REST client).
oInvolved in Oracle VM and Cloudera configuration for end-to-end VM setup.
oImplemented JMS service code to expose messages for all Segmentation and CDS jobs.
oImplemented Liquibase scripts for creating table metadata in MySQL.
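The JMS message exposure for the Segmentation and CDS jobs followed the usual producer/consumer shape. As an in-memory stand-in (real deployments used ActiveMQ/TIBCO; the event strings here are invented), a BlockingQueue shows the same shape without a broker:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// In-memory stand-in for a JMS queue: a producer thread publishes job
// events and the caller drains them, mirroring the ActiveMQ shape.
public class JobEventQueue {

    public static List<String> publishAndDrain(List<String> events)
            throws InterruptedException {
        BlockingQueue<String> queue =
                new ArrayBlockingQueue<>(Math.max(1, events.size()));
        Thread producer = new Thread(() -> {
            for (String e : events) {
                try {
                    queue.put(e);
                } catch (InterruptedException ex) {
                    Thread.currentThread().interrupt();
                    return;
                }
            }
        });
        producer.start();
        List<String> received = new ArrayList<>();
        for (int i = 0; i < events.size(); i++) {
            received.add(queue.take()); // blocks until the producer publishes
        }
        producer.join();
        return received;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(publishAndDrain(Arrays.asList(
                "segmentation-job:STARTED", "segmentation-job:COMPLETED")));
    }
}
```

A real JMS consumer would use a MessageListener on a broker-managed destination; the blocking take/put handshake is the same idea in miniature.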
Aeroxchange Apr 2012 – July 2013
Project
AeroComponent, FHOS Portal, AeroRepair, Punch-out
Client
Aeroxchange
Role
Sr. Java Developer.
Duration
Apr 2012 –July 2013
Environment
JDK 1.6, Tomcat 5.0.28/5.5, JSP, Servlets, Spring STS, Eclipse Juno, GWT, Vaadin, Spring 3.0 (DI, MVC, AOP), Struts 1.3.8, JAX-WS, JAX-RS (Jersey), JAXB, SOAPUI, Oracle 9i/11g, Ajax, JPA, Hibernate 3.0, CVS, ANT 1.6, Maven 3.0.6, Jenkins, WinSCP, HTML, DHTML, JavaScript, JUnit, Solaris 4.0, JIRA, Altova XML Spy, JQuery, JQueryUI, Mockito, Selenium, Agile & TDD
Description:
Aeroxchange Ltd. is a privately held corporation headquartered in Irving, Texas. The company was created by 13 global airlines as a means of creating value for all trading partners in the aviation supply chain. Their hosted internet site provides an open, neutral marketplace between suppliers and airlines to streamline the procurement and repair of technical and commercial assets, facilitate on-line quote and proposal activity, facilitate auctions and reverse auctions of surplus assets. In simple words they provide aviation supply chain solutions.
AeroComponent (Contract and Service Order Management)
AeroComponent offers a comprehensive suite of eCommerce solutions and tools for the Maintenance, Repair and Overhaul (MRO) sector of the commercial airline industry. These eCommerce solutions can be used by both customers and sellers of MRO services and are available in both web-based and fully integrated environments, and many combinations thereof.
The AeroComponent product is designed to manage pool programs wherein a pool provider creates a pool of parts and allows customers to access the pool in return for charges that are based on flight hours flown by the customer. Customers will exchange documents between their legacy systems and Aeroxchange using XML messages over the Internet.
AeroRepair manages the repair order lifecycle from start to end and serves as the communication medium between supplier parties submitting a component for repair and parties fixing it.
FHOS Portal: developed the login framework, sales and job functions, single sign-on, and authentication & authorization.
Punch-out: catalogs can be either hosted by the FHOS marketplace or managed by the supplier through punch-out capability.
Responsibilities:
oImplemented navigational logic through Struts MVC (Aerorepair)
oImplemented navigational logic through Spring MVC (Punch-out, FHOS).
oImplemented navigational logic through Vaadin (AeroComponent); GWT was used to incorporate client-side widgets.
oWhite box testing was done by using JUnit API.
oSpring AOP was used to create a generic exception handling mechanism for AeroComponent as well as to handle annotation based Transaction History of components.
oOracle calls were handled through Hibernate 3.0 over JPA.
oNamed Queries, cascading, component mapping, JPQL were some of the features also implemented via JPA.
oUsed Design patterns like Singleton, DTO, DAO, Factory, Abstract Factory, Template, Builder, Iterator, Chain of Responsibility and Façade.
oImplemented JAX-RS (Jersey implementation) services for the AeroComponent EDI components.
oUsed JAXB (xjc compiler) for marshalling and unmarshalling between Java and XML objects.
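JAXB handled the actual marshalling and unmarshalling in the project; to keep the sketch runnable without extra jars, the same XML round trip is shown here with the JDK's built-in DOM APIs instead. The repairOrder element and ids are invented for illustration, not the real AeroRepair message format.

```java
import java.io.ByteArrayInputStream;
import java.io.StringWriter;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;
import org.w3c.dom.Document;
import org.w3c.dom.Element;

// Java-object-to-XML round trip using the JDK's DOM APIs, standing in
// for the JAXB marshal/unmarshal cycle used in the real project.
public class RepairOrderXml {

    public static String marshal(String orderId, String status) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder().newDocument();
        Element root = doc.createElement("repairOrder");
        root.setAttribute("id", orderId);
        Element st = doc.createElement("status");
        st.setTextContent(status);
        root.appendChild(st);
        doc.appendChild(root);
        StringWriter out = new StringWriter();
        TransformerFactory.newInstance().newTransformer()
                .transform(new DOMSource(doc), new StreamResult(out));
        return out.toString();
    }

    public static String unmarshalStatus(String xml) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes("UTF-8")));
        return doc.getDocumentElement()
                  .getElementsByTagName("status").item(0).getTextContent();
    }

    public static void main(String[] args) throws Exception {
        String xml = marshal("RO-42", "IN_REPAIR");
        System.out.println(unmarshalStatus(xml)); // IN_REPAIR
    }
}
```

With JAXB the element-to-field mapping comes from xjc-generated annotated classes rather than hand-built DOM trees, but the marshal/unmarshal contract is the same.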
Citi Bank May 2010 –Feb 2012
Project
Validation Component, Logger component, Maker Checker Component, Scheduler component etc.,
Client
Citi Bank ( Banking and Finance Service)
Role
Sr. Java Developer cum lead
Duration
May 2010 –Feb 2012
Environment
Java 1.6, JSP, Servlets, Hibernate 3.0, Spring, EJB 3.0, JAXB, WebSphere 6.1, IRAD, SVN, Oracle, Toad, Ajax, JPA, JIRA, CVS, ANT 1.6, Jenkins, WinSCP, HTML, DHTML, JavaScript, JUnit.
Description:
Foundry is the core platform of Citi that provides a technical base for the development of the Low Value Payments platform globally. We develop components/APIs that help Java developers implement various generic functionalities. This platform helps Citi adopt a common approach, in terms of business logic and technology, to achieve generic functionality across Java applications.
Responsibilities:
oAnalyzing and understanding the requirements for developing the component/API
oImplemented navigational logic through Spring MVC via the use of Annotations
oCode development and deployment.
oCreated APIs/components for the Citi Foundry, consumed across Citi development, and published the API jars into Foundry.
oUsed JUnit to implement test classes to ensure quality through design, code reviews and testing.
oCreated server-side business objects using EJB 3.0.
oCreated entity objects using JPA (Java Persistence API) and wrote the configuration XML file to map the different persistence units representing different databases.
oCreated the client GUI pages using JSP, HTML and CSS.
oDeveloped business objects and mapped Java to back-end database calls using Hibernate.
oInvolved in the service layer implementation using Spring across the entire project.
oWhite box testing was done by using JUnit API
oSpring AOP was used to filter incoming client requests and to implement transaction management for JPA calls.
oNamed Queries, cascading, component mapping.
oUsed Design patterns like Singleton, DTO, DAO, Factory, Abstract Factory, Prototype, Template, MVC, Proxy and Façade.
oJAXB was used for marshalling and unmarshalling between Java and XML objects
oImplemented FitNesse scripts for testing APIs through the FitNesse tool.
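Two of the patterns used here, Singleton and Factory, can be sketched briefly. The registry and validator names are illustrative, not Foundry code.

```java
// Enum-based Singleton plus a small Factory, two of the patterns above.
public class Patterns {

    // Singleton: the enum guarantees a single, thread-safe instance.
    enum ConfigRegistry {
        INSTANCE;
        private final java.util.Map<String, String> values =
                new java.util.concurrent.ConcurrentHashMap<>();
        void put(String k, String v) { values.put(k, v); }
        String get(String k) { return values.get(k); }
    }

    // Factory: callers ask for a validator by name instead of using 'new'.
    public interface Validator { boolean isValid(String input); }

    public static Validator validatorFor(String type) {
        switch (type) {
            case "notEmpty": return s -> s != null && !s.isEmpty();
            case "numeric":  return s -> s != null && s.matches("\\d+");
            default: throw new IllegalArgumentException("unknown: " + type);
        }
    }

    public static void main(String[] args) {
        ConfigRegistry.INSTANCE.put("region", "NA");
        System.out.println(ConfigRegistry.INSTANCE.get("region"));  // NA
        System.out.println(validatorFor("numeric").isValid("123")); // true
    }
}
```

A shared platform like Foundry benefits from the Factory shape in particular: callers depend on the Validator interface, so new validation rules can be added without touching client code.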
Blockbuster Online May 2008 to Apr 2010
Project
FOH, BOH
Client
Blockbuster Inc.
Role
Sr. Java Developer.
Duration
May 2008 to Apr 2010
Environment
Java 1.5, Hibernate, Spring, CVS, WebLogic application server, Eclipse Galileo 3.5, Oracle 10g, JQuery, Rally, JAX-RPC, JUnit, SoapUI, SQL Developer, Toad, JMS.
Description:
Blockbuster Inc. is one of the largest online DVD rental portals for movies and games; apart from its online business, Blockbuster also runs many stores across the US. The site www.blockbuster.com provides many rental subscription options to registered users, including Total Access, By Mail, Gift Subscription, VIP and Combo. Registered users can also buy new DVDs as well as previously rented products from the site. The primary workflow is that online requests are served instantly, while the back-end processing is done by various batch programs on a daily or periodic basis.
Responsibilities:
oImplemented navigational logic through Struts MVC
oWhite box testing was done by using JUnit API.
oSpring AOP was used to create a generic exception handling mechanism for FOH.
oOracle calls were handled through Hibernate 3.0 over JPA.
oNamed Queries, cascading, component mapping.
oInvolved in build and deployment process end to end.
oCreated server side business objects using the EJB 3.0.
oCreated entity objects using JPA (Java Persistence API) and wrote the configuration XML file to map the different persistence units representing different databases.
oCreated the client GUI pages using the JSP, HTML and CSS.
oInvolved in project build and deployment activities, such as creating project branches and heads and performing merges in CVS as the version control system.
oImplemented AJAX calls in client GUI applications using JSP, HTML, CSS, Servlets and servlet filtering.
oDeveloped business objects and mapped Java to back-end database calls using Hibernate.
oInvolved in the service layer implementation using the Spring IoC and Spring MVC modules across the entire project.
oUsed the TopLink API (Oracle ORM tool) and Hibernate to map the Java API to the back-end Oracle server, persisting data and performing CRUD operations throughout the project.
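The CRUD operations above follow the standard DAO shape. As an in-memory stand-in (Hibernate/TopLink handled the real Oracle persistence, and the Rental entity and ids here are invented):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

// In-memory DAO illustrating the CRUD shape used against Oracle via
// Hibernate/TopLink; the Rental entity and ids here are made up.
public class RentalDao {

    public static class Rental {
        final String id;
        final String title;
        public Rental(String id, String title) { this.id = id; this.title = title; }
        public String getTitle() { return title; }
    }

    private final Map<String, Rental> table = new HashMap<>();

    public void create(Rental r) { table.put(r.id, r); }

    public Optional<Rental> read(String id) {
        return Optional.ofNullable(table.get(id));
    }

    public void update(Rental r) { table.put(r.id, r); }

    public boolean delete(String id) { return table.remove(id) != null; }

    public static void main(String[] args) {
        RentalDao dao = new RentalDao();
        dao.create(new Rental("dvd-1", "Casablanca"));
        System.out.println(dao.read("dvd-1").get().getTitle()); // Casablanca
        dao.update(new Rental("dvd-1", "Casablanca (Remastered)"));
        System.out.println(dao.delete("dvd-1")); // true
    }
}
```

Swapping the HashMap for a Hibernate Session or TopLink UnitOfWork changes the persistence mechanism without changing the DAO interface callers see, which is the point of the pattern.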
Amex Inc. Nov 2007 to Apr 2008
Project
Manage Your Card Account
Client
Amex Inc.( Banking and Finance Service)
Role
Java Developer.
Duration
Nov 2007 to Apr 2008
Environment
JDK 1.6, Eclipse 3.2.2, Apache, Struts, JIRA, Oracle 10g, Toad, JavaScript, CSS, JUnit & CVS
Description:
The Manage Your Card Account (MYCA) application is a collection of interactive applications that enable American Express customers to manage their American Express card accounts via the Internet self-servicing channel. MYCA leverages the majority of the Utilities to facilitate this self-servicing functionality. The MYCA applications allow card members to use a number of self-servicing functions online, such as Pay Bills Online, Transfer Balance from one card to another, Increase Credit Line on Cards and Activate Cards Online.
Responsibilities:
oImplemented navigational control through controller servlets, namely DispatchAction, IncludeAction and ForwardAction, which varied in their implementations.
oDeveloped beans (singleton and non-singleton) and were auto wired to the application.
oPerformed client side validation using JavaScript.
oServer-side validation was done through Struts tags, JSP scriptlets and expressions.
oUsed design patterns such as Singleton, Factory, DAO and Session Façade.
oThe framework used was MVC.
oWrote PL/SQL procedures in Oracle 9i to support the GUI.
oWhite box testing was done by using JUnit API.
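DispatchAction-style navigational control amounts to routing a request parameter to a handler method. A plain-Java sketch of that idea (Struts wires this through struts-config.xml; the action and card names here are illustrative):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// Sketch of DispatchAction-style routing: a "method" request parameter
// selects which handler runs, instead of one servlet per action.
public class CardAccountDispatcher {

    private final Map<String, Function<String, String>> handlers = new HashMap<>();

    public CardAccountDispatcher() {
        handlers.put("payBill",      card -> "paid bill for " + card);
        handlers.put("activateCard", card -> "activated " + card);
    }

    public String dispatch(String method, String cardId) {
        Function<String, String> h = handlers.get(method);
        return h == null ? "forward:error" : h.apply(cardId);
    }

    public static void main(String[] args) {
        CardAccountDispatcher d = new CardAccountDispatcher();
        System.out.println(d.dispatch("payBill", "card-77")); // paid bill for card-77
        System.out.println(d.dispatch("unknown", "card-77")); // forward:error
    }
}
```

In Struts the dispatch map is implicit: DispatchAction reflects on the parameter value to pick a method, and the error branch becomes a configured forward.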
EDS CORPORATION Jan 2007 to Oct 2007
Project
MCSP
Client
EDS Corporation Inc.
Role
Java Developer.
Duration
Jan 2007 to Oct 2007
Environment
Java/J2EE, Servlets, JSP, Struts, Oracle, Tomcat 5.0, Eclipse 3.0, Visionael NRM tool, SQL Developer, JavaScript, CSS, Putty, WinSCP & CVS, Hibernate, JPA
Description:
EDS is developing an OSS platform to service the customers signed up for the Managed Communication Services Platform (MCSP), and Visionael has been selected as the provisioning tool to be implemented.
Responsibilities:
oImplemented navigational control through Controller servlets namely