Nagendra
PHONE: 412-***-**** E-MAIL: NAGENDRA.V@SMARTLOGIX.NET
SUMMARY
Nagendra has extensive experience implementing and upgrading Information
Technology (IT) solutions for commercial organizations. He has over 6.5
years of experience and 5+ projects of progressive consulting in
development, along with architecture/design experience in the Hadoop
ecosystem, Java, and mainframes.
He possesses a unique combination of functional and technical knowledge in
writing MapReduce jobs and Pig scripts. He has extensive experience with
Apache Hadoop components such as HDFS, MapReduce, HiveQL, HBase, Pig, and
Sqoop, and with Big Data and Big Data analytics. He has good exposure to
developing applications involving Java, JSP, Servlets, XSLT, and
JavaScript, and experience enhancing legacy systems built in PL/1, COBOL,
DB2, and IMS.
He possesses outstanding communication and interpersonal skills, with a
proven ability to work effectively with business users, stakeholders, and
project teams to deliver quality results.
Technical skills
Big Data : Hadoop, MapReduce, HDFS, Hive, Pig, Sqoop, Oozie, HBase, Flume
Frameworks : Spring, Struts, JSF
Development Tools : Eclipse, RSA, ANT 1.7, Maven 2, Jenkins, Rapid Application Developer
Languages : Java, J2EE, C/C++, PL/1, COBOL
Modeling : Rational Rose, UML
Version Control : CVS, SVN
Other Tools and Utilities : Xpediter, RDZ, File-AID, SPUFI, SyncSort, Easytrieve, FC File Mover, FTP, Abend-AID, IBM utilities (IDCAMS)
Databases : Oracle, MS SQL Server, MySQL, SQL/PL-SQL, IMS, DB2
Professional Experience
Jan 2013-Present Hadoop Developer, State Farm Insurance, Bloomington, IL
State Farm aims to create a consistently remarkable experience for its
customers by better understanding evolving customer needs and providing a
simple, seamless Integrated Customer Platform (ICP) that gives customers
the ability to acquire and service all of their State Farm insurance and
financial services needs across all access points. As part of this effort,
large volumes of legacy data are analyzed using Hadoop.
. Gathered business requirements from business partners and subject
matter experts.
. Imported and exported data between HDFS and Hive using Sqoop.
. Managed and reviewed Hadoop log files.
. Supported MapReduce programs running on the cluster.
. Developed Pig Latin scripts.
. Wrote MapReduce jobs using the Java API.
. Imported data from RDBMS to HDFS using Sqoop on a regular basis.
. Developed scripts and batch jobs to schedule Hadoop programs using
Oozie.
. Held weekly meetings with technical collaborators and actively
participated in code review sessions with senior and junior developers.
. Worked within the Scrum methodology.
. Wrote Hive queries for data analysis to meet business requirements.
Sep 2010-Dec 2012 Java Developer, ACI World Wide Solutions, Newton, MA
Enterprise Banker offers a full range of online banking
functionality - from balance and transaction reporting, ACH
origination, wire transfer origination and reporting to remote
check deposit, bill presentment and payment, and cash
concentration. In addition, the solution provides an
infrastructure for easily integrating other products and
services offered by an institution.
. Participated in requirement analysis and design.
. Extracted use cases from business requirements and created class
diagrams, object interaction diagrams (sequence and process), and
activity diagrams.
. Involved in end-to-end development, debugging, and unit testing of the
product.
. Developed unit test cases and performed unit testing to verify
functionality.
. Wrote JSP pages and JavaScript validations.
. Developed the UI using XSLT, XML, and JavaScript.
. Worked on different releases and patch versions of the product.
. Developed action classes and JSP pages.
. Maintained the code repository and versioning using SVN.
. Logged errors and fixed bugs at different stages of the development
process.
. Actively involved in System Integration Testing (SIT) and
User Acceptance Testing (UAT).
Aug 2009-Aug 2010 Programmer Analyst, USAA, San Antonio, TX
IMS Database Sunset - replacement of the IMS database with DB2 to reduce
maintenance costs.
. Gathered requirements from the client.
. Created impact statements based on the requirements using the EA tool.
. Provided effort estimates.
. Created the high-level design.
. Created component specifications based on the high-level design.
. Developed application artifacts.
. Debugged code using tools such as Xpediter and RDZ.
. Held weekly project meetings to ensure smooth progress of the project.
. Coordinated with the offshore team in developing test case and test
result documents.
. Reviewed tasks to ensure the quality of work done by other teams.
. Prepared project documents and maintained them in Lotus Notes.
. Held daily status meetings with the offshore team to ensure timely
deliverables.
. Supported the testing team in fixing defects raised during system
testing.
. Supported the implementation team in deploying code to production.
. Coordinated with the client as needed to gather feedback on completed
work and to discuss any concerns.
Jul 2008-Jul 2009 Senior Developer, AIG Insurance, NJ
Unified Operator Excellence: a mainframe batch application that credits
staff for each unit of work completed in processing new business and
policy changes. These metrics are loaded into the Informatica tool to
predict future workload and to track individual performance and staffing.
. Gathered requirements from the client.
. Created the high-level design specification, logical and physical
interface documents, and call maps of the application.
. Prepared the impact analysis statement based on the requirements.
. Constructed and modified programs per requirements and checked them
against coding standards.
. Reviewed changes using checklist documents to ensure quality.
. Reviewed UTPs, code, and test results; prepared detailed test plans.
. Guided team members.
. Reviewed each phase of the documentation with the client.
. Coded PL/1-DB2 batch programs.
. Used debugging tools to resolve issues faced during development.
. Tracked code versions and changes through RMS (a revision management
tool).
. Carried out unit testing of new programs in the development region to
ensure they produced the desired results per client requirements.
. Ran the application programs using JCL.
. Executed tests at various levels, including integration, system, UAT,
and regression testing.
. Provided weekly status reports and weekly abend log reports to client
management.
. Provided pre- and post-implementation support for the project.
. Fixed defects raised during system and integration testing.
May 2007-Jun 2008 Software Developer, AIG Insurance, NJ
AUTO SOT-EOT - an online mainframe application for processing new
applications and policy changes. Incoming business is automated where
possible, and items requiring manual intervention are processed through
the AUTO SOT-EOT online application. New screens were created to enhance
the application.
. Implemented application programs and IMS DC screens for processing.
. Created component specifications based on the high-level design.
. Tested programs and jobs to ensure correct results.
. Prepared test case documents.
. Prepared test result documents.
. Reviewed tasks to ensure the quality of work done by other team
members.
. Documented the programs and analyzed their functionality.
. Coordinated with the client and onsite team as needed to gather
feedback on completed work and to discuss any concerns.
. Provided pre- and post-implementation support.
. Fixed system and performance defects.
Education
Jawaharlal Nehru Technological University, Computer Science Department,
India
Bachelor of Engineering
Certifications
. IBM 730 DB2 V9 fundamental certification
. IBM 733 DB2 V9 fundamental certification