Saidulu Gundur

815-***-**** acu90a@r.postjobfree.com

Professional Summary

Over 10 years of professional IT experience in designing, developing, deploying and supporting large-scale distributed systems and applications using Java and J2EE, including experience with the Big Data Hadoop technology stack.

2+ years of experience working with Hadoop, HDFS, MapReduce, the YARN framework and Hadoop ecosystem tools such as Sqoop, Pig, Hive, HBase, Oozie, Flume, Cassandra, Impala and ZooKeeper.

8+ years as a software developer in Java application development, client/server applications, and internet/intranet-based database applications, covering development, testing and implementation on Windows and UNIX/Linux platforms using Core Java, Spring, Hibernate, JDBC, JSP, HTML, CSS, JavaScript, SQL/PL-SQL and Oracle databases.

Experience in setting up multi-node clusters with Cloudera Manager (CDH5) and configuring the Hadoop platform.

Experience in installing VMs, YARN, Hive, Pig, MapReduce and Sqoop in a multi-node environment.

Good experience in classic Hadoop administration and development and the YARN architecture, along with the various Hadoop daemons such as JobTracker, TaskTracker, NameNode, DataNode, ResourceManager, NodeManager and ApplicationMaster.

Hands-on experience with Hadoop components such as MapReduce, HDFS, Hive, Sqoop, Pig, HBase, Oozie and Flume.

Experience in writing Pig scripts and Hive queries for processing and analyzing large volumes of data.

Strong experience with Pig and Hive analytical functions, extending Hive and Pig core functionality by writing custom UDFs.

Good experience using the MapReduce programming model to analyze data stored in HDFS, and in writing MapReduce code in Java per business requirements (a minimal sketch follows this summary).

Experience in importing and exporting data with Sqoop between HDFS and relational database systems.

Strong experience using Pig and Hive for processing and analyzing large volumes of data.

Experience with NoSQL databases such as HBase and Cassandra.

Excellent understanding of job workflow scheduling and distributed coordination/locking services such as Oozie and ZooKeeper.

Excellent experience with MySQL and Oracle databases and writing complex queries.

Experience in Managing and reviewing Hadoop log files.

Experience in managing Hadoop Clusters using Cloudera Manager Tool.

Good knowledge of Microsoft Azure and HDInsight.

Knowledge of Apache Spark, Spark Streaming and DataFrames using the Scala language.

Experience with MVC architecture using the Spring and Struts frameworks.

Experience in Agile Scrum Methodologies and SDLC Methodologies.

Knowledge of AWS and DevOps tools (Git, Jenkins, SonarQube, Chef and Selenium).

Good expertise with development tools such as Eclipse and NetBeans.

Strong experience with object-oriented analysis and design and object-oriented programming concepts.

Worked with different software version control systems like CVS, SVN and Perforce.

Experience in Investment Banking and ERP Finance Domains.

A quick learner, well organized, with a keen interest in emerging technologies.

Good team player with a can-do attitude and the ability to communicate effectively with all levels of the organization, including technical staff, management and customers.
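
As referenced in the MapReduce item above, a minimal MapReduce sketch in Java of the kind of HDFS analysis summarized here; the EventCount class, the comma-separated input layout and the assumption that the first column holds an event type are illustrative only, not taken from any specific project below.

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class EventCount {

        // Mapper: emits (eventType, 1) for every input record.
        public static class EventMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
            private static final IntWritable ONE = new IntWritable(1);
            private final Text eventType = new Text();

            @Override
            protected void map(LongWritable key, Text value, Context context)
                    throws IOException, InterruptedException {
                String[] fields = value.toString().split(",");
                eventType.set(fields[0]);          // first column assumed to hold the event type
                context.write(eventType, ONE);
            }
        }

        // Reducer: sums the counts per event type.
        public static class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
            @Override
            protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                    throws IOException, InterruptedException {
                int sum = 0;
                for (IntWritable v : values) {
                    sum += v.get();
                }
                context.write(key, new IntWritable(sum));
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "event count");
            job.setJarByClass(EventCount.class);
            job.setMapperClass(EventMapper.class);
            job.setCombinerClass(SumReducer.class);   // combiner reuses the reducer logic
            job.setReducerClass(SumReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }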

Technical Skills

Languages:

Java, J2EE and PL/SQL.

Web Technologies:

JSP, Spring, Struts, Hibernate, CSS, JavaBeans, SOAP, REST and AWS (EC2, CloudFormation and EMR).

Big Data Eco System:

HDFS, MapReduce, Hive, Pig, Sqoop, HBase, Cassandra, Oozie, ZooKeeper, Impala, Flume, Spark and Scala.

Markup/Scripting Languages:

HTML, JavaScript, Shell scripting and XML.

Operating system:

Windows, Linux and Unix.

DBMS / RDBMS:

Oracle, SQL Server and MySQL.

IDE:

Eclipse and NetBeans.

Tools (Version Control):

SVN, CVS and Perforce

Frame Works:

Spring and Struts.

Hadoop Distributions:

Apache Hadoop, Cloudera CDH4 and Cloudera CDH5.

DevOps Tools:

Git, Jenkins, SonarQube, Chef and Selenium.

Professional Experience

CISCO, San Jose, CA.

Hadoop Lead (Admin/ Developer).

Jan 2015 - till date

ERP (Enterprise Resource Planning) Finance.

Cisco Systems, Inc. is the worldwide leader in networking for the Internet. Cisco currently runs its applications on Oracle Applications. Support operations are spread across the globe for the Oracle Financials modules (iProcurement, iExpenses, PO, AP, FA, GL and Customs), which cover events from purchase requisition through accounting in the GL module, along with interaction with the boundary systems (upstream/downstream applications). The solution was previously designed and developed in Oracle; at the client's request it was migrated to Hadoop with proper reporting. For this ERP Finance work we extracted data from Oracle and loaded it into HDFS using Sqoop and Oozie, transformed that data with MapReduce programming, implemented the data warehouse in MapReduce, Hive and Pig, and then integrated Hive with HBase for reporting and generated multiple reports per the client's requirements.

Roles and Responsibilities:

Involved in setting up a 64-node cluster with Cloudera Manager (CDH4) and configuring the Hadoop platform.

Involved in installing VMs, Hive, Pig, MapReduce and Sqoop in a multi-node environment.

Continuously monitored and managed the Hadoop cluster through Cloudera Manager.

Migrated data from Oracle into HDFS using Sqoop and imported flat files in various formats into HDFS.

Developed MapReduce programs to parse the raw data, populate staging tables and store the refined data in partitioned tables.

Developed MapReduce programs to extract and transform the data sets and results were exported back to RDBMS using Sqoop.

Developed chained MapReduce jobs as a framework for analyzing ERP Finance data (a driver-level sketch follows this list).

Loaded the dataset into Hive for ETL (Extract, Transform and Load) operations.

Worked on data extraction strategies using Sqoop.

Loaded unstructured data into Hive tables using regular expressions and the SerDe interface.

Worked on Hive Queries to categorize data of different ERP Finance Reports.

Integrated the HIVE warehouse with HBase for information sharing among teams.

Involved in loading data from UNIX file system to HDFS.

Experienced in managing and reviewing Hadoop log files.

Used Avro and Parquet as file storage formats to save disk space.

Applied optimization techniques to improve MapReduce performance.

Hive external tables were used for raw data and managed tables were used for intermediate processing.

Wrote user-defined functions (UDFs) for Pig and Hive.

Followed ATDD and TDD models; led the ERP Finance activities from onshore.
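
As referenced in the chained-jobs item above, a driver-level sketch in Java of how two MapReduce jobs can be chained so the second consumes the first job's output; the class name, job names and path layout are illustrative assumptions, and the base identity Mapper/Reducer classes stand in for the project-specific parse and aggregate classes.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class FinanceReportDriver {

        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            Path rawInput   = new Path(args[0]);
            Path stagingDir = new Path(args[1]);   // output of job 1, input of job 2
            Path reportDir  = new Path(args[2]);

            // Job 1: parse/clean the raw extracts into a staging form
            // (identity Mapper/Reducer used here as placeholders).
            Job parseJob = Job.getInstance(conf, "parse raw finance data");
            parseJob.setJarByClass(FinanceReportDriver.class);
            parseJob.setMapperClass(Mapper.class);
            parseJob.setReducerClass(Reducer.class);
            parseJob.setOutputKeyClass(LongWritable.class);
            parseJob.setOutputValueClass(Text.class);
            FileInputFormat.addInputPath(parseJob, rawInput);
            FileOutputFormat.setOutputPath(parseJob, stagingDir);
            if (!parseJob.waitForCompletion(true)) {
                System.exit(1);                    // stop the chain if the first job fails
            }

            // Job 2: aggregate the staged records into report rows,
            // again with placeholder identity classes.
            Job reportJob = Job.getInstance(conf, "aggregate finance report");
            reportJob.setJarByClass(FinanceReportDriver.class);
            reportJob.setMapperClass(Mapper.class);
            reportJob.setReducerClass(Reducer.class);
            reportJob.setOutputKeyClass(LongWritable.class);
            reportJob.setOutputValueClass(Text.class);
            FileInputFormat.addInputPath(reportJob, stagingDir);
            FileOutputFormat.setOutputPath(reportJob, reportDir);
            System.exit(reportJob.waitForCompletion(true) ? 0 : 1);
        }
    }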

Environment: Apache Hadoop, HDFS, MapReduce, Hive, Pig, Sqoop, HBase, Oozie, Java, PL/SQL and ZooKeeper.

CISCO, San Jose, CA.

Hadoop Developer/ Admin.

Jan 2014 - Dec 2014

IWE-MyExpenses

The My Expenses application provides Cisco employees with an at-a-glance summary of expense status information and the ability to create calendar-based expense reports for a trip or other purpose, all from the My View home page. It streamlines the most common expense tasks, providing a simplified and more intuitive user experience.

The application was previously designed in Oracle; at the client's request it was migrated to Hadoop with proper reporting. For MyExpenses we extracted data from Oracle and loaded it into HDFS using Sqoop, transformed that data with MapReduce programming, implemented the data warehouse in MapReduce, Hive and Pig, and then integrated Hive with HBase for reporting and generated multiple reports per the client's requirements.

Roles and Responsibilities:

Involved in setting up a 64-node cluster with Cloudera Manager (CDH4) and configuring the Hadoop platform.

Involved in installing VMs, YARN, Hive, Pig, MapReduce and Sqoop in a multi-node environment.

Migrated data from Oracle and log files into HDFS using Sqoop and imported flat files in various formats into HDFS.

Developed MapReduce programs to parse the raw data, populate staging tables and store the refined data in partitioned tables.

Developed MapReduce programs to extract and transform the data sets and results were exported back to RDBMS using Sqoop.

Worked on Hive Queries to categorize data of different MyExpenses Reports.

Integrated the HIVE warehouse with HBase for information sharing among teams.

Applied optimization techniques to improve MapReduce performance.

Hive External tables were used for raw data and managed tables were used for intermediate processing.

Wrote user-defined functions (UDFs) for Pig and Hive, as shown in the sketch below.
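
A minimal sketch of an old-style Hive UDF in Java; the class name and the category-normalization behavior are illustrative assumptions, not code from the project.

    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    // Minimal Hive UDF: trims and upper-cases an expense category value so
    // that report queries group on a consistent form of the category.
    public final class NormalizeCategory extends UDF {

        public Text evaluate(Text input) {
            if (input == null) {
                return null;          // pass NULLs through unchanged
            }
            return new Text(input.toString().trim().toUpperCase());
        }
    }

Once packaged into a JAR, such a UDF would be registered in Hive with ADD JAR and CREATE TEMPORARY FUNCTION before being used in queries.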

Environment: Apache Hadoop, HDFS, MapReduce, Hive, Pig, Sqoop, HBase, Oozie, Java, PL/SQL and ZooKeeper.

Verizon Data Services, India.

Java Module Lead

Jan 2011 – Dec 2013

Enterprise Solution Platform plus (ESP+).

The Enterprise Solution Platform plus is an enhancement of ESP and serves as the database of record for all managed services customer and network detail. It provides information on managed services data such as customer network configuration details, customer site contacts and locations, activity schedules and milestones. It interfaces with other systems such as Order Pro, Coms/Netpro and F&E. It is primarily used by Account Teams, Service Delivery and the Change Management group in their day-to-day operations.

Roles and Responsibilities:

Involved in various phases of the Software Development Life Cycle (SDLC) of the application, including requirements gathering, design, analysis and code development.

Prepared Design and Technical Specifications for ESP+.

Developed the entire application implementing the Spring MVC architecture integrated with the Spring JDBC framework (a controller-level sketch follows this list).

Responsible for the development of ESP+ and its enhancements.

Used the Log4j framework for logging debug, info and error data.

Used Oracle 10g Database for data persistence.

Designed and developed JavaBeans to handle business logic and store persistent data.

Responsible for unit, integration and UAT testing, and prepared JUnit test cases.

Involved in production activities and adhered to the quality process.
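
As referenced in the Spring MVC item above, a controller-level sketch in Java combining Spring MVC with Spring JDBC; the URL, table and column names are illustrative assumptions, not taken from ESP+.

    import java.util.List;
    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.jdbc.core.JdbcTemplate;
    import org.springframework.stereotype.Controller;
    import org.springframework.ui.Model;
    import org.springframework.web.bind.annotation.PathVariable;
    import org.springframework.web.bind.annotation.RequestMapping;

    @Controller
    public class SiteContactController {

        private final JdbcTemplate jdbcTemplate;

        @Autowired
        public SiteContactController(JdbcTemplate jdbcTemplate) {
            this.jdbcTemplate = jdbcTemplate;
        }

        // Looks up the contacts for one customer site and hands them to a JSP view.
        @RequestMapping("/sites/{siteId}/contacts")
        public String siteContacts(@PathVariable("siteId") long siteId, Model model) {
            List<String> contacts = jdbcTemplate.queryForList(
                    "SELECT contact_name FROM site_contact WHERE site_id = ?",
                    String.class, siteId);
            model.addAttribute("contacts", contacts);
            return "siteContacts";   // resolved to a JSP by the configured view resolver
        }
    }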

Environment: Java, Spring, Spring JDBC, JSP, CSS, JavaScript, Oracle, Eclipse, Apache Ant and WebLogic.

Bank of America, India.

Java Developer.

Nov 2009 - Dec 2010

FX Operations

Foreign Exchange Management Information System Operations supports the largest financial market: BOA trades currencies of different countries, with transactions done electronically. The foreign exchange market is global and is conducted over-the-counter (OTC) through electronic trading platforms or by telephone through trading desks. The OTC market is also known as the spot, cash, or off-exchange forex market.

Roles and Responsibilities:

Worked closely with the client on requirements gathering, design and development.

Prepared Design and Technical Specifications. Involved in Low-Level Design Documentation and database design.

Developed the modules based on the Struts MVC architecture (an illustrative action class follows this list).

Developed the UI using JavaScript, JSP, HTML and CSS for interactive, cross-browser functionality and complex user interfaces.

Used the Log4j framework for logging debug, info and error data.

Used Oracle Database for data persistence.

Designed and developed JavaBeans to handle business logic and store persistent data.

Responsible for unit, integration and UAT testing, and prepared JUnit test cases.

Involved in production activities and adhered to the quality process.
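
As referenced in the Struts item above, an illustrative Struts 1 action class in Java; the class name, request parameter and forward name are assumptions, and the hard-coded status stands in for a real service/DAO lookup against Oracle.

    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;
    import org.apache.struts.action.Action;
    import org.apache.struts.action.ActionForm;
    import org.apache.struts.action.ActionForward;
    import org.apache.struts.action.ActionMapping;

    public class TransactionStatusAction extends Action {

        @Override
        public ActionForward execute(ActionMapping mapping, ActionForm form,
                                     HttpServletRequest request, HttpServletResponse response)
                throws Exception {
            String transactionId = request.getParameter("transactionId");

            // In the real application this would delegate to a service/DAO layer
            // backed by Oracle; a fixed value keeps the sketch self-contained.
            request.setAttribute("transactionId", transactionId);
            request.setAttribute("status", "SETTLED");

            return mapping.findForward("success");   // forward defined in struts-config.xml
        }
    }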

Environment: Java, Spring, Hibernate, Oracle, Eclipse, Apache Ant, JavaScript, HTML, CSS, XML and WebLogic.

UBS, Singapore.

Java Developer.

June 2007 - June 2009

TAS Navigator:

UBS Trade Asset Services (TAS) Navigator is a proposed state-of-the-art web-based tool to help UBS support analysts. Through its sophisticated GUI, TAS Navigator allows support analysts to perform Level 0 and Level 1 tasks including, but not restricted to, monitoring and health checks, user requests and system incidents. TAS Navigator combines the essence of incident management and knowledge management. In the proposed flow, analysts in the TAS production support group would launch TAS Navigator from Remedy for user request and system incident resolution.

Roles and Responsibilities:

Onsite coordinator for requirements gathering for the applications; prepared application overview and knowledge transfer documents for the offshore team.

Prepared design and technical specifications for the TAS Navigator software requirements specification; worked closely with the client on requirements gathering.

Involved in Low-Level Design and database design.

Integrated TAS Navigator with Remedy configuration and workflow management.

Developed the modules based on the Struts MVC architecture.

Developed the UI using JavaScript, JSP, HTML and CSS for interactive, cross-browser functionality and complex user interfaces.

Used the Log4j framework for logging debug, info and error data.

Used Oracle Database for data persistence.

Designed and developed JavaBeans to handle business logic and store persistent data.

Responsible for unit, integration and UAT testing, and prepared JUnit test cases (a representative test sketch follows this list).

Involved in production activities and adhered to the quality process.
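
As referenced in the testing item above, a representative JUnit 4 sketch of the kind of unit test prepared for the support tool; TicketIdFormatter is a small illustrative helper defined inline, not a class from TAS Navigator.

    import static org.junit.Assert.assertEquals;

    import org.junit.Test;

    public class TicketIdFormatterTest {

        // Helper under test: normalizes Remedy-style ticket ids to a fixed-width form.
        static final class TicketIdFormatter {
            static String normalize(String rawId) {
                String digits = rawId.trim().replaceAll("[^0-9]", "");
                return String.format("INC%010d", Long.parseLong(digits));
            }
        }

        @Test
        public void padsShortIdsToTenDigits() {
            assertEquals("INC0000012345", TicketIdFormatter.normalize(" 12345 "));
        }

        @Test
        public void stripsNonNumericPrefixes() {
            assertEquals("INC0000012345", TicketIdFormatter.normalize("INC12345"));
        }
    }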

Environment: Java, Struts, JSP, CSS, JavaScript, Remedy, Oracle, Eclipse, Apache Ant and Tomcat.

UBS, Singapore.

Java Developer.

Sept 2005 - May 2007

Securities Settlement Engine (SSE).

The UBS Securities Settlement Engine is a post-execution securities processing system capable of handling a high volume of transactions across multiple currencies and legal entities. It currently handles confirmations, settlement and accounting for Principal, Proprietary and Agency business. Financial instruments handled include cash equities, convertible bonds, warrants, synthetic equity swaps and the foreign exchange positions resulting from these trades. SSE provides the main settlement system used by equities operations.

Roles and Responsibilities:

Development, production support and enhancements for the Equities, Settlements and Confirmations applications.

Involved in the complete software development life cycle (SDLC) of the application, from requirements analysis to testing.

Developed the modules based on the Struts MVC architecture.

Developed the UI using JavaScript, JSP, HTML and CSS for interactive, cross-browser functionality and complex user interfaces.

Created complex SQL queries and PL/SQL stored procedures and functions for the back end.

Prepared the Functional, Design and Test case specifications.

Involved in writing stored procedures in Oracle for database-side validations (a JDBC call sketch follows this list).

Wrote UNIX shell scripts and automated database scripts to simplify manual checks.

Involved in production activities and adhered to the quality process.

Provided technical support for production environments, resolving issues, analyzing defects, and providing and implementing defect fixes. Resolved high-priority defects per the schedule.
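
As referenced in the stored-procedure item above, a sketch in Java of invoking an Oracle stored procedure through JDBC; the connection string, credentials, procedure name and parameters are illustrative assumptions, not taken from SSE.

    import java.sql.CallableStatement;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Types;

    public class SettlementValidationClient {

        public static void main(String[] args) throws Exception {
            // Placeholder connection details; the Oracle JDBC driver must be on the classpath.
            String url = "jdbc:oracle:thin:@//dbhost:1521/SSEDB";

            try (Connection conn = DriverManager.getConnection(url, "sse_user", "secret");
                 CallableStatement call = conn.prepareCall("{call validate_settlement(?, ?)}")) {
                call.setLong(1, 1001L);                       // IN: trade id
                call.registerOutParameter(2, Types.VARCHAR);  // OUT: validation status
                call.execute();
                System.out.println("Validation status: " + call.getString(2));
            }
        }
    }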

Environment: Java, JSP, CSS, JavaScript, Struts, UNIX, Oracle, Eclipse, Autosys, Netcool, Mainframe and WebLogic.

Education and Credentials

MCA - Master of Computer Applications.

BCA - Bachelor of Computer Applications.

Certifications

CCDH - Cloudera Certified Developer for Apache Hadoop.


