
Rohith Nama

************@*****.***

423-***-****

www.linkedin.com/in/rohithnama

Professional Summary:

Certified IT professional with real-time as well as research experience in application development and big data analytics, using languages and tools such as Java, Scala, and the Hadoop/Spark ecosystem, along with SAS and R.

Hands-on experience in developing and deploying enterprise applications using major components of the Hadoop ecosystem, including Hadoop 2.x, YARN, Hive, Pig, MapReduce, HBase, Flume, Sqoop, Spark, Storm, Kafka, Oozie, and ZooKeeper.

Developed statistical data models and conducted research on data-driven interventions for dementia caregivers using SAS and R.

Contributed extensively to migrating petabytes of data and to developing a data lake platform for multiple clients.

Developed POCs to build and deploy an integrated data platform that enables near-real-time analytics.

Experience with CSV, JSON, SequenceFile, Avro, Parquet, and RCFile formats.

Implemented proofs of concept on the Hadoop and Spark stacks with various big data analytics tools, including migration from databases such as Teradata and Oracle to Hadoop.

Hands-on experience in both front-end UI design and back-end application development using Java and related tools.

Excellent interpersonal and communication skills; creative, research-minded, technically competent, and results-oriented, with strong problem-solving and leadership skills.

Technical Skills

Programming: Java, Scala, Python, Unix shell scripting

Big Data Technologies: Cloudera Hadoop, Hortonworks Hadoop, HDFS, MapReduce, Hive, Pig, HBase, Impala, Hue, Sqoop, NiFi, Kafka, Oozie, Flume, ZooKeeper, Spark

Databases: Oracle, SQL Server, Cassandra, HBase

Web Technologies: HTML5, CSS, JavaScript, XML

Operating Systems: Windows, UNIX, Linux distributions (CentOS, RedHat, Ubuntu)

Other Tools: GitHub, SAS, R, Tableau

EDUCATION

North Carolina A&T State University, Greensboro, NC

MS, Industrial and Systems Engineering, GPA: 3.86, May 2016

Jawaharlal Nehru Technological University, Hyderabad, India

B.Tech, Electronics and Communication Engineering, GPA: 3.4, May 2014

INVENTION DISCLOSURE

Rohith Nama, 2016, BESI Tablet Interface Application, U.S. Patent Application (pending).

PROFESSIONAL EXPERIENCE

Role: Hadoop-Spark Developer, TSYS, Atlanta [July 2016 - Present]

Project: TSYS is a credit card processor, merchant acquirer, and credit issuer. The projects at TSYS focused on developing an integrated data platform and data lake to transfer historical and current real-time data from multiple sources (IBM Mainframes and SQL Server) to HDFS/HBase via Kafka and Spark Streaming integration. POCs were also developed to test deployment on AWS, Cloudera, MapR, and Hortonworks. The data platform being deployed helps TSYS perform near-real-time analytics for purposes such as fraudulent-transaction detection and processing-time estimation and improvement, adding significant value to the business.

Responsibilities:

Worked on the design, development, and deployment of Spark on a Hadoop cluster using Spark, Kafka, and HBase, with extensive coding in Java and Scala across multiple Hadoop vendor distributions.

Developed Scala scripts and UDFs using both DataFrames/Spark SQL and RDDs in Spark for data streaming, aggregation, and testing.
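
By way of illustration, a minimal Scala sketch (Spark 2.x API) of this pattern: one piece of parsing logic registered both as a DataFrame UDF and as a Spark SQL function, with the aggregation cross-checked on the underlying RDD. The column names, sample rows, and normalization logic are illustrative assumptions, not the project's actual code.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.udf

object UdfSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("udf-sketch").getOrCreate()
    import spark.implicits._

    // Hypothetical records; the real input came from mainframe feeds.
    val df = Seq(("tx1", " 1,234.50"), ("tx2", "87.20")).toDF("id", "amount")

    // Same logic exposed to the DataFrame API and to Spark SQL.
    val normalize = udf((raw: String) => raw.trim.replace(",", "").toDouble)
    spark.udf.register("normalize_amount",
      (raw: String) => raw.trim.replace(",", "").toDouble)

    df.withColumn("amount_clean", normalize($"amount"))
      .createOrReplaceTempView("txns")
    spark.sql("SELECT id, normalize_amount(amount) AS amt FROM txns").show()

    // Cross-check the aggregation on the underlying RDD, as done in tests.
    val total = spark.table("txns").rdd
      .map(_.getAs[Double]("amount_clean")).sum()
    println(s"total = $total")
  }
}
```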

Contributed to developing a data pipeline that loads data from sources such as IBM Mainframes into HDFS through Kafka and the Spark streaming/processing frameworks, per requirements.

Processed and transferred data from Kafka into HDFS/HBase through the Spark Streaming API.
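
A minimal sketch of the Kafka-to-HDFS leg of such a pipeline, assuming the spark-streaming-kafka-0-10 direct-stream API; the broker address, topic, consumer group, and output path are placeholders, and the HBase write (a put per record inside foreachRDD) is omitted for brevity.

```scala
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010._

object KafkaToHdfsSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("kafka-to-hdfs-sketch")
    val ssc = new StreamingContext(conf, Seconds(10))

    val kafkaParams = Map[String, Object](
      "bootstrap.servers" -> "broker1:9092",          // placeholder
      "key.deserializer" -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id" -> "ingest-sketch",
      "auto.offset.reset" -> "latest"
    )

    // Direct stream: one Kafka partition maps to one Spark partition.
    val stream = KafkaUtils.createDirectStream[String, String](
      ssc,
      LocationStrategies.PreferConsistent,
      ConsumerStrategies.Subscribe[String, String](Seq("transactions"), kafkaParams)
    )

    // Persist each micro-batch to HDFS.
    stream.map(_.value).saveAsTextFiles("hdfs:///data/raw/transactions")

    ssc.start()
    ssc.awaitTermination()
  }
}
```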

Performed advanced analytics and feature selection/extraction using Apache Spark (machine learning and streaming libraries) in Scala.

Wrote code to read data from IBM Mainframes into Kafka, and from Kafka into downstream stages for data sanitization.

Wrote Java code to convert JSON files to Avro and then encode them in Base64 for greater efficiency and enhanced security.
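
The project's conversion was written in Java; the sketch below shows the same JSON-to-Avro-to-Base64 flow in Scala for consistency with the other examples, using the standard Avro GenericDatumReader/Writer and java.util.Base64 APIs. The schema is a made-up stand-in for the mainframe-derived records.

```scala
import java.io.ByteArrayOutputStream
import java.util.Base64

import org.apache.avro.Schema
import org.apache.avro.generic.{GenericDatumReader, GenericDatumWriter, GenericRecord}
import org.apache.avro.io.{DecoderFactory, EncoderFactory}

object JsonToAvroBase64Sketch {
  // Illustrative schema; the real record layout was project-specific.
  val schemaJson: String =
    """{"type":"record","name":"Txn","fields":[
      |{"name":"id","type":"string"},
      |{"name":"amount","type":"double"}]}""".stripMargin
  val schema: Schema = new Schema.Parser().parse(schemaJson)

  def jsonToAvroBase64(json: String): String = {
    // Decode the JSON text into a generic Avro record.
    val reader = new GenericDatumReader[GenericRecord](schema)
    val record = reader.read(null, DecoderFactory.get.jsonDecoder(schema, json))

    // Serialize to compact Avro binary, then Base64-encode for transport.
    val out = new ByteArrayOutputStream()
    val encoder = EncoderFactory.get.binaryEncoder(out, null)
    new GenericDatumWriter[GenericRecord](schema).write(record, encoder)
    encoder.flush()
    Base64.getEncoder.encodeToString(out.toByteArray)
  }

  def main(args: Array[String]): Unit =
    println(jsonToAvroBase64("""{"id":"tx1","amount":42.5}"""))
}
```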

Gained working knowledge of Tableau and contributed to reporting and dashboard visualization.

Developed JUnit tests to improve the quality of the code.

Developed a POC using Scala, Spark SQL, and MLlib along with Kafka and other tools as required, then deployed it on the YARN cluster.
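
A minimal sketch of what such a POC pipeline can look like with the Spark 2.x spark.ml API; the feature columns, toy rows, and the choice of logistic regression (e.g., for flagging suspicious transactions) are illustrative assumptions.

```scala
import org.apache.spark.ml.Pipeline
import org.apache.spark.ml.classification.LogisticRegression
import org.apache.spark.ml.feature.VectorAssembler
import org.apache.spark.sql.SparkSession

object MlPocSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("ml-poc-sketch").getOrCreate()
    import spark.implicits._

    // Toy training data: (amount, merchant_risk, label).
    // Real features would arrive from the streaming pipeline.
    val train = Seq(
      (12.0, 0.1, 0.0),
      (900.0, 0.9, 1.0),
      (15.5, 0.2, 0.0),
      (750.0, 0.8, 1.0)
    ).toDF("amount", "merchant_risk", "label")

    // Assemble raw columns into the feature vector MLlib expects.
    val assembler = new VectorAssembler()
      .setInputCols(Array("amount", "merchant_risk"))
      .setOutputCol("features")

    val lr = new LogisticRegression().setMaxIter(20)

    // Fit and score; `spark-submit --master yarn` runs this on the cluster.
    val model = new Pipeline().setStages(Array(assembler, lr)).fit(train)
    model.transform(train).select("amount", "prediction").show()
  }
}
```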

Developed a POC to configure and install Apache Hadoop on AWS, Cloudera, Hortonworks, and MapR, then compared efficiency and cost savings against client requirements.

Worked in a story-driven agile development methodology and actively participated in daily scrum meetings.

Environment: Hadoop, YARN, AWS, Java SE 8, Scala, Python, Tableau, Spark, Spark SQL, Spark MLlib, MapReduce, HDFS, HBase, Hive, Pig, Kafka, Flume, Oozie, ZooKeeper, Cloudera CDH 5.8, Hortonworks, MapR, IBM Mainframes, and SQL Server.

Role: Hadoop Developer, PNC Bank, PA [Jan 2016 - June 2016]

Project: PNC Bank is a subsidiary of PNC Financial Services Group, offering services through over 2,500 branches across the USA. This project focused primarily on migrating existing data from legacy systems to HDFS by building a generic automated framework. A data pipeline was built using Sqoop, Pig, and Hive for data ingestion, processing, and preparation. Data quality was verified, and a platform controller was built for efficient management and process automation.

Responsibilities:

Worked in a Hadoop environment with MapReduce, Sqoop, Oozie, Flume, HBase, Pig, Hive, and Impala on a multi-node cluster.

Worked on a 114-node cluster to migrate 1.3 petabytes of data from Oracle servers to HDFS.

Wrote Java MapReduce code to check data quality, along with Java code for exception and error handling.
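
A minimal sketch of a record-quality mapper of this kind; the project's code was Java, and this Scala version (compiling against the same MapReduce API) uses a placeholder rule, a fixed field count on pipe-delimited rows, with malformed rows tallied via a counter.

```scala
import org.apache.hadoop.io.{LongWritable, NullWritable, Text}
import org.apache.hadoop.mapreduce.Mapper

// Hypothetical quality check: pass through well-formed pipe-delimited rows
// and count malformed ones. The expected field count (12) is a placeholder.
class QualityCheckMapper extends Mapper[LongWritable, Text, Text, NullWritable] {
  override def map(key: LongWritable, value: Text,
      context: Mapper[LongWritable, Text, Text, NullWritable]#Context): Unit = {
    val fields = value.toString.split("\\|", -1)
    if (fields.length == 12 && fields.forall(_.nonEmpty))
      context.write(value, NullWritable.get)               // clean record
    else
      context.getCounter("quality", "malformed").increment(1L)
  }
}
```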

Supported MapReduce programs running on the cluster and wrote MapReduce jobs using the Java API.

Ingested data from Oracle servers using Sqoop, driven through Metagen via shell scripting.

Developed Pig and Hive UDFs for data processing and preparation before loading into the Hadoop data store.
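
A minimal sketch of such a Hive UDF; the project's UDFs were written in Java, but a compiled Scala class works identically since Hive resolves the evaluate method by reflection. The trim-and-uppercase standardization rule is a placeholder.

```scala
import org.apache.hadoop.hive.ql.exec.UDF
import org.apache.hadoop.io.Text

// Placeholder standardization step for the data-preparation stage:
// trims whitespace and upper-cases a field before it lands in the store.
class StandardizeField extends UDF {
  def evaluate(input: Text): Text = {
    if (input == null) null
    else new Text(input.toString.trim.toUpperCase)
  }
}

// After `ADD JAR` in Hive (table and column names are hypothetical):
//   CREATE TEMPORARY FUNCTION standardize AS 'StandardizeField';
//   SELECT standardize(customer_name) FROM staging.customers;
```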

Verified data quality at various stages, including validation, standardization, enrichment, and error handling, using an XML parser and Java MapReduce programs.

Used Oozie and Autosys to develop a platform control framework for managing workflows and operations.

Worked on Cluster coordination services through ZooKeeper.

Gathered and analyzed operational and platform metadata to understand status across multiple segments of the pipeline.

Monitored workload, job performance and capacity planning using Cloudera Manager.

Involved in project discussions, gathering business requirements, and in developing phases of the project.

Environment: Hadoop, YARN, Java SE 7, MapReduce, HDFS, HBase, Hive, Pig, Linux, Eclipse, Storm, Flume, Scala, Oozie, ZooKeeper, Sqoop, Unix shell scripting, Cloudera CDH 5.1, and Oracle 12c.

Role: Java Developer, Value Labs, Hyderabad, India [June 2013 - July 2014]

Project: Value Labs is a services company that offers a wide range of services to clients worldwide. The project at Value Labs focused on developing and supporting a platform to compare services for insurance, household utilities, and personal finance products. The application was designed to help users compare services based on their requirements.

Responsibilities:

Involved in the software development life cycle, including requirements analysis, design, implementation, testing, and maintenance.

Developed the Business Logic using Plain Old Java Objects (POJOs).

Used RESTful web services to retrieve and update the data populated in front-end views.

Designed and implemented exception handling throughout the application.

Developed business components using Java objects and used the Hibernate framework to map Java classes to the Oracle database.

Contributed to designing the front end using AngularJS, JSP, JavaScript, jQuery, CSS, and HTML per the provided requirements.

Involved in bug fixing during unit testing, integrated system testing, and user acceptance testing.

Used GitHub for source control to track work and all changes to the source code.

Involved in technical documentation, interacted with business analysts, and supported the application during the warranty period.

Environment: JDK 1.6, Eclipse IDE, Apache Tomcat 7, JSP, JSTL, AngularJS, RESTful APIs, HTML5, jQuery, CSS, JavaScript, Servlets, AJAX, Hibernate, Oracle, Git, Maven.

Research Experience

Role: Research Assistant (BESI), North Carolina A&T State University, NC [Aug 2014 - Dec 2015]

Description: BESI is an NSF/NIH-funded collaborative project aimed at developing an IoT- and Android-based data application for dementia caregivers. The project involves data collection through an IoT sensor network, data cleansing and integration, and the development of data models that present insightful interventions to dementia caregivers for better decision making.

Responsibilities:

Managed and analyzed large volumes of data obtained from the IoT network.

Developed a data model to predict future agitation episodes.

Performed data gathering, cleaning, integration, mining, and analysis to produce analytical reports.

Led the team to develop an Android application for dementia caregivers.

Managed a team of 13 people and coordinated with two other teams for the effective and efficient development of the Android/IoT-based technology for dementia caregivers.

Conducted research and worked closely with collaborating teams to gather and document user requirements.

Recruited participants and conducted usability evaluations, including heuristic evaluation, expert reviews, user observation, and user testing.

Analyzed qualitative and quantitative data obtained during usability evaluation and explored usability issues associated with the Android user interface using statistical tools such as SAS and R.

Created video tutorials to educate caregivers on using the Android application.

Mentored undergraduates involved in the research project.

THESIS

Inclusive Design and Modeling: A Two-Phased Approach to Designing Dementia Caregiver Empowerment Interventions

The primary focus of the research was to develop a practical approach to the inclusive design of an Android application based on the design wheel methodology, using personas and user journeys. Furthermore, a simulation study (physical and computer-based) was conducted using a cybersociophysical system to examine the possibility of developing interventions that prevent agitation episodes in persons with dementia.

PUBLICATIONS

Nama, Rohith, & Smith-Jackson, Tonya (2015, January). Cybersociophysical Systems: A New Human Factors Approach. In Proceedings of the International Annual Conference of the American Society for Engineering Management (p. 1).

Meda, Harshitha, Nama, Rohith, Belay, Marta, Newbold, Temple, Anderson, Martha, Bankole, Azziza, & Smith-Jackson, Tonya (2016). "User Interface Design of an Android Application for Dementia Caregiver Empowerment." Proceedings of the 2016 Industrial and Systems Engineering Research Conference (paper accepted).

Nama, Rohith, Smith-Jackson, Tonya, Seong, Younho, & McCullough, Matthew (2016). "A Step towards Designing Dementia Caregiver Empowerment Interventions through a Simulation Study." Healthcare Informatics Journal (in progress).

Nama, Rohith, Smith-Jackson, Tonya, Seong, Younho, & McCullough, Matthew (2016). "A Practical Approach for the Inclusive Design of an Android User Interface Based on the Design Wheel." Ergonomics in Design (in progress).

ACHIEVEMENTS

Recipient of the Outstanding Research Assistant Award from the Department of Industrial and Systems Engineering.

Recipient of an NSF Travel Grant Award to attend conferences.

Attended multiple workshops on data science, entrepreneurship, and FDA medical-device safety.

Served as Secretary of the Human Factors Student Chapter at North Carolina A&T State University.


