Data Sales

Location:
Glen Allen, VA
Posted:
October 04, 2020


REDDY SP TATAJI Mobile: +1-804-***-****

Email: adgnk7@r.postjobfree.com

Professional Summary:

Over 7 years of IT experience developing applications using Java, Scala, Python, and Big Data technologies (Hadoop and Apache Spark).

Experience in identifying objects based on various identifiers and creating object repository.

Efficient in designing and developing applications with experience in all phases of SDLC.

Expertise in Big Data: Hadoop, HDFS, MapReduce, HBase, and ecosystem tools such as Hive and HQL.

Experience as a developer in script preparation and execution within automation frameworks.

Experience in Agile, Iterative, and Test-Driven Development methodologies.

Good experience with version management tools (RTC, SVN, Git) and the SBT build tool.

Involved in designing automation frameworks; good understanding of TestNG annotations and various Selenium automation frameworks.

Strong debugging, root-cause analysis, and problem-resolution skills; experienced in collecting and analyzing user requirements.

Ability to quickly master new concepts, applications, and work as a team player.

Experience in project documentation and has good logical and analytical abilities.

Ability to perform at a high level, meet deadlines with quality delivery, adaptable to ever-changing priorities.

Ability to work in diverse platforms and industry environments.

Certification: SUN Certified Java Programmer (SCJP)

Technical skills:

Programming Languages: C, Java/Java EE, Scala, MATLAB, Python

Big Data Technologies: Apache Spark, Hadoop, HDFS, MapReduce, HBase, and ecosystem tools such as Hive and Sqoop

Scripting Languages: Pig, Perl, Unix Shell

Other Languages: SQL, HQL, NoSQL, XML, XSL, XSD, JavaScript with the jQuery framework, AJAX

Cloud Services: HDP 2.0, CDH4, AWS

Web Technologies: Java EE (JSP & Servlets), Struts, RESTful web services

Tools: Selenium IDE, Selenium WebDriver, TestNG, Jenkins, Maven, Eclipse, SVN, SBT, Apache Oozie

SDLC Methodologies: Agile, Iterative, and Test-Driven Development

Version Control Tools: CVS, SVN, Git

Educational Qualification:

Bachelor of Technology (ELECTRONICS AND COMMUNICATIONS ENGINEERING) – INDIA

Master of Technology (ELECTRONICS - SYSTEMS AND SIGNAL PROCESSING) - INDIA

Pursuing a Master's in Information Technology - USA

Case Studies:

1. Data Analysis and Visualization using Tableau - Fall 2019

In this project, I took a use case of banking operations over 12 months to analyze customer activity by criteria such as job category, region, age, and balance. Used Tableau to visualize the sample UK-Bank-Customers data and created a dashboard that included a map chart showing the number of records across UK regions, bar charts plotting the distributions of sales and of age against the number of records, a treemap for job classification, and a pie chart for gender statistics.

2. Introduction to Business Intelligence using PySpark - Summer 2019

Performed sentiment analysis on streaming Twitter data, using Flume to stream the data into HDFS.

Gained practical experience with the Spark ecosystem (Spark, Python, Spark SQL, Hive), including software installation and system configuration.

Developed MapReduce programs using Python.

Processed the data with Hive and performed user analysis on the processed data.

Exported the results back into MySQL.

3. Cloud Optimization for Organizations using AWS - Spring 2020

By leveraging the AWS solution, we estimated that pharmaceutical clients saved on infrastructure and maintenance costs. The source data was collected from healthcare services.

Streaming massive data with AWS Kinesis. Queuing messages with Simple Queue Service (SQS).

Processing data at an unlimited scale with Elastic MapReduce, including Apache Spark, Hive, HBase, and Flume.

Analyzing streaming data in real-time with Kinesis Analytics.

Searching and analyzing data with Amazon Elasticsearch Service. Visualizing data interactively with QuickSight.

Project Assignments:

6.

Project Name: PEOS

Client: Medicare Services India

Role: Developer

Organization: Digi Matrix Solutions Pvt Ltd

Duration: 01/2016 - 07/2016

Environment (with skill versions): Hadoop 2.0, Java 1.7, Apache Kafka, Spark, MySQL, Oracle DW, XML, XSL, JSON, SBT

Project Description:

PEOS (Provider Enrollment and Ownership System) supports the Medicare provider and supplier enrollment process by capturing provider/supplier information from the Medicare family of forms. The system manages, tracks, and validates enrollment data via the web interface. The Provider Interface (PEOS PI) allows providers to submit new applications and updates to existing Medicare enrollments. The Administrative Interface (PEOS AI) allows Medicare Administrative Contractors (MACs) to enter data and process paper forms submitted electronically through the Provider Interface; using it, MACs can view and manage Medicare enrollment information.

Responsibilities

Understood the requirements and architecture and developed data-model web service modules from complex object relations.

Participated in various stages of the project life cycle, mainly design, implementation, testing, deployment, and enhancement of the application.

Used Apache Spark and Scala for ETL and data transformation on ingested data.

Used Kafka and Sqoop for data ingestion.

Configured, monitored, and fine-tuned data ingestion into the Hadoop ecosystem.

Used Tableau for visualizing and reporting MAC enrollment data.

Deployed and maintained the code using the Maven and SBT build automation tools for all environments.

Defect fixing and support for the Dev, IT, SIT, and Production environments.

Implemented Log4j for logging and error-handling modules.

Used XML, XSL, JSON concepts in dynamic data exchange and processing.

5.

Project Name: BOFA

Role: Developer

Organization: Digi Matrix Solutions Pvt Ltd

Duration: 03/2014 - 11/2015

Environment (with skill versions): Hadoop 2.0, Java 1.7, Apache Kafka, Spark, MySQL, Oracle DW, XML, XSL, JSON, SQL, HDFS, Apache MRv1, Hive 0.8.1, HBase 0.92, CDH4, Flume 1.1

Project Description:

Business for Forecasting Analytics is a software-as-a-service offering that helps sales, marketing, and customer-support staff in different vertical markets manage end-to-end customer relationships. It provides extensive granular-level customization features that can be tailored to the organization's requirements. It measures the effectiveness of different marketing strategies and guides future marketing expenditure, analyzes the potential for cross-selling and up-selling products to accounts based on their buying patterns, identifies referrals from accounts and contacts for promoting new products and services, and stores account- and contact-related notes and documents in the account history.

Roles and Responsibilities:

Contributed to framework design; analyzed and understood the System Requirement Specification.

Analyzed the applications and estimated the time required for each change request.

Developed multiple MapReduce jobs in Java for data cleaning and pre-processing.

Imported and exported log data from servers into HDFS and Hive using Sqoop.

Created, maintained, and updated database tables in the Oracle database.

Loaded and transformed large sets of structured and semi-structured data; performed capacity, storage, and migration planning for moving SQL Server data to Hadoop/HDFS.

Streamed sample data for test analysis using Flume.

Collected and ingested log data from various activities and alerts using Apache Kafka.

Developed Python scripts for retrieving data from the MySQL database.

Designed and developed MapReduce parser applications to extract data from a variety of logs in HDFS and HBase.

Defect fixing and support for the Dev, IT, and ST environments. Used SVN version control to maintain the code.

4.

Project Name: Alcon

Role: Developer

Organization: Digi Matrix Solutions Pvt Ltd

Duration: 02/2012 - 03/2014

Environment (with skill versions): Hadoop 2.0, Java 1.7, Apache Spark, MySQL, Hive, XML, XSL, JSON

Project Description:

Alcon manufactures and distributes optical lenses in the Australian region. It offers progressive, no-glare, ophthalmic, and soft contact lenses; and optical supplies, coatings, and sun protection glasses to eye care practitioners and retail optical chains in the Australian region. The company also operates an optical laboratory network, which serves opticians, optometrists, and ophthalmologists in the Australian region. Also, it offers education and training programs, and business management support services.

Roles and Responsibilities:

Analyzed and understood the System Requirement Specification.

Created, maintained, and updated database tables; loaded data into HDFS from Oracle DB using Sqoop.

Created HBase tables to store variable data formats from different source modules.

Created Hive tables and developed Hive queries (HQL) to capture results.

Integrated HBase with Hive for ad-hoc querying and reporting.

Exported the analyzed data to relational databases using Sqoop for analysis.

Cleansed and updated the data using Address Doctor through web services.

Used regression models to predict and analyze which marketing services should be awarded, based on characteristics such as location and natural resources.

Used K-Means clustering to segment different countries based on several indices.

Used XML, XSL, and JSON concepts in dynamic data exchange and processing.

3.

Project Name: GE Healthcare

Client: Supplier Quality Collaboration System (SQCS)

Role: Developer

Organization: Mahindra Satyam Computer Service Ltd.

Duration: 12/2008 - 12/2009

Environment (with skill versions): Eclipse, Hadoop, Java EE, PL/SQL, Struts, JavaScript, Selenium WebDriver

Project Description:

GE Healthcare (formerly GE Medical Systems) is a unit of GE Technology Infrastructure, which is a unit of General Electric (GE). GE Healthcare has a range of products and services that include medical imaging and information technologies, medical diagnostics, patient monitoring systems, drug discovery, and biopharmaceutical manufacturing technologies. The Supplier Quality Collaboration System (SQCS) automates supplier change notifications spanning engineering change requests and engineering change orders, and digitizes the supplier change request process. Data are loaded into the Operational Data Source (ODS, a database used to feed data for business intelligence) through different stages; data cleansing and business rules are applied as the data moves from one stage to another (Source-SSTG-INTR-OSTG-ODS).

Responsibilities:

Extensively used core Java concepts for the design and development of the application using a hybrid framework.

Application Design, Development, Database Design, Documentation.

Developed complex, responsive user interfaces for dashboard creation with HTML, CSS, jQuery, and widget frameworks.

2.

Project Name: Accident Analysis for Westminster

Client: DCL, UK

Role: Developer

Organization: Mahindra Satyam Computer Service Ltd.

Duration: 02/2008 - 12/2008

Participated in various stages of the project life cycle, mainly design, implementation, testing, deployment, and enhancement of the application.

Deployed and maintained the code using the CVS source code management tool.

Analysis, design, and estimation of functional requirements and change requests.

Defect fixing and support for the Dev, IT, ST, and Production environments.

Implemented Log4j 1.2 for logging and error-handling modules.

Project Description:

Historical and current accident data were used to create a data warehouse with different data marts for different departments, with dimensions and fact tables, using customized tools. The process included converting historical data from various sources, such as flat files, CSV files, and MS Access tables, into an Oracle Server database designed according to the data warehousing model. The warehouse was then used to develop comparison and other types of reports.

Responsibilities:

Analyzing the Requirements and understanding the behavior of the application.

Created reports using the Web Intelligence and Desktop Intelligence tools, with a Business Objects universe as the data provider.

Designed and implemented universes using Business Objects Designer.

Imported and exported universes into the repository. Involved in developing and maintaining reports.

Responsible for providing regular updates on the development of test suites to onshore and offshore team leads.

Prepared test data for automation scripts and was involved in test script reviews.

Provided reporting solutions to client requirements using Crystal Reports, documented the solutions, and extensively used subqueries and Scope of Analysis.

Conducted peer reviews. Involved in conducting KT sessions and internal training for new joiners.

1.

Project Name: CRM Analytics (Sales Module)

Client: SSU-KPMS-SATYAM

Role: Team Member

Organization: Mahindra Satyam Computer Service Ltd.

Duration: 09/2007 - 01/2008

Project Description

Customer Relationship Management (CRM) Analytics was designed for capturing, tracking, and monitoring the sales pipeline of all customer-facing units of Satyam. Sales users within Satyam can search for and organize leads and opportunities by nearly any criteria, and can capture the supporting efforts involved in working any lead or opportunity; sales reviews can be conducted using pipeline and Win/Loss reports from CRM Analytics. The reports were designed to offer quick availability of information across the organization.

Responsibilities:

Involved in designing automation and manual processes for triggering several events, including sending user email notifications, updating fields in the database, creating index files, and copying files from one server to another when an event is triggered.

Created the universe of the Sales data mart using the BO Designer tool. Developed hierarchies to support drill-down reports.

Analyzed and designed new enhancements for every release. Good experience in low-level design; hands-on experience developing class and sequence diagrams.

PERSONAL DETAILS

Date of Birth: 24th Aug 1986.

Father’s Name: Appalanaidu Reddy.

Marital status: Married.


