
Hadoop Big Data Developer

Location:
Allen, TX
Posted:
May 09, 2018

Contact this candidate

Sridhar Rayappagari

Phone: 814-***-**** | Email: ac5ejf@r.postjobfree.com

Professional Summary

Over 8 years of programming experience in the IT field, including three years in the Hadoop ecosystem and five years in database development: analysis, design, and implementation of business applications using the Oracle Relational Database Management System (RDBMS).

Involved in all phases of the SDLC (Software Development Life Cycle) — analysis, design, development, testing, implementation, and maintenance — with timely delivery against aggressive deadlines.

In-depth knowledge of Big Data Hadoop architecture and its various components.

Experienced in designing and developing applications for distributed environments using Hadoop, Java, and other Big Data technologies.

Expertise with tools across the Hadoop ecosystem, including Pig, Hive, HDFS, MapReduce, Sqoop, Spark, Kafka, YARN, Oozie, and ZooKeeper.

Hands-on experience working with NoSQL databases, including MongoDB and HBase.

Experienced in writing HBase programs using Java API.

Experienced in writing MapReduce programs in Java and Python.
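
The MapReduce work above follows the classic map → shuffle → reduce contract. As an illustration only (not code from this resume), here is a minimal word-count sketch of that contract in plain Python, in the style of a Hadoop Streaming job; the data and function names are hypothetical:

```python
# Minimal word-count sketch in the Hadoop Streaming style. In a real job,
# the mapper reads lines from stdin and the framework sorts mapper output
# before the reducer sees it; here both phases are plain functions so the
# flow is easy to follow. All names and data are illustrative.
from itertools import groupby
from operator import itemgetter

def map_phase(lines):
    """Emit (word, 1) pairs, as a streaming mapper would print them."""
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def reduce_phase(pairs):
    """Sum counts per key; assumes pairs arrive sorted by key (the shuffle)."""
    for word, group in groupby(sorted(pairs), key=itemgetter(0)):
        yield (word, sum(count for _, count in group))

counts = dict(reduce_phase(map_phase(["to be or", "not to be"])))
print(counts)  # {'be': 2, 'not': 1, 'or': 1, 'to': 2}
```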

Experienced in writing Spark SQL programs.

Extensively used Spark RDD operations such as map, flatMap, reduce, reduceByKey, groupByKey, keyBy, sortByKey, aggregate, aggregateByKey, fold, foldByKey, combineByKey, and cogroup.
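
The per-key transformations listed above (reduceByKey, aggregateByKey, combineByKey) all merge values under a key. As a hypothetical illustration of their semantics — not a Spark program — here is a single-process sketch in plain Python:

```python
# Plain-Python sketch of the per-key semantics of two of the listed
# transformations. reduceByKey merges values pairwise with one function;
# aggregateByKey starts from a zero value and may accumulate into a
# different type. Spark runs this per partition across a cluster; this
# single-process model (all names illustrative) only shows the contract.
from collections import defaultdict

def reduce_by_key(pairs, fn):
    out = {}
    for k, v in pairs:
        out[k] = fn(out[k], v) if k in out else v
    return out

def aggregate_by_key(pairs, zero, seq_fn):
    out = defaultdict(lambda: zero)
    for k, v in pairs:
        out[k] = seq_fn(out[k], v)
    return dict(out)

sales = [("tx", 5), ("or", 3), ("tx", 2)]

totals = reduce_by_key(sales, lambda a, b: a + b)
print(totals)  # {'tx': 7, 'or': 3}

# Track (count, sum) per key -- a typical aggregateByKey pattern:
stats = aggregate_by_key(sales, (0, 0), lambda acc, v: (acc[0] + 1, acc[1] + v))
print(stats)   # {'tx': (2, 7), 'or': (1, 3)}
```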

Experienced in writing programs to process near-real-time (NRT) data.

Hands on experience in configuring and working with Flume to load the data from multiple sources directly into HDFS.

Experience with data flow diagrams, data dictionaries, database normalization techniques, and entity-relationship modeling and design.

Expert in client-server application development using Oracle 11g/10g/9i/8i, PL/SQL, SQL*Plus, TOAD, and SQL*Loader.

Effectively used table functions, indexes, table partitioning, collections, analytic functions, materialized views, query rewrite, and transportable tablespaces.

Strong experience with data warehouse concepts, ETL, and Informatica.

Good knowledge of logical and physical data modeling using normalization techniques.

Created tables, views, constraints, and indexes (B-tree, bitmap, and function-based).

Developed Complex database objects like Stored Procedures, Functions, Packages and Triggers using SQL and PL/SQL

Developed materialized views for data replication in distributed environments

Excellent technical and analytical skills with clear understanding of design goals of ER modeling for OLTP and dimension modeling for OLAP

Experience in Oracle supplied packages, Dynamic SQL, Records, PL/SQL Tables and Exception Handling

Loaded data into Oracle tables using SQL*Loader.

Used INSERT ALL and INSERT FIRST for multi-table inserts.

Partitioned large tables using the range partitioning technique.

Experience with Oracle-supplied packages such as DBMS_UTILITY, DBMS_SQL, DBMS_JOB, UTL_FILE, DBMS_AQ, and DBMS_SCHEDULER.

Created packages and procedures to automatically drop and recreate table indexes.

Worked extensively with REF CURSORs, external tables, and collections.

Expertise in dynamic SQL, collections, and exception handling.

Extensively used PL/SQL data types: CHARACTER, NUMBER, ROWID, REF CURSOR, records, and collections.

Used collection types such as associative arrays, nested tables, and VARRAYs.

Expert knowledge of exception-handling concepts, including exception types, exception scope, and built-in error functions.

Expert level understanding of SQL processing, SQL Optimizer Operations, SQL Query Optimization methods.

Expert level understanding of SQL Optimizer Operations like Query Transformation, Cost Estimation, and Plan Generation.

Expert level understanding of Query Transformation Techniques such as View Merging, Predicate Pushing, Subquery Unnesting, Query Rewrite with MVs.

Extensive use of automatic SQL tuning features such as ADDM, SQL Tuning Advisor, SQL Tuning Sets, and SQL Access Advisor.

Extensively used tools such as DBMS_PROFILER and DBMS_HPROF (hierarchical profiler) to identify performance bottlenecks in PL/SQL code.

Experience with data-caching techniques: package-based caching, deterministic function caching, and the function result cache.

Expert-level understanding of optimization techniques such as bulk processing for multi-row SQL, high-speed querying with BULK COLLECT, and high-speed DML with FORALL.

Experience with performance tuning for Oracle RDBMS using EXPLAIN PLAN and hints.

Experience in UNIX Shell Scripting.

Experience in Python Shell Scripting.

Excellent communication, interpersonal, analytical skills and strong ability to perform as part of a team

TECHNICAL COMPETENCIES:

Operating System

Linux (CentOS, Ubuntu, Red Hat), Windows

Programming Languages

Oracle SQL, Oracle PL/SQL, C, Python, Core Java, Scala

Databases & Tools

Oracle, MySQL, SQL Server, HBase, MongoDB, Cassandra, PL/SQL Developer, TOAD, SQL Developer, SQL*Plus

Hadoop Ecosystem

HDFS, MapReduce, Hive, Pig, Sqoop, Flume, HBase, Apache Spark, Apache Tika.

Tools

SQL Developer, PL/SQL Developer, TOAD, Informatica, Visual SourceSafe (VSS), HP Quality Center (QC), Anaconda/Spyder/IPython/IPython Notebook, Eclipse, PyCharm.

Professional Experience

Nike, Beaverton, OR

Jul 2015 – Present

Hadoop Engineer

Responsibilities:

Work closely with various Business and Technical teams to discuss and understand business requirements.

Translate business requirements into technical requirements and document the technical aspects of the project.

Perform traditional batch processing via the MapReduce and Spark frameworks, coupled with incremental data ingestion into the Hadoop cluster.

Develop efficient MapReduce programs in Java and Python to run batch processes on huge unstructured datasets.

Develop scalable applications using Apache Spark in Python and Scala.

Analyze Hive/Oracle data using Spark SQL and load the results to the reporting layer for business analytics.

Extract, transform, and load (ETL) data from multiple sources (JSON, RDBMS, etc.) using DataFrames.

Schedule multiple Hive and Pig jobs through the Airflow workflow engine using Python.

Develop workflows to load log data into HDFS using Apache Flume.

Develop Pig scripts for data pre-processing.

Develop, schedule, and monitor Sqoop jobs to move data between HDFS/Amazon EC2 instances and external sources such as Oracle, Teradata, and MySQL.

Use HiveQL to create partitioned RCFile and ORC tables, applying compression techniques to optimize data processing and speed up retrieval.

Create Hive external tables to read Parquet and Avro files from their schemas.

Read data from Hive tables and perform analytics using SPARK SQL.

Use various Spark RDD operations such as filter, map, reduce, flatMap, keyBy, reduceByKey, fold, foldByKey, aggregate, aggregateByKey, and combineByKey.

Develop Scala and Python scripts to load data from external systems/files to HDFS, invoking them from shell scripts.

Develop optimized Hive Scripts and test in Development Environment.

Perform real time analytics using Spark Streaming.

Use web-based notebooks such as IPython Notebook and Zeppelin for development and data visualization.

Write Linux shell scripts to run batch jobs, automate processes, and handle routine system administration tasks.

Develop programs using the HBase Java API to store data into HBase.

Write Python scripts to perform data analysis using the NumPy and pandas libraries.
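
As an illustration of the kind of per-group summary such analysis scripts produce, here is a standard-library-only sketch; pandas' groupby/mean would do the same in one call, and the rows and column names below are invented:

```python
# Standard-library sketch of a per-group average, the kind of summary
# pandas computes with DataFrame.groupby("region")["sales"].mean().
# The rows and column names here are invented for illustration.
from collections import defaultdict
from statistics import mean

rows = [
    {"region": "west", "sales": 10.0},
    {"region": "east", "sales": 4.0},
    {"region": "west", "sales": 6.0},
]

groups = defaultdict(list)
for row in rows:
    groups[row["region"]].append(row["sales"])

avg_by_region = {region: mean(vals) for region, vals in groups.items()}
print(avg_by_region)  # {'west': 8.0, 'east': 4.0}
```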

Nike, Beaverton, OR

May 2014 – Jun 2015

Sr. Application Developer and Performance Analyst

Nike, Inc. is the world's largest sportswear and equipment manufacturer. As part of its supply chain initiative, Nike implemented i2 Supply Chain Planner for three of its business units: Apparel, Footwear, and Equipment. i2 Supply Chain Planner supports its supply planning requirements. Nike has also implemented APO DP for Apparel, Footwear, and Equipment, and has initiated steps to leverage APO SNP for its supply planning. The supply chain initiatives identified as part of the i2 Application Support & Maintenance project enable Nike to meet the following business objectives:

Support its existing demand and supply planning applications, assist in the APO rollout, and support its APO DP application. Wipro's SCM practice provides application support for i2 Supply Chain Planner for all three business units at Nike. This involves resolving user queries related to supply planning, supporting product-related issues, and helping Nike improve its business processes through enhancements to the existing i2 footprint.

Responsibilities:

Led performance optimization efforts so that long-running nightly batch operations complete within an acceptable time window.

Introduced best practices to the development team, leveraging Oracle tools such as Oracle Enterprise Manager (OEM), TKPROF, DBMS_PROFILER, and the hierarchical profiler, along with IPython notebooks and Python libraries (pandas, matplotlib, etc.) for flexible, interactive data exploration and discovery.

Worked closely with DBAs to fix performance issues caused by Oracle wait events such as RC latch events.

Extensively involved in development efforts building the supply chain planning application for end users (planners).

Involved directly in brainstorming with Business System Analysts to understand new requirements and enhancements to existing application.

Implemented heavy data transformation logic inside Oracle packages.

Extensively used analytic functions (rank, row_number, lag, lead) in data transformation.

Worked extensively with Oracle collections and key performance optimization features such as BULK COLLECT, FORALL, and table functions.

Created hundreds of procedures, functions, tables, views, materialized views, triggers, sequences, indexes etc.

Created several partitioned tables using list partitioning.

Used Oracle built-in packages to implement an automated HTML email system.

Created several shell scripts to invoke scheduled jobs from Autosys.

Designed and implemented application security features.

Led build activities in enterprise-wide releases.

Provided on-call support to the batch team in case of batch failures.

Coordinated with off-shore team and played a facilitator role.

American Express Serve, Saint Petersburg, FL

Apr 2013 – Apr 2014

Sr. PL/SQL Developer

American Express is a leading card issuer, innovative payment provider, and renowned travel services company that delivers award-winning service, cutting-edge information, and expense management expertise.

Responsibilities:

Responsible for the software development of Serve web and mobile apps.

Created new high-volume database designs for new modules.

Created and maintained tables, views, procedures, functions, and packages, and performed DML operations using insert, update, and delete statements.

Translated business requirements into technical requirements and delivered application code that was fully tested and met the business requirements.

Created automated scripts to perform actions such as creating new tables, adding new columns, and adding indexes.

Constructed the required data using complex queries involving JOINs, including outer joins, INTERSECT, and UNION ALL.

Tuned SQL statements using Explain plan, TKPROF for maximum efficiency and performance.

Identified and fixed application performance issues with systematic, proven solutions.

Wrote SQL and PL/SQL programs to retrieve data from the data repository using cursors and exception handling.

Created complex SQL queries using editioning views, subqueries, and correlated subqueries.

Extensively used cursors, ref cursors, and exceptions in developing packages, procedures, and functions.

Debugged code and created an error-log package to record all bad records along with error codes and error messages.

Involved in different phases of testing, including User Acceptance Testing and unit testing.

Created new automated test programs used to perform health checks of application code.

Identified defects and provided high-quality solutions.

Took special care in implementing application security programs.

Environment: Oracle 11g, SQL, PL/SQL, Windows, TOAD, SQL Developer, MS-TFS, Project Web App

NRG Energy, Houston, TX

Oct 2012 – Mar 2013

Sr. PL/SQL Developer

A Fortune 300 company, NRG is at the forefront of changing how people think about and use energy. Whether as the largest solar power developer in the country, by building the nation’s first privately funded electric vehicle charging infrastructure or by giving customers the latest smart energy solutions to better manage their energy use, NRG is a pioneer in developing cleaner and smarter energy choices for our customers.

Responsibilities:

Created and maintained tables, views, procedures, functions, and packages, and performed DML operations using insert, update, and delete statements.

Extensively used autonomous transactions and triggers to audit DML operations on tables.

Translated business requirements into technical requirements and delivered application code that is fully tested and meets the business requirements.

Handled bulk operations using different types of collections (associative arrays, nested tables, VARRAYs).

Created scripts to create database objects like Tables, Indexes, Sequences etc.

Constructed the required data using complex queries involving JOINs, including outer joins, INTERSECT, and UNION ALL.

Worked with Collections and improved the performance of multi-row queries by using Bulk Collect and Bulk binds.

Tuned SQL statements using Explain plan, TKPROF for maximum efficiency and performance.

Wrote SQL and PL/SQL programs to retrieve data from the data repository using cursors and exception handling.

Avoided mutating-table errors using compound triggers, introduced in Oracle 11g.

Used global temporary tables to reuse data within a session, improving performance.

Coordinated with front end team working on Java and web services and provided them the required procedures and packages and necessary insight into the data.

Created complex SQL queries using inline views, subqueries, and correlated subqueries.

Extensively used cursors, ref cursors and exceptions in developing packages, procedures and functions.

Debugged code and created an error-log package to record all bad records along with error codes and error messages.

Involved in different phases of testing, including User Acceptance Testing and unit testing.

Involved in writing Test cases and performing data validation and process testing for application moving into production.

Worked on SQL*Loader to load data from flat files obtained from various facilities every day.

Created and modified several UNIX shell Scripts according to the changing needs of the project and client requirements.

Wrote UNIX shell scripts to process files on a daily basis: renaming files, extracting dates from them, unzipping them, and removing junk characters before loading them into the base tables.
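
The daily file-preparation steps above (extract the date carried in the file name, strip junk characters before loading) can be sketched compactly; the actual work used UNIX shell scripts, so the Python version, helper name, and file-name pattern below are purely illustrative:

```python
# Sketch of the daily file-preparation steps described above: pull the
# date out of the file name and strip junk (non-printable) characters
# before the contents are loaded. The actual work used UNIX shell
# scripts; this Python version and its file-name pattern are illustrative.
import re

def prepare(filename, text):
    """Return (date, cleaned_text) for a feed file like 'feed_20120115.txt'."""
    match = re.search(r"(\d{8})", filename)        # 8-digit date in the name
    date = match.group(1) if match else None
    cleaned = re.sub(r"[^\x20-\x7E\n]", "", text)  # keep printable ASCII + newlines
    return date, cleaned

date, body = prepare("feed_20120115.txt", "AC\x00ME,42\n")
print(date, body.strip())  # 20120115 ACME,42
```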

Environment: Oracle 11g, SQL, PL/SQL, UNIX, SQL Developer 3.0.03, Beeline, SVN

Openet, Reston, VA

May 2012 – Sep 2012

Oracle PL/SQL Developer

Responsibilities:

Physical designing and development of Database

Worked on PL/SQL in creating complex stored procedures and functions

Created various database objects like tables, indexes and views

Used Collections like variable arrays, nested tables extensively

Writing triggers, stored procedures and functions required to send the Credit Approval Details

Used various control structures, including CASE, DECODE, IF-THEN-ELSE, FOR loops, and WHILE loops, while developing procedures

Used composite data types like %ROWTYPE and %TYPE

Used the advanced features of PL/SQL like Subtypes, Records, Tables, Object types and Dynamic SQL

Wrote complex SQL queries and generated reports using Reports 6i

Resolved several complex business rules and issues at the client site

Environment: Windows XP, IBM AIX, Oracle 8i, SQL, PL/SQL, SQL*Plus, TOAD, SQL*Loader, Oracle Forms & Reports 6i.

Retail Solutions (RSi), Cranston, RI

Nov 2011 – Apr 2012

PL/SQL Developer

Retail Solutions develops and delivers a comprehensive suite of award-winning software-as-a-service (SaaS) solutions that turn downstream data, such as point-of-sale (POS), supply chain, merchandiser feedback, and category data, into actionable visibility into the store and onto the shelf.

Responsibilities:

Coordinated with the front end design team to provide them with the necessary stored procedures and packages and the necessary insight into the data

Worked on SQL*Loader to load data from flat files obtained from various facilities every day

Created and modified several UNIX shell Scripts according to the changing needs of the project and client requirements

Wrote UNIX shell scripts to process files on a daily basis: renaming files, extracting dates from them, unzipping them, and removing junk characters before loading them into the base tables

Involved in the continuous enhancements and fixing of production problems

Generated server side PL/SQL scripts for data manipulation and validation and materialized views for remote instances

Developed PL/SQL triggers and master tables for automatic creation of primary keys

Created PL/SQL stored procedures, functions and packages for moving the data from staging area to data mart

Created scripts to create new tables, views, queries for new enhancement in the application using TOAD

Created indexes on the tables for faster retrieval of the data to enhance database performance

Involved in data loading using PL/SQL and SQL*Loader and cron jobs calling UNIX scripts to download and manipulate files

Performed SQL and PL/SQL tuning and Application tuning using various tools like EXPLAIN PLAN, SQL*TRACE, TKPROF, AUTOTRACE

Extensively involved in using hints to direct the optimizer to choose an optimum query execution plan

Used bulk collections for better performance and easier data retrieval by reducing context switching between the SQL and PL/SQL engines

Created PL/SQL scripts to extract the data from the operational database into simple flat text files using UTL_FILE package

Created database objects such as tables, views, materialized views, procedures, and packages using Oracle tools like TOAD, PL/SQL Developer, and SQL*Plus

Partitioned the fact tables and materialized views to enhance the performance

Extensively used bulk collection in PL/SQL objects to improve performance

Created records, tables, collections (nested tables and varrays) for improving Query performance by reducing context switching

Used PRAGMA AUTONOMOUS_TRANSACTION to avoid the mutating-table problem in database triggers

Extensively used the advanced features of PL/SQL, like Records, Tables, Object types and Dynamic SQL

Handled errors using Exception Handling extensively for the ease of debugging and displaying the error messages in the application

Involved in creating Unix Shell scripts.

Environment: Oracle 11g, SQL * Plus, TOAD, SQL*Loader, SQL Developer, Shell Scripts, UNIX, Windows XP

APS Healthcare

Feb 2011 – Oct 2011

PL/SQL Developer

Responsibilities:

Involved in the Analysis, Design, Coding and Testing of the application

Prepared the MD.070 technical design document for the business requirements

Created and Modified PL/SQL Triggers, Procedures, Functions and packages

Developed ER Diagrams, Data flow diagrams based on the requirement

Developed SQL scripts to create database objects like tables, views and sequences

Used SQL*Loader to load bulk data from various flat files and legacy systems

Developed SQL and PL/ SQL scripts for transfer of data between databases

Developed complex SQL queries for reports

Developed complex triggers in reports before/after for validation of user input

Performed unit testing and supported integration testing and end user testing

Extensively worked on production issues with effective defect management

Involved in logical and physical database design, Identified Fact Tables, Transaction Tables

Proactively tuned SQL queries and performed refinement of the database design leading to significant improvement of system response time and efficiency

Involved in SQL tuning, PL/SQL tuning, and application tuning using various tools such as TKPROF, EXPLAIN PLAN, and DBMS_PROFILER

Extensively worked with table partitioning and table space creation etc

Implemented Application Infrastructure policies, procedures and standards, and ensured their continual compliance

Extensively used transformations such as Router, Aggregator, Normalizer, Joiner, Expression, Lookup, Update Strategy, Sequence Generator, and Stored Procedure. Knowledge of SQL and Java transformations

Environment: Oracle 10g, SQL, PL/SQL, SQL*Loader, Perl/shell scripts, TOAD, Informatica 8.6.0

Anblicks

Feb 2008 – Oct 2010

Application Developer

Developed and supported various in-house projects.

References Available On Request

Bachelor's in Electrical Engineering, 2008, JNT University, India.


