
Oracle PL/SQL Developer

Location:
Santa Clara, CA
Posted:
September 20, 2017


Jigar Parekh

Santa Clara, CA *****, USA.

Mobile: +1-913-***-****.

E-mail: ac2dsn@r.postjobfree.com

Experience Summary

•IT: 10 years of experience delivering quality output in compliance with organization and client processes and standards.

•Primary experience in development, support, and project-planning roles.

•Solid technical experience across the Oracle database family: Oracle 11g, 10g, 9i, and 8i.

•PL/SQL development: coding experience with packages, stored procedures, functions, and triggers, along with UNIX.

•Strong analytical skills using Oracle analytic and aggregate functions.

•Good understanding and usage of SQL*Loader to ingest data into staging tables.

•Consistently write robust code with exception-handling blocks that cover both Oracle-supplied and user-defined exceptions (see the sketch after this list).

•Implemented table-level partitioning.

•Used ref cursors to pass result sets between database code and client applications (also shown in the sketch after this list).

•Certified Big Data Developer.

•Comprehensive experience in Big Data processing using Hadoop and its ecosystem (HDFS, Sqoop, Flume, Hive, Pig, and HBase).

•Knowledge of NoSQL databases such as HBase.

•Sound knowledge of data warehousing concepts such as star schema, snowflake schema, and the Kimball and Inmon methodologies.

•Understanding of Conceptual, Logical and Physical Data modeling.

•Good understanding of the business workflow.

•Single point of contact for business and functional teams for requirement analysis.
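For illustration, a minimal PL/SQL sketch of the ref-cursor and exception-handling patterns listed above; all table, column, and procedure names are hypothetical:

    CREATE OR REPLACE PROCEDURE get_recent_orders (
        p_cust_id IN  NUMBER,
        p_results OUT SYS_REFCURSOR
    ) AS
        v_name         customers.cust_name%TYPE;
        e_bad_customer EXCEPTION;                 -- user-defined exception
    BEGIN
        IF p_cust_id IS NULL OR p_cust_id <= 0 THEN
            RAISE e_bad_customer;
        END IF;

        -- Raises the Oracle-supplied NO_DATA_FOUND if the customer is missing
        SELECT cust_name INTO v_name
          FROM customers
         WHERE cust_id = p_cust_id;

        -- Hand the result set back to the client through a ref cursor
        OPEN p_results FOR
            SELECT order_id, order_date, total_amt
              FROM orders
             WHERE cust_id = p_cust_id
               AND order_date > SYSDATE - 30;
    EXCEPTION
        WHEN e_bad_customer THEN
            RAISE_APPLICATION_ERROR(-20001, 'Invalid customer id');
        WHEN NO_DATA_FOUND THEN
            RAISE_APPLICATION_ERROR(-20002, 'Customer not found');
    END get_recent_orders;
    /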

Currently working in Eden Prairie, MN; in the USA since January 2014.

H-1B visa valid through May 2018.

Technical Skills

•DBMS: Oracle 8i, 9i, 10g, 11g.

•DWH & NoSQL: Hive, HBase.

•Languages: PL/SQL, UNIX shell, HQL, Pig Latin, Sqoop, HBase shell.

•Big Data Platforms: Hortonworks 2.1, Cloudera.

•Tools & Utilities: PL/SQL Developer, TOAD, HP OpenView, Remedy, Visual SourceSafe, SVN, AppWorx, EditPlus, UNIX (telnet), Citrix, SQL Developer, Ambari, CDH5.

Certifications & Achievements

•Big Data Developer training and certification, 2016.

•Oracle Certified Associate (9i), 2009.

•Six Sigma Green Belt trained, 2008.

•Kaizen Certified 2008, with proven implementation experience in reducing daily batch aborts.

Employment History

Cognizant Technology Solutions, Sr. Associate, Nov 2014 - present

Patni / iGATE, Technical Lead, Mar 2007 - Nov 2014

Education

Bachelor of Engineering (Electronics), University of Mumbai

Big Data Training

Attended Big Data Developer training in 2016, conducted by an external provider.

Completed hands-on project and assignment work on a Cloudera VM.

Successfully completed a project on airline data (data ingestion and data transformation) at the end of the course to earn the developer certification.

Important Project Details

Project #1: GBI iTunes

Client: Apple Inc, USA

Duration: Apr’ 14 – Sep’ 17

Role and Location: Big Data Developer; Sunnyvale, USA

GBI (Global Business Intelligence) is part of the iTunes financial application, with dashboards exhibiting KPIs that the analytics and sales teams use to make critical business decisions. Px4 is an application handling the calculation and distribution of royalty payments for Apple Books, Apps, News, and Music vendors. ICA (iTunes Customer Analytics) populates an aggregated form of the iTunes data and presents sales trends for all content across regions and countries on its dashboard.

Project Responsibilities & Achievements:

Handled multiple file formats, including XML, JSON, CSV, and other compressed formats.

Wrote Hive scripts to load data from one stage into another and implemented incremental loads as the data architecture changed.

Automated UNIX scripts to test data loading from the source to the target database.

Created external Hive tables per requirements, defined with appropriate static and dynamic partitions and bucketing for efficiency (see the sketch after this list).

Devised archiving logic using Hive to ship data directly into HDFS.

Developed scripts to run analytics over the ingested data using HQL.
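As referenced above, a minimal HiveQL sketch of this kind of partitioned, bucketed staging-to-target flow; all database, table, and path names are hypothetical:

    -- External staging table over raw daily files (names hypothetical)
    CREATE EXTERNAL TABLE IF NOT EXISTS stg.sales_raw (
        item_id   STRING,
        vendor_id STRING,
        amount    DECIMAL(12,2)
    )
    PARTITIONED BY (sale_date STRING)
    ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
    LOCATION '/data/stg/sales_raw';

    -- Register a newly landed day of files as a static partition
    ALTER TABLE stg.sales_raw ADD IF NOT EXISTS PARTITION (sale_date='2017-09-01');

    -- Bucketed ORC target for more efficient joins and sampling
    CREATE TABLE IF NOT EXISTS dw.sales (
        item_id   STRING,
        vendor_id STRING,
        amount    DECIMAL(12,2)
    )
    PARTITIONED BY (sale_date STRING)
    CLUSTERED BY (vendor_id) INTO 16 BUCKETS
    STORED AS ORC;

    -- Incremental load: rewrite only the newly arrived partition
    SET hive.exec.dynamic.partition.mode=nonstrict;
    SET hive.enforce.bucketing=true;
    INSERT OVERWRITE TABLE dw.sales PARTITION (sale_date)
    SELECT item_id, vendor_id, amount, sale_date
      FROM stg.sales_raw
     WHERE sale_date = '2017-09-01';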

Project #2: HDE – HEDIS Data Engine

Client: Optum Insight (UnitedHealth Group), USA

Duration: Jan’ 14 – Apr’ 17

Role and Location: Oracle Lead; Eden Prairie, USA

HEDIS stands for Healthcare Effectiveness Data and Information Set. Employers, consultants, and consumers use HEDIS data, along with accreditation information, to help them select the best health plan for their needs. To ensure the validity of HEDIS results, all data are rigorously audited by certified auditors using a process designed by the NCQA (National Committee for Quality Assurance). Because so many plans collect HEDIS data, and because the measures are so specifically defined, HEDIS makes it possible to compare the performance of health plans on an "apples-to-apples" basis. Health plans also use HEDIS results themselves to see where to focus their improvement efforts.

Project Responsibilities & Achievements:

Interacted with business users and analysts to understand the scope of requirements, prepared high-level technical documents, and highlighted the impact of changes.

Estimated work and set milestones and deadlines for deliverables.

Coded using advanced PL/SQL concepts and analytic functions such as LEAD, LAG, and LISTAGG.
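For illustration, a hedged sketch of that kind of analytic query; the member-visit table and columns are hypothetical:

    -- Prior/next visit per member, plus diagnosis codes rolled into one string
    SELECT member_id,
           visit_date,
           LAG(visit_date)  OVER (PARTITION BY member_id ORDER BY visit_date) AS prev_visit,
           LEAD(visit_date) OVER (PARTITION BY member_id ORDER BY visit_date) AS next_visit,
           LISTAGG(diag_code, ',') WITHIN GROUP (ORDER BY diag_code)
               OVER (PARTITION BY member_id) AS member_diagnoses
      FROM member_visits;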

Used bulk-collection techniques to improve performance by lowering execution time.
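A minimal sketch of the bulk-collection pattern, with hypothetical table and column names:

    -- Fetch a set of keys in one round trip, then update them with one bulk DML
    DECLARE
        TYPE t_claim_ids IS TABLE OF claims.claim_id%TYPE;
        v_ids t_claim_ids;
    BEGIN
        SELECT claim_id BULK COLLECT INTO v_ids
          FROM claims
         WHERE status = 'PENDING';

        FORALL i IN 1 .. v_ids.COUNT
            UPDATE claims
               SET status = 'PROCESSED'
             WHERE claim_id = v_ids(i);

        COMMIT;
    END;
    /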

Worked with range and list partitions, including creating partitions on previously non-partitioned tables.

Created database objects such as procedures and triggers to fulfill requirements.

Used EXECUTE IMMEDIATE to build dynamic SQL queries.
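A minimal dynamic-SQL sketch; the table name and column are hypothetical, with a bind variable supplying the value:

    -- Count today's rows in a table whose name is known only at run time
    DECLARE
        v_table VARCHAR2(30) := 'HEDIS_STAGE';   -- hypothetical table name
        v_cnt   NUMBER;
    BEGIN
        EXECUTE IMMEDIATE
            'SELECT COUNT(*) FROM ' || DBMS_ASSERT.SIMPLE_SQL_NAME(v_table) ||
            ' WHERE load_date >= :d'
            INTO v_cnt
            USING TRUNC(SYSDATE);
        DBMS_OUTPUT.PUT_LINE(v_table || ' rows loaded today: ' || v_cnt);
    END;
    /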

Closed functional gaps and resolved defects.

Used SQL Trace and TKPROF for performance tuning, along with EXPLAIN PLAN, resulting in lower execution times.
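A typical tuning workflow of this kind, sketched with an illustrative trace-file name:

    -- Trace the session, run the slow statement, then format the trace file
    ALTER SESSION SET tracefile_identifier = 'hedis_tuning';
    ALTER SESSION SET sql_trace = TRUE;
    -- ... execute the statement under investigation here ...
    ALTER SESSION SET sql_trace = FALSE;
    -- Then, from the OS shell (trace-file name is illustrative):
    --   tkprof ORCL_ora_12345_hedis_tuning.trc hedis_tuning.txt sys=no sort=exeela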

Conducted and delivered knowledge-sharing sessions.

Single point of contact for creating, maintaining, and managing user privileges on all four databases: Development, QA, Stage, and Production.

Project #3: GFDM Conversion (Group Facts Data Migration)

Client: AEB (Assurant Employee Benefits), USA

Duration: Aug’ 13 – Nov’ 14

Role and Location: Sr. Oracle Developer; Kansas City, USA

GFDM Conversion is a typical migration project in which data was migrated from the legacy DB2 relational system into the HDFS system. Production conversion events were planned every alternate month starting April 2014, with the goal of building up the pool of data migrated per event by closing functional gaps and fixing the defects encountered.

Project Responsibilities & Achievements:

Meticulously planned the conversion events, which comprised 60 execution steps, at both monthly and weekly levels.

Executed the conversion events as part of the execution team.

Created database objects such as packages, procedures, and functions to increase the pool of data to be migrated.

Hands-on experience with SQL*Loader, loading flat files generated by the mainframe application to migrate the legacy data.
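A minimal SQL*Loader control-file sketch for this kind of flat-file load; the file, table, and column names are hypothetical:

    -- load_policies.ctl: load a pipe-delimited mainframe extract into staging
    LOAD DATA
    INFILE 'policies_20140401.dat'
    BADFILE 'policies.bad'
    APPEND
    INTO TABLE stg_policies
    FIELDS TERMINATED BY '|' TRAILING NULLCOLS
    ( policy_no,
      group_id,
      eff_date DATE "YYYYMMDD",
      premium  DECIMAL EXTERNAL
    )
    -- invoked as: sqlldr control=load_policies.ctl log=policies.log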

Performed deep data analysis to find gaps in the bad records.

Created partitioned and sub-partitioned tables to consume the incoming data.
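A sketch of the kind of composite partitioning involved; the table, partition keys, and boundaries are hypothetical:

    -- Range partitions by conversion date, list sub-partitions by state
    CREATE TABLE conv_policies (
        policy_no NUMBER,
        state_cd  VARCHAR2(2),
        conv_date DATE
    )
    PARTITION BY RANGE (conv_date)
    SUBPARTITION BY LIST (state_cd)
    SUBPARTITION TEMPLATE (
        SUBPARTITION sp_mo    VALUES ('MO'),
        SUBPARTITION sp_ks    VALUES ('KS'),
        SUBPARTITION sp_other VALUES (DEFAULT)
    )
    ( PARTITION p_2014_04 VALUES LESS THAN (DATE '2014-05-01'),
      PARTITION p_2014_06 VALUES LESS THAN (DATE '2014-07-01')
    );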

Introduced a code repository to enable reuse of code for common defects.

Used UNIX for general file-level operations.

Assigned work and defects to associates and the offshore team.

Conducted and delivered knowledge-sharing sessions.

Coordinated with users to discuss defects and provide bug-free solutions.

Gathered statistics and created monthly project reports.

Set up internal processes for better tracking and coordination with offshore.

Project #4: SWIFT (System with Innovative Flexible Transactions)

Client: BUPA (British United Provident Association), UK

Duration: Jul’ 11 – May’ 13 / Staines, UK (Sept’ 11 to Nov’ 11)

Environment: TOAD, SQL Developer

Role and Location: Support Lead; Staines, UK

SWIFT is a healthcare insurance application that handles the entire health-insurance process at BUPA. It maintains everything about the customer: the products and policies taken, their contracts with BUPA, the pre-authorizations and claims they raise, and their billing.

Project Responsibilities & Achievements:

Monitored the daily support queues.

Assigned work tickets to associates.

Wrote complex queries, following PL/SQL coding standards, to produce reports.

Created Oracle packages, procedures, functions, and triggers, and performed analytics on the data using Oracle's built-in analytic functions.

Used SQL*Loader and external tables to load client data and perform transformations on it.

Used UNIX for basic file-handling tasks with grep, find, sed, etc.

Used Oracle-supplied packages such as DBMS_OUTPUT, DBMS_REDEFINITION, and DBMS_STATS, with working knowledge of DBMS_SQL, UTL_MAIL, etc.
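For example, a minimal DBMS_STATS call of the kind used after large loads; the schema and table names are hypothetical:

    -- Refresh optimizer statistics after a bulk load
    BEGIN
        DBMS_STATS.GATHER_TABLE_STATS(
            ownname => 'SWIFT_APP',   -- hypothetical schema
            tabname => 'CLAIMS',      -- hypothetical table
            cascade => TRUE);         -- include dependent indexes
    END;
    /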

Wrote exception-handling blocks to capture errors, making the code robust.

Conducted and delivered knowledge-sharing sessions.

Coordinated directly with users to deliver tasks bug-free and within SLA.

Gathered statistics and created monthly reports on support tickets.

Maintained a code repository to promote code reuse and reduce effort.

Project #5: Track and Trace

Client: HDNL (Home Delivery Network Limited), UK

Duration: Oct’ 10 – Jun’ 11

Software: TOAD, SQL Developer.

Role and Location: Onsite Coordinator; Liverpool, UK.

Track and Trace is the IT hub within HDNL that handles the complete life cycle of a parcel. It is a group of databases, communication devices, and online and offline reports that track parcels through HDNL's operational lifecycle.

Project Responsibilities & Achievements:

Since the logistics domain was new to iGATE Patni and to this project, as onsite coordinator I had to quickly become acquainted with the parcel lifecycle, both functionally and technically.

Solved tickets using Oracle PL/SQL and UNIX.

Created complex PL/SQL stored procedures, functions, and packages to process large amounts of data.

Created tables, views, and triggers.

Onsite-offshore coordination.

Project #6: eManufacturing

Client: P&G, Cincinnati

Duration: Jun’ 09 – Sep’ 10

Software: PL/SQL Developer.

Role and Location: DB Lead; Pune.

eMfg is the intranet used by P&G employees, enabling critical work processes related to manufacturing, packing, and shipping activities at P&G. The application is available to employees as individual bundles: eOrg, eOps, eLog, eLearn, eSecurity, and eQA.

Project Responsibilities & Achievements:

Was part of the development team that created the eQA (eQuality) bundle from scratch.

Involved in requirement gathering.

Created complex PL/SQL stored procedures, functions, and packages to implement business rules across the application, integrating data from multiple sources.

Achieved the highest CSS rating (5/5) for this project.


