Shobhit Ram Patel
Bensalem, PA *****
*******.******@*****.***
Oracle PL/SQL Developer
SUMMARY
Certified Database Developer with 16+ years of IT experience in analysis, design, development, testing, implementation, and production troubleshooting/debugging in Oracle PL/SQL, T-SQL, and MS SQL. Experienced in PL/SQL, including creating Procedures, Functions, Packages, Oracle-supplied Packages, Collections, Partitioned Tables, Triggers, Materialized Views, Table Indexing, Dynamic SQL, Cursors, Ref Cursors, and Bulk Collect techniques.
Very strong experience in database programming and development using Oracle and SQL Server (SQL, PL/SQL, T-SQL), including writing SQL (DQL, DDL, DML, DCL, TCL), Triggers, Views, User-Defined Functions, and complex Stored Procedures.
Experienced in query optimization, debugging, and tuning to improve application performance using Explain Plan, with strong exposure to working with application programming interfaces (APIs). Expertise in tuning complex Oracle queries using multiple approaches, including analyzing AWR reports, explain plans, and execution plans.
Excellent experience in database design and development, data modeling, data analysis, data migration, and conversion. Solid expertise in Oracle query performance tuning, with extensive experience writing modules such as functions, stored procedures, cursors (implicit, explicit, REF cursors, and cursor attributes), packages, and triggers; creating tables; and working with XML/JSON/CLOB data, types, views, and materialized views using Oracle SQL and PL/SQL.
Strong work experience with Oracle 9i/10g/11g/12c/19c database objects such as tables, views, materialized views, triggers, stored procedures, functions, packages, and indexes.
Proficient in Oracle tools and utilities such as TOAD, PL/SQL Developer, and SQL Developer, as well as in Oracle SQL tuning and PL/SQL debugging. Knowledge of advanced PL/SQL features for making existing processes faster, such as BULK COLLECT, FORALL, SAVE EXCEPTIONS, direct-path (APPEND) inserts, and parallel query processing. Experience in performance tuning using various hints, partitioning, indexes, and Explain Plan.
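To illustrate the bulk-processing features listed above, a minimal PL/SQL sketch (the orders table, its columns, and the STALE status are hypothetical):

```sql
DECLARE
  TYPE t_ids IS TABLE OF orders.order_id%TYPE;
  l_ids    t_ids;
  bulk_err EXCEPTION;
  PRAGMA EXCEPTION_INIT(bulk_err, -24381);   -- ORA-24381: error(s) in array DML
BEGIN
  -- Fetch the working set in one round trip instead of a row-by-row cursor loop.
  SELECT order_id BULK COLLECT INTO l_ids
    FROM orders
   WHERE status = 'STALE';

  -- Array DML; SAVE EXCEPTIONS keeps going past individual row failures.
  FORALL i IN 1 .. l_ids.COUNT SAVE EXCEPTIONS
    DELETE FROM orders WHERE order_id = l_ids(i);
EXCEPTION
  WHEN bulk_err THEN
    FOR j IN 1 .. SQL%BULK_EXCEPTIONS.COUNT LOOP
      DBMS_OUTPUT.PUT_LINE('Row '  || SQL%BULK_EXCEPTIONS(j).ERROR_INDEX ||
                           ' err ' || SQL%BULK_EXCEPTIONS(j).ERROR_CODE);
    END LOOP;
END;
/
```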
Experience in relational database design and writing SQL and PL/SQL in Oracle; expertise in database partitioning and performance tuning. Translate business and data requirements into logical data models in support of enterprise data models, operational data structures, and analytical systems; perform source data analysis. Good knowledge of RDBMS concepts, with extensive work on Oracle 12c/11g/10g/9i.
Good knowledge of advanced database replication using materialized views. Knowledge of loading external file data into Oracle using SQL*Loader, external tables, export/import utilities, and Data Pump. Experience tuning queries by analyzing Explain Plan output, DBMS_PROFILER, and TKPROF, and improving procedure performance using bulk collections and bulk exception handling.
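A hedged sketch of the external-table loading approach mentioned above (the directory object, file name, and table layout are hypothetical):

```sql
-- External table over a comma-delimited flat file; DATA_DIR must be an
-- existing Oracle DIRECTORY object pointing at the file's location.
CREATE TABLE ext_customers (
  customer_id NUMBER,
  cust_name   VARCHAR2(100)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY data_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
    MISSING FIELD VALUES ARE NULL
  )
  LOCATION ('customers.csv')
)
REJECT LIMIT UNLIMITED;

-- Direct-path style load into a permanent table:
INSERT /*+ APPEND */ INTO customers
SELECT customer_id, cust_name FROM ext_customers;
COMMIT;
```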
Implementation knowledge of creating table partitions and partitioned indexes, exchanging partitions, and using EXECUTE IMMEDIATE and DBMS_SQL. Good knowledge of Oracle internal architecture; well versed in 11g/12c new features; good knowledge of analytic functions.
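The partition-exchange technique can be sketched as follows (table and partition names are hypothetical; the staging table must match the partitioned table's column structure):

```sql
-- Swap a fully loaded staging table into a partition as a metadata-only
-- operation, avoiding a large INSERT/DELETE:
ALTER TABLE sales
  EXCHANGE PARTITION p_2024_q1
  WITH TABLE sales_stage
  INCLUDING INDEXES
  WITHOUT VALIDATION;

-- The same DDL can be issued from PL/SQL via dynamic SQL:
BEGIN
  EXECUTE IMMEDIATE
    'ALTER TABLE sales EXCHANGE PARTITION p_2024_q1 WITH TABLE sales_stage';
END;
/
```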
Experience as a Database Administrator (DBA): migrating from 10g to 11g using export/import, monitoring jobs, managing space, gathering schema/table statistics, analyzing indexes, maintaining partitions and tablespaces, AWR, and creating users and granting/revoking user permissions and roles; knowledge of STATSPACK and Data Pump (EXPDP, IMPDP).
Good knowledge of and experience with ETL (extract, transform, and load) tools. Involved in development activities such as workflow design and adding transformations, lookups, mappings, partitioning, and indexing.
Good exposure to and understanding of Snowflake; involved in designing pipelines that pick up data from S3 and populate target tables, applying transformations where needed. Migrated data from vendor-generated reports into PostgreSQL databases using PostgreSQL export/import procedures. Experience with ora2pg, a tool for migrating databases from Oracle to PostgreSQL. Experience providing Business Intelligence solutions in data warehousing for decision support systems, and OLTP/OLAP application development on Windows and Unix platforms.
Extensive knowledge of user requirement analysis, system design, writing program specifications, and coding and implementation of systems.
Experience creating ER diagrams, data flow/process diagrams, and data models. Involved in the complete Software Development Life Cycle (SDLC) of data warehousing (ETL) and Power BI projects; worked on project planning and effort estimation as well.
Proficient in performance tuning of complex queries, packages, procedures, functions, transformations, and sessions; experienced in optimizing query performance. Extensively applied normalization concepts while designing data models.
Extensively worked on root cause analysis and data analysis for high-severity issues and resolved them within the given SLA; prepared test cases and performed unit, integration, system, and regression testing.
Worked in an Agile development model, participating in daily stand-up calls and providing daily updates on project progress.
Good knowledge of T-SQL programming, procedures, functions, complex joins, and cursors. Good experience with and knowledge of Unix shell scripting and job scheduling.
Excellent communication and interpersonal skills. Good experience working in an offshore-onshore project delivery model. Served as tech lead, managing the team and the project timeline.
Thorough experience with software development methodology and quality processes. Worked on various projects adhering to the phases of the software development life cycle, including system study, analysis, design, development, testing, and implementation. Good knowledge of data warehousing and relational database concepts.
Provided post-implementation application maintenance and enhancement support to clients, ensuring all SLAs were met. Recognized team player, able to perform well in a group or individually and to cope well in a high-intensity work environment. Possess strong analytical, programming, and communication skills. Designed databases for various projects and provided solutions to meet requirements.
Good knowledge of dimensional data modeling, PL/SQL development, and database partitioning and performance tuning in Oracle. Knowledge of optimizing code and improving query efficiency.
Strong development experience in reference data migration and decommissioning projects, with good knowledge of AWS, Python, Spark, PySpark, Hive, HiveQL, Hadoop, Tableau, and Informatica.
Good knowledge of and exposure to Amazon Web Services (AWS) offerings such as EC2 (Elastic Compute Cloud), S3 (Simple Storage Service), RDS (Relational Database Service), and AWS Lambda. Knowledge of Hadoop HDFS, the big data stack, and NoSQL. Worked with design and development teams on data models for a new HDFS master data reservoir and one or more relational or object current-data environments. Used a NoSQL database, which provides a mechanism for storing and retrieving data modeled in ways other than tabular relations.
Participate in requirement gathering, requirement analysis, and documentation. Extensive experience in client-facing roles and in working with multicultural, multi-geography teams.
TECHNICAL KNOWLEDGE
Tools and Utilities
Toad, PL/SQL Developer, Informatica, Power BI, ServiceNow, Confluence, GitHub, Bitbucket, SVN, Alloy, Spiceworks, Jira, CI/CD, Jenkins pipelines, Hermes deployment tool.
DB Tools
SQL Developer, PL/SQL Developer, TOAD, SQL*Loader, ora2pg, Hadoop, Tableau, AWS.
Databases
Oracle 19c, 12c, 11g R2/11g, MS SQL Server 2012, Sybase 15.7, PostgreSQL, Snowflake.
Languages
Oracle PL/SQL, SQL*Plus, T-SQL, Python, Spark, PySpark, Hive, HiveQL, MS SQL, Aurora DB, MySQL, PL/pgSQL, utPLSQL, PostgreSQL, and SQL Server.
Operating System
Windows NT/2000/2008/10/12/XP/Vista/7, Unix, Linux.
Office Tools
Microsoft Office 2003/2007/2010/2013
Methodologies
SDLC, AGILE, Waterfall
ACADEMIC QUALIFICATION
B.Sc., Govt. College of Science, Raipur (C.G.) (RSU), 2003.
M.C.A., National Institute of Technology Raipur (NIT), formerly GEC Raipur (C.G.) (RSU), 2007.
WORK EXPERIENCE:
Ginnie Mae
Mar 2023 – Present
BNY Mellon, NJ
Sr. Oracle PL/SQL Developer
Description: This project automates the Ginnie Mae application, which manages loan pools for the respective stakeholders, issues loans, and has loans certified by an authorized signer. Data passes through several stages: from MFPD/SFPDM to Ginnie Mae, then to the mainframe EWODS database, and finally to the OLTP database.
Responsibilities
Involved in developing procedures, functions, packages, triggers, partitions, indexes, and sequences, creating the required objects and defining relationships between them. Wrote complex queries for reports per business needs; involved in performance tuning of packaged procedures/functions and report queries, table partitioning/indexing, and gathering statistics for the related objects. Extracted and analyzed sample data from downstream and upstream systems to validate business requirements and created high-level technical specification documents.
Involved in modifying packages, validating the changes, and creating Jira requests for deployment; involved in tuning activities (reading AWR reports and explain plans) and in logical/physical data modeling with data modelers, creating both logical and physical data models. Worked on LOC batch job monitoring for letters of exhibits, fixing data issues and creating new reports as needed by end users. Involved in developing and troubleshooting Oracle APIs.
Worked closely with business/user groups to understand the business process and gather requirements; analyzed business requirements, defined development scope, and developed technical specification documents. Created mapping documents for downstream systems.
Worked on reports and was involved in data model design, data analysis, and new changes required by users; worked on customized reports and database migration as needed. Implemented star schema models for the data warehouses by creating facts and dimensions.
Gained good exposure to Snowflake; involved in designing pipelines that pick up data from S3 and populate target tables, applying transformations where needed. Loaded and unloaded high-volume database tables using SQL*Loader and external tables.
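The S3-to-Snowflake pipeline pattern described here can be sketched as follows (the stage, bucket, table, and column names are hypothetical, and credentials are elided):

```sql
-- External stage pointing at the S3 bucket:
CREATE OR REPLACE STAGE s3_landing
  URL = 's3://example-bucket/loans/'
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

-- Bulk-load into the target table, applying a simple transformation inline:
COPY INTO target_loans (loan_id, amount, load_ts)
FROM (
  SELECT $1, TO_NUMBER($2), CURRENT_TIMESTAMP()
  FROM @s3_landing
)
ON_ERROR = 'CONTINUE';
```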
Prepared technical design documents, use cases, test cases, and user manuals for various modules. Conducted bug fixing, code reviews, and unit, functional, and integration testing.
Ensured the quality, consistency, and accuracy of data in a timely and reliable manner. Created and supported new and existing job runs through the AutoSys scheduler. Provided weekly/monthly status reports to stakeholders.
Exported the full database schema (tables, views, sequences, indexes, triggers) with unique, primary, and foreign keys.
Worked with the design team on optimum storage allocation for the data stores in the architecture and on development of data frameworks for code implementation and testing across the program. Wrote PowerShell scripts for mass deployments/migrations on SQL Server, Reporting Services, and AWS services. Connected to AWS, using S3 buckets to store scripts and objects.
Created and reviewed deployment scripts and was involved in post-deployment verification.
Environment: Unix, Windows 10, Oracle 19c, Oracle 12c, GitHub, Jira, Jenkins, deployment tool.
Loan IQ
Sep 2021 – Mar 2023
Credit Suisse, Princeton, NJ
Sr. Oracle PL/SQL Developer
Description: This project automates the extraction of reports from the Loan IQ (LIQ) database, where data arrives as XML and is extracted into fact tables; the data is then transferred into multiple staging tables based on the mapping sheet, with the relationships stored in an Oracle database.
Responsibilities
Involved in developing procedures, functions, packages, triggers, partitions, and indexes, creating the required objects and defining relationships between them. Wrote complex queries for reports per business needs; involved in performance tuning of packaged procedures/functions and report queries, table partitioning/indexing, and gathering statistics for the related objects. Extracted and analyzed sample data from downstream and upstream systems to validate business requirements and created high-level technical specification documents.
Involved in tuning activities (reading AWR reports and explain plans) and in logical/physical data modeling with data modelers, creating both logical and physical data models. Worked on Control-M jobs and GLR reports as needed by end users. Involved in developing and troubleshooting Oracle APIs (application programming interfaces).
Worked with the Power BI data visualization tool to design, generate, customize, analyze, and test reports per end-user needs.
Worked closely with business/user groups to understand the business process and gather requirements; analyzed business requirements, defined development scope, and developed technical specification documents. Created mapping documents for downstream systems.
Worked on the Modulo report, participated in data model design and new changes required by users, and produced customized reports as needed. Prepared scripts, checked the code into a Git feature branch, and deployed it using a CI/CD pipeline (UrbanCode Deploy/Jenkins); once the build was ready, promoted it to higher environments using the same version number.
Gained experience in Snowflake, loading and unloading high-volume database tables, and was involved in designing pipelines that pick up data from S3 and populate target tables, applying transformations where needed, using SQL*Loader and external tables.
Prepared technical design documents, use cases, test cases, and user manuals for various modules. Conducted bug fixing, code reviews, and unit, functional, and integration testing.
Ensured the quality, consistency, and accuracy of data in a timely and reliable manner. Created and supported new and existing job runs through the AutoSys scheduler. Provided weekly/monthly status reports to stakeholders.
Exported the full database schema (tables, views, sequences, indexes, triggers) with unique, primary, foreign key, and check constraints, and was involved in converting PL/SQL code to PL/pgSQL after the basic conversion, where needed.
Experienced in solving the targeted problems that come with hosting databases in a client-managed data center using AWS EC2 and S3 services.
Worked with the design team on optimum storage allocation for the data stores in the architecture and on development of data frameworks for code implementation and testing across the program. Knowledge of and experience with RDF and other semantic technologies. Participated in code reviews to ensure that developed and tested code conforms to the design and architecture principles.
Worked on GLR/GLP reports. The system integration team focused on supporting loaders, web services, extracts from SWP, and information management, providing a comprehensive set of reporting, data management, and integration tools that keep the business and its clients informed.
Wrote PowerShell scripts for mass deployments/migrations on SQL Server, Reporting Services, and AWS services.
Created and reviewed deployment scripts and was involved in post-deployment verification.
Environment: Red Hat Linux, Windows 10, Oracle 19c, Oracle 12c, GitHub, Jira, Jenkins, UrbanCode Deploy.
FACTS (Family and Child Tracking System)
Oct 2017 – Sep 2021
DHS (Department of Human Services), Philadelphia, PA
Sr. PL/SQL Developer
Description: This project manages records of child abuse cases in Philadelphia, covering reports, investigations, and cases. Reports are initially created by hotline staff; if required, a report becomes an investigation that is shared with the police department. If the case has previous history, the old case is attached; otherwise it is treated as a new case.
Responsibilities
Developed and customized Oracle objects; created complex PL/SQL procedures, packages, functions, and triggers per requirements and to perform data transformations on source data.
Extensively used Toad and PL/SQL Developer to create and customize Oracle objects.
Wrote procedures, functions, and complex queries using available new features; involved in deployment and post-deployment verification, data analysis, and production support activities.
Designed reporting solutions by implementing a reporting database that uses materialized views to store report data, loading the tables via PL/SQL data processing.
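A minimal sketch of a scheduled reporting materialized view of the kind described here (the view, table, and column names are hypothetical):

```sql
-- Reporting MV refreshed hourly rather than recomputed per query:
CREATE MATERIALIZED VIEW mv_case_summary
BUILD IMMEDIATE
REFRESH COMPLETE
START WITH SYSDATE NEXT SYSDATE + 1/24
AS
SELECT case_type,
       COUNT(*)             AS case_count,
       MAX(last_updated_dt) AS last_activity
FROM   cases
GROUP  BY case_type;
```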
ETL: extracted and analyzed sample data from downstream and upstream systems to validate business requirements and created high-level technical specification documents.
In ETL, involved in developing and customizing modules, including workflow design and creating mappings/transformations based on source and target tables; involved in performance tuning of the developed code.
Used session partitioning schemes or database partitioning.
Worked on day-to-day report requests from the business and on bulk data fixes per functionality, applying advanced analytic functions; worked on several data research requests from business users.
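As an example of the analytic-function style of bulk data fix mentioned here, a hedged sketch using a hypothetical case_reports table (dedupe keeping only the latest report per case):

```sql
-- ROW_NUMBER() ranks rows within each case by recency; anything ranked
-- below 1 is an older duplicate and gets deleted by ROWID.
DELETE FROM case_reports
WHERE rowid IN (
  SELECT rid FROM (
    SELECT rowid AS rid,
           ROW_NUMBER() OVER (PARTITION BY case_id
                              ORDER BY report_dt DESC) AS rn
    FROM case_reports
  )
  WHERE rn > 1
);
```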
Ran batch jobs and fixed issues on job failure, debugging the code and applying data patches where required. Wrote complex queries for reports per business needs.
Analyzed application issues, developed procedures using analytic functions, fixed data issues, and provided technical implementation and architectural expertise for system and design changes required for re-engineering.
Involved in solving production issues/tickets created by users and communicating with the respective teams accordingly.
Debugged the existing code to understand and fix issues raised in data research requests from business users.
Worked on performance tuning and query optimization using Explain Plan; analyzed all related Oracle objects and Oracle/AutoSys jobs referencing the affected columns. Performed design and code reviews and prepared detailed documentation.
Gained good knowledge of and training on Amazon Web Services (AWS) offerings such as EC2 (Elastic Compute Cloud), S3 (Simple Storage Service), RDS (Relational Database Service), and AWS Lambda.
Environment: Windows 10, Oracle 19c, Oracle 12c, Informatica 10.1, AutoSys, GitHub, Jenkins.
KYC (Know Your Customer) and Equity
Dec 2014 – Oct 2017
Citibank, Jacksonville, FL
Oracle PL/SQL Developer
Description: This project manages customer records in the application and makes them visible across regions according to the rules and laws of each customer's region. The application changes over time as regulators introduce new decisions or rules for any region. Customer records inside the KYC application are highly secure and confidential and are not disclosed to anyone without proper approval. The application also maintains a single record per customer across the globe to avoid data redundancy and duplicate records. It has many modules, such as anti-money laundering and country appendix/cards, and records customer information such as name, address, source of income, political exposure, organization size, income range, and charitable status. KYC gathers detailed information on all customers and follows all rules made by local government bodies.
Responsibilities
Analyzed the current technology dispositions and migrated the applications to a private cloud with upgraded technologies.
Generated implementation assessment reports, remediation reports, and UAT reports for all migrated applications.
Used Toad and PL/SQL Developer to create and customize Oracle objects; developed and customized Oracle object workflows; created complex PL/SQL procedures, packages, functions, and triggers to perform data transformations on source data.
Involved in developing procedures, functions, and packages, creating the required objects and relationships; performed tuning of packaged procedures/functions and report queries, table partitioning/indexing, statistics gathering for the related objects, and rewriting of logic inside the code. Analyzed business requirements, defined development scope, and developed technical specification documents.
Worked on Unix shell scripting, running and monitoring Unix jobs for the Hermes deployment tool.
As lead PL/SQL developer on this project, provided technical implementation and architectural expertise for system and design changes required for re-engineering.
Worked on Crystal Reports, using complex queries or calling stored/packaged procedures and generating reports per user needs in the form of charts or graphs.
Designed reporting solutions by implementing a reporting database that uses materialized views to store report data, loading the tables via PL/SQL data processing.
Worked on business requests and bulk data fixes per functionality, applying advanced analytic functions; analyzed XML and validated the data; developed objects, created HTML, and sent automated emails as required by users.
Worked on several data research requests from business users, debugging the existing code to understand and fix issues. Created database objects per new requirements and designed the relationships between them.
Wrote complex queries for reports per business needs; involved in performance tuning of packaged procedures/functions and report queries.
Analyzed application issues, developed procedures using analytic functions, fixed data issues, and provided technical implementation and architectural expertise for system and design changes required for re-engineering.
Go-Live Implementation and Post Go-Live Support.
Enhanced the automation scripts for validation and migration of the database.
Gained exposure to the SEI Wealth Platform (SWP). The system integration team focused on supporting loaders, web services, extracts from SWP, and information management, providing a comprehensive set of reporting, data management, and integration tools that keep the business and its clients informed.
Handled various database refresh requests.
Environment: Red Hat Linux, Windows 10, Oracle 12c/11g, AutoSys, Confluence, Jira.
Harmonized Reporting Format (HRF and Equity) and Force
Oct 2013 – Nov 2014
Citibank, Pune, India
Sr. PL/SQL Developer
Description: This project automates the extraction of reports from the Ocean system, where data arrives in raw form and is extracted into fact tables; the data is then transferred into multiple staging tables based on the mapping sheet and reference data, with the relationships stored by Ocean query name.
Responsibilities
Involved in creating procedures, functions, and packages, creating the required objects and relationships. Worked closely with business groups to understand the business process and gather requirements; analyzed business requirements, defined development scope, and developed technical specification documents.
Met with business/user groups to understand the business process and gather requirements; created mapping documents for downstream systems. Created XML, validated records, fixed issues, and worked with CSS (Cascading Style Sheets) to generate HTML.
Extracted and analyzed sample data from downstream and upstream systems to validate business requirements and created high-level technical specification documents.
Wrote complex queries for reports per business needs and was involved in performance tuning of packaged procedures/functions and report queries.
Participated in logical/physical data model discussions with data modelers and created both logical and physical data models. Implemented star schema models for the data warehouses by creating facts and dimensions.
Loaded and unloaded high-volume database tables using SQL*Loader and external tables.
Prepared technical design documents, use cases, test cases, and user manuals for various modules. Conducted bug fixing, code reviews, and unit, functional, and integration testing.
Ensured the quality, consistency, and accuracy of data in a timely and reliable manner. Created and supported new and existing job runs through the AutoSys scheduler. Provided weekly/monthly status reports to stakeholders.
Created and reviewed deployment scripts and was involved in post-deployment verification.
Environment: Red Hat Linux, Windows 10, Oracle 10g, AutoSys, Confluence, Jira, GitHub, Jenkins, Hermes deployment tool.
EQUITY (RRR to TES Data Transformation)
Feb 2013 – Oct 2013
Polaris (Credit Suisse contractor), Pune, India
Tech Lead / Sr. PL/SQL Developer
Description: This project automates the extraction of reports from the MARS system, where data arrives as XML and is extracted into fact tables; the data is then transferred into multiple staging tables based on the mapping sheet, with the relationships stored by MARS query name.
Responsibilities
Involved in developing procedures, functions, and packages, creating the required objects and relationships. Wrote complex queries for reports per business needs; involved in performance tuning of packaged procedures/functions and report queries, table partitioning/indexing, and gathering statistics for the related objects.
Worked closely with business groups to understand the business process and gather requirements; analyzed business requirements, defined development scope, and developed technical specification documents.
Worked as lead developer, involved in project planning, dividing work among the team, and tracking project status; communicated with the respective teams about any showstoppers.
Involved in requirement gathering, planning, and budgeting, and in the full SDLC of the project until delivery; also supported production support teams as needed.
Met with business user groups to understand the business process and gather requirements; created mapping documents for downstream systems.
Extracted and analyzed sample data from downstream and upstream systems to validate business requirements and created high-level technical specification documents.
Exported the full database schema (tables, views, sequences, indexes) with unique, primary, foreign key, and check constraints, and was involved in converting PL/SQL code to PL/pgSQL after the basic conversion, where needed.
Participated in logical/physical data model discussions with data modelers and created both logical and physical data models. Implemented star schema models for the data warehouses by creating facts and dimensions.
Loaded and unloaded high-volume database tables using SQL*Loader and external tables.
Prepared technical design documents, use cases, test cases, and user manuals for various modules. Conducted bug fixing, code reviews, and unit, functional, and integration testing.
Ensured the quality, consistency, and accuracy of data in a timely and reliable manner. Created and supported new and existing job runs through the AutoSys scheduler. Provided weekly/monthly status reports to stakeholders.