Oracle Database Developer
Over ** years of IT consulting experience as an Oracle PL/SQL Developer in analysis, design, modeling, development, implementation and production support of applications in client/server environments using Oracle RDBMS.
Extensive experience in Oracle RDBMS design and in developing stored procedures/packages, functions, triggers, views, materialized views and advanced PL/SQL constructs (bulk collections, ref cursors, dynamic SQL, records and PL/SQL tables) using SQL and PL/SQL, implementing best practices to maintain optimal performance (see the sketch following this summary).
Extensive experience in performance tuning (SQL, PL/SQL) using Oracle Explain Plan, SQL Trace, TKPROF, TOAD and optimizer hints.
Experience in writing complex SQL scripts using statistical aggregate functions and analytical functions to support ETL in data warehouse environments.
Partitioned large tables using range partitioning. Experience with Oracle-supplied packages such as UTL_FILE and DBMS_SQL.
Good knowledge of Oracle architecture, including memory structures and logical and physical storage structures.
Extensive experience in writing UNIX shell scripts (ksh, csh, sh, Bash) to load data from flat files into the database, plus cron jobs, import/export, Data Pump and backups.
Experienced in Extraction, Transformation and Loading (ETL) processes.
Delivered multiple migration projects from legacy systems to Oracle, including data migration and data conversion.
Complete SDLC experience: requirements gathering, analysis, design, coding, testing, implementation and maintenance.
Good knowledge of logical and physical data modeling using normalization techniques.
Created shell scripts to invoke SQL scripts and scheduled them using Autosys.
Experience with data flow diagrams, data dictionaries, database normalization techniques, and entity-relationship modeling and design.
Working knowledge of Python scripting.
Quick to learn and adopt new technologies.
Highly collaborative team player who coordinates and communicates well across teams and learns and applies new methods and strategies with proven results.
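A minimal sketch of the bulk collection, ref cursor and dynamic SQL techniques listed above; the EMPLOYEES table, EMP_ID column and batch size are illustrative assumptions, not objects from any client engagement.

    DECLARE
      TYPE t_id_tab IS TABLE OF NUMBER;          -- PL/SQL collection (nested table)
      l_ids t_id_tab;
      l_cur SYS_REFCURSOR;                       -- weakly typed ref cursor
    BEGIN
      -- Dynamic SQL opened through a ref cursor with a bind variable
      OPEN l_cur FOR 'SELECT emp_id FROM employees WHERE dept_id = :b1' USING 10;
      LOOP
        -- Bulk collect in batches of 1000 to bound PGA memory use
        FETCH l_cur BULK COLLECT INTO l_ids LIMIT 1000;
        EXIT WHEN l_ids.COUNT = 0;
        FOR i IN 1 .. l_ids.COUNT LOOP
          NULL;                                  -- per-row processing goes here
        END LOOP;
      END LOOP;
      CLOSE l_cur;
    END;
    /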
EDUCATION
M.C.A. (Master of Computer Applications), Nagarjuna University, Guntur, India.
M.Tech. IT (Master of Technology in Information Technology), Punjabi University, Patiala, India.
SKILLS
Cloud Technologies: Snowflake cloud data warehouse, AWS S3.
Database Tools: Erwin, TOAD, TOAD Data Modeler, Oracle SQL Developer, Oracle Enterprise Manager, Import/Export, SQL*Loader, Oracle Tuning Pack, Oracle Performance Pack.
RDBMS: Oracle Exadata, Oracle 7.x/8.x/9i/10g/11g RAC/12c/19c, MS SQL Server 2005/2008, Netezza.
ETL Tools: Informatica PowerCenter 10.2.0, Oracle Data Integrator (ODI 11g/12c), Dell Boomi.
Hardware: Oracle Exadata, Sun Ultra 60 (multiprocessor) and Netra (quad processor).
Operating Systems: UNIX, Sun Solaris 2.8, IBM AIX 4.2, Windows NT, Windows Server, Windows 11 Pro.
Languages: SnowSQL, SQL, PL/SQL, Python, C, C++, Pro*C and UNIX shell programming.
Others: Agile, vi editor, ksh, bsh, csh, FTP, cron (scheduler), AWK, SED, ClearCase, data structures, Jira, Visual SourceSafe and SVN.
PROFESSIONAL EXPERIENCE
Client: Electronic Transaction Consultants Duration: Feb 2024 – Current
Role: Oracle PL/SQL Developer.
Project: TOLL MANAGEMENT SYSTEM
Environment: Oracle Exadata, Oracle 19c, SQL, PL/SQL, SQL*Plus, TOAD 14.0.75.662, VMware Horizon Client (for remote connection), Git Bash, Bitbucket (code repository), GitHub, shell scripting, Windows 11 Pro Enterprise, Autosys, Jira, PDB tools, Cygwin, Docker, Git Extensions, Jenkins, uDeploy.
Team Size: 10
Description: Electronic Transaction Consultants (ETC) is a recognized expert in information systems for the transportation industry. ETC serves three of the top 15 toll authorities in the United States.
Responsibilities:
Supported the migration from TXDOT to the Harris County Toll Road Authority (HCTRA).
Created ACCOUNTS_CONV_PKG to stage and load TXDOT toll tag accounts into the HCTRA database.
Created TOLL_TAGS_CONV_PKG to stage and load TXDOT toll tag data into the HCTRA database.
Enhanced the PMT_POSTING_IMPL package for TXDOT other-credit and other-debit requirements.
Created data load packages for person/email/phone/vehicle data from the TXDOT stage into the HCTRA DB.
Worked on production support incidents with the UI/MT teams, reproduced issues in lower environments, created user stories and fixed the issues.
Created OCR summary and detail report views for a given date range, facility and plaza.
Modified existing PL/SQL packages/functions/procedures and performed debugging, troubleshooting, problem solving and tuning to improve performance.
Worked with the UI/MT teams to provide them with the necessary seeding scripts, stored procedures and packages, and the necessary insight into the data.
Involved in performance tuning (complex SQL, packages) using Oracle tools such as Explain Plan, TKPROF, TOAD and AWR (see the sketch after this list).
Supported QA/UAT testing and generated DB builds from code migrations to all non-production environments.
Used hints for query performance and was involved in database performance tuning.
Enhanced and debugged the existing TAG_OWNER packages per business requirements.
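For illustration, a minimal Explain Plan workflow of the kind referenced above; the TOLL_TAGS table, TAG_STATUS_IDX index and column names are hypothetical placeholders, not ETC schema objects.

    -- Capture the optimizer's plan for a candidate query (index hint shown)
    EXPLAIN PLAN FOR
      SELECT /*+ INDEX(t tag_status_idx) */ t.tag_id, t.account_id
      FROM   toll_tags t
      WHERE  t.tag_status = 'ACTIVE';

    -- Render the captured plan from PLAN_TABLE
    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);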
Client: Fidelity Investments Duration: Jul 2022 – Dec 2023
Role: Oracle PL/SQL Developer (Remote).
Project: Investment Compliance Engine (ICE)
Environment: Oracle Exadata, Oracle 19c, SQL, PL/SQL, SQL*Plus, TOAD 14.0.75.662, Citrix Workspace (for remote connection), Git Bash, GitHub, Bitbucket (code repository), shell scripting, Windows 10 Enterprise, Autosys, Jira, IntelliJ IDEA, Karate automation, Jenkins, uDeploy.
Team Size: 15
Description: Fidelity Investments is an American multinational financial services corporation based in Boston and one of the largest asset managers in the world, with $4.3 trillion in assets under management.
Responsibilities:
Enhanced the business entity and security cache load packages to ignore bad numeric data received from MDM.
Developed a new data load package for the rating schema factor and business entity factor rating tables from the MDM schema.
Fixed performance issues in the rule violation work item generation package: removed full table scans by introducing a function-based index and inline views, and reordered the necessary join conditions (see the sketch after this list).
Developed new data quality check procedures and added exception logic to the assignment override, broker, external accounts, trader and rule cache jobs.
Cleaned up DB code for the numerator and denominator columns in the rule and assignment tables/packages and updated the existing references to the new tables.
Code cleanup: deprecated the unused stored procedures in the packages.
Worked on production support incidents with the UI/MT teams, reproduced issues in lower environments, created user stories and fixed the issues.
Enhanced the security data load packages and ran the Autosys jobs in higher environments to support release regression testing.
Modified existing PL/SQL packages, functions and procedures and performed debugging, troubleshooting, problem solving and tuning to improve performance.
Worked with the UI/MT teams to provide them with the necessary seeding scripts, stored procedures and packages, and the necessary insight into the data.
Involved in performance tuning (complex SQL, packages) using Oracle tools such as Explain Plan, TKPROF, TOAD and AWR.
Supported QA/UAT/PIT/performance testing and generated DB builds from code migrations to all non-production environments.
Used hints for query performance and was involved in database performance tuning using partitioning and indexes.
Enhanced and debugged the existing ICE engine packages per business requirements.
Created Autosys jobs to schedule daily runs.
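A hedged sketch of the function-based index approach mentioned above; RULE_VIOLATION, STATUS and the index name are illustrative assumptions, not ICE schema objects.

    -- Index the expression itself so the predicate below can avoid a full table scan
    CREATE INDEX rule_violation_status_fbi
      ON rule_violation (UPPER(status));

    -- The optimizer can now satisfy this filter via the function-based index
    SELECT violation_id
    FROM   rule_violation
    WHERE  UPPER(status) = 'OPEN';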
Client: Bank of America Duration: Nov 2021 – Jun 2022
Role: Oracle PL/SQL Developer (Remote).
Project: Operational Risk and Control (OPRC) Management
Environment: Oracle Exadata, Oracle 19c, SQL, PL/SQL, SQL*Plus, TOAD 13.1.1.5, VMware Horizon Client (for remote connection), Git Bash, Bitbucket (code repository), shell scripting, Windows 10 Enterprise, Autosys, Jira.
Team Size: 10
Responsibilities:
Analyzed business requirements and translated them into a technical design document for the Strategic Data Analytics and Reporting (SDAR) project, which includes data warehousing and BI reporting.
Created dimension tables, temp tables, stage tables, load views, BI_VIEWs and EDD_VIEWs for reporting.
Created load views to load data from source to dimension tables, plus additional views for downstream systems to pull data for display in UI reports.
Wrote complex SQL queries using the WITH clause and the analytical functions LAG and LEAD to derive end dates in dimension tables (see the sketch after this list).
Used hints (PARALLEL) for query performance and was involved in database performance tuning using partitioning and indexes.
Analyzed user queries and recommended query changes, new and revised table structures, and load process changes to improve performance.
Developed SQL queries and stored procedures using PL/SQL to retrieve from and insert into multiple database schemas.
Created job framework entries in Autosys for loads (full load and delta load) and was involved in setting up and scheduling the Autosys jobs that load data into tables.
Developed and modified database tables, indexes, BI views, EDD views, ALTER and rollback scripts, grants (with grant option), triggers, exception handling, packages, functions and procedures.
Created PL/SQL scripts using cursors, ref cursors, collections, bulk binds, packages, procedures, functions and dynamic SQL; reviewed query execution plans with EXPLAIN PLAN; and wrote SQL scripts to monitor performance.
Modified existing PL/SQL packages, functions and procedures and performed debugging, troubleshooting, problem solving and tuning to improve performance.
Supported UAT/QA testing and code migrations from development to QA and from QA to UAT environments.
Participated in SDAR Legends and OPRC (OpEx Process, Risks and Controls) daily scrum stand-up meetings, Legends connect calls and OPRC team checkpoint calls to review accomplishments, issues, dependencies and pending daily tasks.
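A simplified sketch of the LEAD-based end-date derivation described above; DIM_RISK, RISK_ID and EFF_DATE are hypothetical names, not the actual SDAR objects.

    WITH ordered AS (
      SELECT risk_id,
             eff_date,
             -- next version's effective date, per risk, in time order
             LEAD(eff_date) OVER (PARTITION BY risk_id ORDER BY eff_date) AS next_eff
      FROM   dim_risk
    )
    SELECT risk_id,
           eff_date                             AS start_date,
           -- close each record one day before the next version starts;
           -- the current (open-ended) record gets a high end date
           NVL(next_eff - 1, DATE '9999-12-31') AS end_date
    FROM   ordered;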
CITI Bank, Irving, TX Duration: Nov 2019 – Oct 2021
Role: Oracle PL/SQL Developer.
Project: GENESIS
Environment: Snowflake cloud data warehouse, Linux, Python, flat files, JSON, CSV, Netezza, Oracle, PL/SQL, Agile, Jira, SQL*Loader, Informatica 10.2.0, TOAD, shell scripting, Autosys.
Team Size: 20
Description: Genesis is a data warehouse that receives feeds from different sources and serves as a centralized database for various downstream reporting systems.
Responsibilities:
Managed feeds coming into the Genesis system.
Responsible for all activities related to the development, implementation, administration and support of ETL processes for a large-scale Snowflake cloud data warehouse.
Created database triggers, views, materialized views, procedures/functions and packages using PL/SQL.
Involved in performance tuning (complex SQL, packages) using Oracle tools such as Explain Plan, TKPROF, TOAD and AWR.
Created complex materialized views to generate subscription data files for downstream systems.
Worked on Oracle's bulk loading features and SQL*Loader for high-volume data.
Worked with development teams to understand and make recommendations on proposed database structures and data flows.
Created Snowpipe pipes for continuous loading of feed files (see the sketch after this list).
Created Python scripts to upload/download files to/from the S3 bucket.
Handled large and complex data sets (JSON, ORC, Parquet and CSV files) from various sources staged in AWS S3.
Involved in data analysis and ad-hoc requests, interacting with business analysts and clients and resolving issues as part of production support.
Unit tested data between Netezza and Snowflake.
Extensively involved in analyzing explain plans and creating indexes for high-volume data tables.
Worked with multiple data sources such as Oracle, external stages (AWS S3) and Netezza.
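An illustrative Snowpipe definition for the continuous feed loading described above; the stage, pipe, table and S3 bucket names are placeholders, not the Genesis objects.

    -- External stage pointing at the S3 landing area (credentials elided)
    CREATE OR REPLACE STAGE genesis_stage
      URL = 's3://example-bucket/feeds/';

    -- Pipe that auto-ingests files as they arrive in the stage
    CREATE OR REPLACE PIPE genesis_feed_pipe
      AUTO_INGEST = TRUE
    AS
      COPY INTO feed_staging
      FROM @genesis_stage
      FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);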
Rexel, USA Duration: Apr 2019 – Oct 2019
Role: Oracle PL/SQL Developer.
Project: Master Data Management (MDM)
Environment: Oracle Database 12c Enterprise Edition, PL/SQL (stored procedures, functions, packages, cursors, triggers, etc.), Agile, Jira, Kanban, SQL*Loader, Informatica 10.2.0, TOAD, Erwin, Import/Export, Linux, UNIX scripting (Korn shell, Bourne shell, SED, etc.).
Team Size: 10
Responsibilities:
Worked with business and development teams to understand and make recommendations on proposed database structures and data flows.
Wrote a package to load/stage GEXPRO member, enterprise account, member sales, user group and product data into the MDM staging schema.
Loaded the Platt purchase order data into the stage schema and created views and materialized views in the DWCOMMON environment.
Reviewed and approved the source-to-target Informatica mapping documents.
Designed and developed mappings and workflows to extract from different sources such as DB2, SQL Server and Oracle using Informatica PowerCenter.
Enhanced and debugged the existing Informatica ETL processes per business requirements.
Extensively used SQL*Loader scripts to load flat file data into the Oracle database.
Involved in performance tuning (complex SQL, packages) using Oracle tools such as Explain Plan, TKPROF, TOAD and AWR.
Understood the data sources end to end, including data extraction, data processing and report delivery.
Created database triggers, views, materialized views, procedures/functions and packages using PL/SQL.
Participated in scrum meetings and ensured on-schedule implementation using agile methods.
Post-implementation support: provided support as part of post-production/delivery activities, including weekly review of issues and recommendations as input for subsequent design and change activities.
Avaya Inc Duration: Dec 2018 – Mar 2019
Role: Oracle PL/SQL Developer.
Project: A1SR
Environment: Oracle Database 12c Enterprise Edition, PL/SQL (stored procedures, functions, packages, cursors, triggers, etc.), Agile, Jira, Kanban, SQL*Loader, TOAD, Import/Export, Linux, UNIX scripting.
Team Size: 8
Description: A1SR stands for "Avaya (A) One (1) Source (S) Renewals (R)". It is a web-based application platform used by Avaya business users worldwide for managing "Support Contracts" and "Heritage Avaya Maintenance and Support Contracts".
Responsibilities:
Worked with business and development teams to understand and make recommendations on proposed database structures and data flows.
Involved in tuning various processes involving database architecture, design and technical performance.
Worked back any issues identified during design or development to the BA (requirements) and tech lead (architecture).
Performed pre-production acceptance testing for quality assurance.
Provided recommendations for architecture and process improvements.
Traced errors in stored procedures, functions and packages and fixed them.
Designed, maintained and managed tools to automate various operational processes.
Established standards and guidelines for the design, development, tuning, deployment and maintenance of information, advanced data analytics and physical data persistence technologies.
Created database triggers, cursors, procedures/functions and packages using PL/SQL.
Vizient Inc., Irving, TX Duration: July 2011 – Nov 2018
Title: Database Architect (PL/SQL Programming).
Project: Spend Management System (NRE, Oracle Exadata)
Environment: Oracle Exadata, Oracle 10g RAC, Oracle Database 12c Enterprise Edition, PL/SQL (stored procedures, functions, packages, cursors, triggers, etc.), Python, Agile, Kanban, Oracle Data Integrator (ODI 11g/12c), SQL*Plus, SQL*Net, SQL*Loader, TOAD, Erwin, Import/Export, Linux, UNIX scripting (Korn shell, Bourne shell, SED, etc.), Quality Center 8.0, Java, J2EE, Hibernate, Eclipse and SVN.
Team Size: 20
Description: Vizient Inc. is the largest member-driven health care performance improvement company in America. Backed by network-powered insights in the critical areas of clinical, operational and supply chain performance, Vizient empowers members to deliver exceptional, cost-effective care at every turn.
Responsibilities:
Responsibilities included gathering requirements, gaining business consensus, performing vendor and product evaluations, mentoring business and development resources, and delivering solutions and documentation.
Established standards and guidelines for the design, development, tuning, deployment and maintenance of information, advanced data analytics and physical data persistence technologies.
Worked with development teams to understand and make recommendations on proposed database structures and data flows.
Performed release management through Agile sprints.
Wrote packages to load/stage vendor master, item master, purchase order, accounts payable and member files.
Responsible for monitoring AWR reports and performing database and SQL tuning to keep the system at optimal performance.
Used Oracle Data Integrator (ODI) to copy Vizient contract, production, member account and member price list data from Oracle Exadata into the SQL Server CDS_IM database.
Extensively worked on complex mappings involving multiple transformations such as Joiner, Aggregator, Filter, Sequence Generator, Expression and Sorter.
Reviewed and approved the source-to-target mapping documents.
Involved in tuning various processes involving database architecture, design and technical performance.
Post-implementation support: provided support as part of post-production/delivery activities, including weekly review of issues and recommendations as input for subsequent design and change activities.
Extensively used SQL*Loader scripts to load flat file data into the Oracle database.
Involved in performance tuning (complex SQL, packages) using Oracle tools such as Explain Plan, TKPROF, TOAD and AWR.
Defined data mapping strategies to effectively translate data from the source to the target system, considering data types, formats and business rules.
Heavily used Oracle Data Integrator (ODI) to migrate SQL Server data received from different DB sources into the Spend Management System (NRE, Oracle Exadata).
Used Erwin and Toad data modeling tools to create database structures.
Participated in scrum meetings and ensured on-schedule implementation using agile methods.
AmeriPath, TX Duration: July 2010 – July 2011
Title: Database Architect (PL/SQL Programming).
Environment: Oracle 10g RAC, PL/SQL (stored procedures, functions, packages, cursors, triggers, etc.), SQL*Plus, SQL*Net, SQL*Loader, TOAD, Erwin, Import/Export, UNIX scripting (Korn shell, Bourne shell, SED, etc.), Quality Center 8.0, Java, J2EE, Hibernate, Eclipse and SVN.
Team Size: 10
Responsibilities:
Wrote a new package for Bone Marrow Serial Reporting.
Wrote procedures/functions to load patient reports from non-Pathway labs into the Pathway LIS system.
Consolidated HLPR schema objects into the Pathway schema (modifying the corresponding package functions and materialized views in the Pathway database) for performance and easier maintenance.
Wrote procedures/functions to enable the 'Mnemonic' search feature in the Pathway application.
Involved in conversion/data migration of the Jackson, MS lab into the Pathway LIS system; wrote SQL*Loader scripts to copy the Jackson, MS lab data into the Pathway system.
Reloaded the ACF FL lab's modified HLPR reports into the Pathway system.
Extensively used the DBMS_LOB and XML DOM packages to read the labs' historical input data (XML files) and load it into the Pathway HLPR tables.
Involved in master patient index (MPI) validation and performance tuning.
Wrote and modified lab conversion validation procedures.
Traced errors in stored procedures, functions and packages and fixed them.
Extensively used SQL*Loader scripts to load flat file data into the Oracle database.
Involved in performance tuning (complex SQL, packages) using Oracle tools such as Explain Plan, TKPROF, TOAD and AWR.
Created database triggers, cursors, procedures/functions and packages using PL/SQL.
Used Erwin and Toad data modeling tools to create database structures.
Used TOAD and temporary tables to load Excel sheet data into the Pathway schema.
ConocoPhillips, OK Duration: Aug 2009 – Jun 2010
Title: Data Analyst/Developer (PL/SQL Programming).
Environment: Oracle 10g RAC, PL/SQL (stored procedures, functions, packages, cursors, triggers, etc.), SQL*Plus, SQL*Net, SQL*Loader, TOAD, Erwin, Import/Export, UNIX scripting (Korn shell, Bourne shell, SED, etc.), Quality Center 8.0 and Visual SourceSafe 6.0.
Team Size: 12
Responsibilities:
Wrote SCADA interface PL/SQL scripts to read Cygnet input data and load (insert/update) it into EC data views.
Wrote handheld and run sheet interface scripts for gas, meter, tank, truck and well data to load into EC.
Wrote interfaces (PL/SQL procedures) for WELL_INJ and tank data to load the SCADA data into EC data views.
Extensively used the UTL_FILE package to read the Cygnet Historian input data (UNIX flat files), insert the data into the Oracle database and write rejected and erroneous records to bad files (see the sketch after this list).
Developed a generic procedure to extract the fields from a comma-delimited input string and pass them back as a nested array.
Wrote PL/SQL functions for Business Objects well test and well measure reporting views.
Wrote complex SQL (subqueries and join conditions) in PL/SQL.
Traced errors in stored procedures, functions and packages and fixed them.
Wrote UNIX shell scripts to extract metadata from flat files into a specific format and load it into the Oracle database using SQL*Loader.
Involved in performance tuning (SQL, PL/SQL) using Oracle tools such as Explain Plan, TKPROF, Autotrace and AWR.
Created database triggers, cursors, procedures/functions and packages using PL/SQL.
Extensively used cron and at commands to automate and schedule batch processes.
Analyzed and escalated reports to higher management.
Developed UNIX shell scripts to automate repetitive database processes.
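A simplified sketch of the UTL_FILE read and comma-delimited split described above; the DATA_DIR directory object, file name and field handling are illustrative assumptions, not the actual Cygnet interface code.

    DECLARE
      TYPE t_fields IS TABLE OF VARCHAR2(4000);  -- nested array of parsed fields
      l_file   UTL_FILE.FILE_TYPE;
      l_line   VARCHAR2(32767);
      l_fld    VARCHAR2(4000);
      l_idx    PLS_INTEGER;
      l_fields t_fields;
    BEGIN
      l_file := UTL_FILE.FOPEN('DATA_DIR', 'cygnet_feed.dat', 'r');
      LOOP
        BEGIN
          UTL_FILE.GET_LINE(l_file, l_line);     -- raises NO_DATA_FOUND at EOF
        EXCEPTION
          WHEN NO_DATA_FOUND THEN EXIT;
        END;
        -- Split the comma-delimited line into the nested array
        -- (this simple pattern skips empty fields)
        l_fields := t_fields();
        l_idx := 1;
        LOOP
          l_fld := REGEXP_SUBSTR(l_line, '[^,]+', 1, l_idx);
          EXIT WHEN l_fld IS NULL;
          l_fields.EXTEND;
          l_fields(l_fields.COUNT) := l_fld;
          l_idx := l_idx + 1;
        END LOOP;
        NULL;                                    -- insert l_fields into the target here
      END LOOP;
      UTL_FILE.FCLOSE(l_file);
    END;
    /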
Allstate Insurance Co, IL Duration: May 2005 – July 2009
Title: Database Architect/Developer (PL/SQL Programming).
Environment: Oracle 10g RAC, PL/SQL (stored procedures, functions, packages, cursors, triggers, etc.), SQL*Plus, SQL*Net, SQL*Loader, TOAD, Erwin, Import/Export, UNIX scripting (Korn shell, Bourne shell, SED, etc.), Quality Center 8.0 and Visual SourceSafe 6.0.
Team Size: 18
Responsibilities:
Populated the NG Core database with 2 million claims (2 terabytes) for the TPT R3.0 file migration test using the Oracle data clone package (procedures and functions).
Extensively used the Erwin data modeling tool to understand the application flow and table relationships, and developed the above-mentioned data clone package.
Created obfuscation scripts/procedures on the TPT schemas; these schemas and their data were created from the production database via RMAN copy.
Periodically compared the TPT schemas against production schemas and requested any missing database objects.
Created the Data Wave requests for the TPT schemas.
Extensively used cron and at commands to automate and schedule batch processes.
Wrote UNIX shell scripts to extract metadata from flat files into a specific format and load it into the Oracle database using SQL*Loader.
Involved in performance tuning (SQL, PL/SQL) using Oracle Explain Plan, TKPROF, snapshots, etc.
Worked with the fix team on data defect fixes.
Added new users to the org database and granted new security org roles for various applications such as Next Gen, ERL, STATSDB, Product Catalog, DPS, OCA, MRM and SERPOWER 7.1.
Created database triggers, procedures and functions using PL/SQL.
Reviewed AWR reports and added indexes and Oracle hints for better performance.
Developed UNIX shell scripts to automate repetitive database processes.
Created ECR (Environment Change Request) forms and attached the data fix SQL files for production data fixes; analyzed and escalated reports to higher management.
Provided production support (monitoring, error handling and rerunning failed jobs).
Participated in data modeling and data conversion from Release 1.0 to Release 2.0.