Oracle Developer SQL

Location:
Pompton Lakes, NJ
Salary:
$150K
Posted:
February 18, 2022


OLEG KHAYKIN

** **** ***** *******, *********, NJ 07457

cell: 201-***-****

email: adp9n8@r.postjobfree.com

Summary

IT engineer with 26 years of experience, mostly in the Capital Markets business area, as well as in Retail, Medical Insurance and Telecommunications.

Software developer, data architect, database designer, Oracle DBA.

Skills

Databases

Oracle 7.0 – 19c, including 200 TB databases; familiar with Exadata features: Smart Scans, Materialized Zone Maps, Hybrid Columnar Compression.

MS SQL Server 2008;

PostgreSQL, Netezza;

Programming languages

SQL, PL/SQL - expert;

Java/JDBC/SWT;

C++, Oracle Pro*C;

UNIX shells: Korn, BASH;

Design and Development tools

Oracle Data Modeler, Oracle Designer, ERWin;

TOAD, Oracle SQL Developer, SQL Server Management Studio, Aginity Workbench for Netezza;

Eclipse, IntelliJ;

Oracle Forms, Oracle Reports;

Big Data/Hadoop

Hortonworks HDP 3.1, HDFS, HIVE, Oracle Big Data connectors.

Operating Systems

Linux: Red Hat, Oracle Linux 7;

UNIX: Solaris, AIX, HP-UX;

Windows;

Education and training

1.M.S. Physics, Moscow Institute of Physics and Technology, Moscow, Russia;

2.Oracle DBA Certification program;

3.Oracle 9i Developer courses (Forms and Reports);

4.Oracle Data Integrator (ODI);

Experience

Oradell IT Consulting Corp., President April 2006 - present

Most notable Clients and Projects

CVS/Aetna, Jun 2019 – present. Lead Data Platform Engineer

Project:

Support and enhancement of CVS Active Health Management (AHM) information systems

Technology stack:

Linux, Oracle 19c, Google Cloud Platform (GCP), Cloud SQL (PostgreSQL on GCP), Java, Git, Jenkins

Responsibilities:

SQL, PL/SQL, PL/pgSQL, Java coding, Unix shell scripting, SQL performance tuning, data modeling using Oracle Data Modeler.

Achievements:

As the local (US) technical lead of an offshore (India) development team, I successfully reengineered the CVS AHM Product Setup Automation (PSA) sub-system: converted ~20,000 lines of poorly written legacy PL/SQL code into ~60 views (pure SQL!) plus a simple PL/SQL package with only ~600 lines of code.

We used the generic ETL solution, DRF, that I had developed previously.

As a result, performance increased dramatically (x1000!), and system support and further enhancement of the new, well-structured code became significantly easier.

I completed a POC demonstrating retrieval of Patients’ data from multiple Cloud SQL database tables using a single call to a stored procedure that returned the combined data in JSON format, which performs better than selecting data from each table separately.
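A minimal sketch of that POC approach, assuming hypothetical patients, conditions and programs tables (the actual AHM schema is not reproduced here): a single function call aggregates rows from several tables into one JSON document, so the caller makes one round trip instead of several.

-- Hypothetical Cloud SQL (PostgreSQL) sketch; all table and column names are illustrative.
CREATE OR REPLACE FUNCTION get_patient_json(p_patient_id bigint)
RETURNS json
LANGUAGE sql
AS $$
  SELECT json_build_object(
    'patient',    (SELECT row_to_json(p) FROM patients   p  WHERE p.patient_id  = p_patient_id),
    'conditions', (SELECT json_agg(c)    FROM conditions c  WHERE c.patient_id  = p_patient_id),
    'programs',   (SELECT json_agg(pr)   FROM programs   pr WHERE pr.patient_id = p_patient_id)
  );
$$;

-- One call instead of three separate SELECTs:
-- SELECT get_patient_json(12345);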

Bank of America. Senior Oracle Developer, Oracle DBA

Project:

3 different projects: in 2006, 2011-2013 and 2015-2016;

Technology Stack:

Linux, Oracle 9i – 11g (including Oracle Exadata), MS SQL Server, Netezza, HDFS, Oracle GoldenGate;

Achievements:

In 2006, while working for Merrill Lynch (now BofA) on the “Direct Market Access” (DMA) project, I successfully resolved the issue of inadequate system throughput (see the Project details section at the end).

In 2011-2013, while working on the “Product Data Master” (PDM) project, I developed a generic solution for performing complex multi-threaded ETL scenarios within an Oracle RDBMS, the Data Refresh Framework (DRF), which we used very successfully to populate the PDM data repository. See the Project details section.

In 2015-2016, while working on the “Risk Data Reporting” (RDR) project, I implemented a semi-automated process that compressed and purged old data from the database holding historical “stress-test” modeling results for BofA portfolio positions in the Fixed Income, Currencies and Commodities sectors.

My solution consisted of a PL/SQL package and a configuration table that specified the data retention policy for all the tables in the RDR database. It was a “hand-made” implementation of what Oracle later offered as Information Lifecycle Management (ILM).
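A possible shape of such a configuration table, sketched with hypothetical names (the actual RDR tables are not reproduced here):

-- Hypothetical retention-policy configuration table; names and columns are assumptions.
CREATE TABLE data_retention_policy (
  table_name       VARCHAR2(30) NOT NULL,   -- RDR table the policy applies to
  date_column      VARCHAR2(30) NOT NULL,   -- column used to determine the age of the data
  compress_months  NUMBER,                  -- data older than this is compressed
  purge_months     NUMBER,                  -- data older than this is purged
  CONSTRAINT data_retention_policy_pk PRIMARY KEY (table_name)
);

-- The PL/SQL package would iterate over this table and, for each entry,
-- compress or purge the data that falls outside the configured retention window.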

I was able to reduce the amount of used disk space from the original 50TB down to 20TB.

Polo Ralph Lauren, NJ. Jun 2009 – May 2010. Senior Oracle Developer, Oracle DBA

Project:

Integration of Polo ERP systems using IBM WebSphere MQ and Oracle Streams AQ;

Technology Stack:

AIX, Oracle 10g, Oracle Messaging Gateway (MGW), IBM WebSphere MQ

Responsibilities:

1.Oracle DB administration and configuration, including Streams AQ and Oracle MGW;

2.SQL and PL/SQL development and performance tuning;

3.Shell scripting;

Achievements:

The project was in danger of missing its deadline because of a lack of developers with coding skills in the IBM MQ environment.

My suggestion to use Oracle Messaging Gateway to convert MQ messages into Oracle AQ messages and do all message processing inside the Oracle RDBMS using PL/SQL was a “life saver”.

We were able to put this integration solution into PROD even before the deadline.
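A minimal sketch of the dequeue side of this approach, assuming a hypothetical AQ queue named INBOUND_MQ_QUEUE fed by the Messaging Gateway (the actual queue names and payload handling were different):

-- Hypothetical PL/SQL consumer of MQ messages converted to AQ messages by the gateway.
DECLARE
  l_dequeue_options    DBMS_AQ.dequeue_options_t;
  l_message_properties DBMS_AQ.message_properties_t;
  l_msgid              RAW(16);
  l_payload            SYS.AQ$_JMS_TEXT_MESSAGE;   -- gateway maps MQ text messages to a JMS payload type
BEGIN
  DBMS_AQ.DEQUEUE(
    queue_name         => 'INBOUND_MQ_QUEUE',
    dequeue_options    => l_dequeue_options,
    message_properties => l_message_properties,
    payload            => l_payload,
    msgid              => l_msgid);

  -- all business processing of the converted message happens here, in PL/SQL

  COMMIT;
END;
/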

Other Clients of Oradell IT Consulting Corp.:

Deutsche Bank Feb 2018 – Jun 2019;

New York City Health and Hospitals Corporation: Feb 2017 – Feb 2018;

BNP Paribas: Jun 2016 – Nov 2016;

Citi Bank: Oct 2014 – Apr 2015;

PIMCO: Feb 2013 – Oct 2014;

Citco Technology Management: May 2010 – Nov 2011;

Credit Suisse: Apr 2006 – May 2007;

J. Crew, May 2007 – Jun 2009. Oracle DBA, Lead Developer

Project: J. Crew eCommerce and CRM systems.

Roles: PL/SQL developer, ODI developer, DB designer, Oracle DBA;

Responsibilities:

1.Support of J. Crew eCommerce system (www.jcrew.com);

2.SQL and PL/SQL development and performance tuning (OLTP and ETL);

3.Oracle Data Integrator: installation, configuration and development;

4.ETL process management using Control-M;

5.Logical and physical DB design of J. Crew eCommerce data warehouse;

6.Oracle DB administration: 10g and 11g; backup, recovery (RMAN)

Environment: Solaris 10, Oracle 10g and 11g;

Liz Claiborne, Dec 2003 – Dec 2004. Oracle Developer

Project:

Retail Support System

Technology stack:

HP-UX, Oracle 9i, Oracle 9iAS, Oracle Forms, Oracle Reports

Responsibilities:

PL/SQL programming: stored procedures, Oracle Forms/Reports;

ETL development: Pro*C, Korn shell

Datatec Systems, Inc., Jun 2000 – Aug 2003. Software Developer

Project:

Datatec ERP, Workforce automation and CRM systems

Technology Stack:

Windows 2000, Oracle 8i Standard Edition, Cognos PowerPlay, AS/400, DB2.

Responsibilities:

1.Oracle RDBMS installation, upgrade, configuration, tuning and backup.

2.COGNOS administration.

3.Software development: PL/SQL, Java, Object Pascal (Delphi);

Project details:

Year: 2006. Client: Merrill Lynch. Project: Direct Market Access (DMA)

Merrill Lynch provided clearing services for Clients trading on the NYSE. The Clients and the Exchange used a TIBCO message queue for communication: 1) Clients placing Orders and getting confirmations, 2) Trades being commenced and then partially or fully executed, 3) Orders being fulfilled or terminated, etc. Information about all these business events flowed back and forth as TIBCO messages.

There were multiple Java “listeners” “sitting around” the message queue, reading (but not consuming!) these messages and inserting the corresponding data into a single staging table in an Oracle database.

An ON-INSERT trigger on the staging table called a PL/SQL stored procedure that analyzed the inserted information and recorded it properly in multiple “main” tables: Orders, Trades, etc.

The procedure logic was quite complex because the order of incoming data entries might not correspond to the order of business events. For example, a confirmation of receiving a new Order could arrive before the information about that Order’s placement. This was the result of the Java “listeners” not coordinating their work with each other and working at different speeds. Such inconsistencies were usually resolved within a few seconds, with the arrival of the delayed information.

The task was to ensure that up to 1000 events could be processed in one second.

I streamlined the procedure logic and, most importantly, removed multiple COMMITs from it. Multiple COMMITs caused excessive redo log writes, which was the main reason for the inadequate throughput. I left only one COMMIT at the very end but made sure that various potential error conditions were properly handled: I created several transaction SAVEPOINTs and rolled back to the proper SAVEPOINT when necessary.
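A minimal sketch of this pattern, with hypothetical procedure, table and savepoint names (not the actual DMA code): each step gets its own SAVEPOINT, a recoverable error rolls back only that step, and the single COMMIT at the end keeps redo log writes to a minimum.

CREATE OR REPLACE PROCEDURE process_event(p_event_id IN NUMBER) IS
BEGIN
  SAVEPOINT before_order;
  BEGIN
    -- record the Order-related part of the event
    INSERT INTO orders (order_id, status) VALUES (p_event_id, 'NEW');
  EXCEPTION
    WHEN OTHERS THEN
      ROLLBACK TO SAVEPOINT before_order;  -- undo only this step; in real code, also log the error
  END;

  SAVEPOINT before_trade;
  BEGIN
    -- record the Trade-related part of the event
    UPDATE trades SET status = 'FILLED' WHERE order_id = p_event_id;
  EXCEPTION
    WHEN OTHERS THEN
      ROLLBACK TO SAVEPOINT before_trade;
  END;

  COMMIT;  -- the only COMMIT, at the very end
END process_event;
/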

I tested my procedure on the latest (back then) Oracle RDBMS version – 10g, which I installed, configured and tuned.

The result was good: the system could easily process 1000 events per second.

For testing, I developed a stored procedure that could copy consistent sets of data from one database (PROD) to another (QA or DEV). Given a table name and a WHERE condition, the procedure would copy the corresponding data from the specified table, as well as data from other tables linked to it by FKs, in both “up” and “down” directions. The procedure “followed” those FKs not just to the “nearest neighbors” but kept going along such “FK chains” to the very end. For example, you could start from a particular Trade and end up copying the corresponding Order and Client info. Or you could start with the Order and follow it up to its Client and down to its Trades.
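An illustrative data-dictionary query of the kind such a procedure relies on: it lists the “down” (child) FK neighbors of a table, here a hypothetical ORDERS table; the real procedure walks these links recursively in both directions.

-- Hypothetical example: child tables referencing ORDERS via foreign keys.
SELECT c.owner,
       c.table_name      AS child_table,
       c.constraint_name AS fk_name
FROM   all_constraints c
JOIN   all_constraints p
       ON  p.owner           = c.r_owner
       AND p.constraint_name = c.r_constraint_name
WHERE  c.constraint_type = 'R'
AND    p.table_name      = 'ORDERS';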

Years: 2011-2013. Client: Bank of America. Project: Product Data Master (PDM)

“Product Data Master” (PDM) is a data repository that holds the complete “life history” of existing Financial Products, with information received from Bloomberg, Reuters, etc., and up to 1 billion individual changes arriving each day.

PDM implements “bi-temporal” data storage, with every data element having 2 “live” time periods (illustrated in the sketch after this list):

1.A business one – for instance, this Option was issued on this date and will expire on that date;

2.A technological one – this data was entered at this date/time and then was replaced at that date/time.
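A hypothetical illustration of such a bi-temporal row layout (table and column names are assumptions, not the actual PDM schema):

-- Each row carries two periods: a business validity period and a technological (recorded) period.
CREATE TABLE product_attribute_history (
  product_id       NUMBER         NOT NULL,
  attribute_name   VARCHAR2(100)  NOT NULL,
  attribute_value  VARCHAR2(4000),
  valid_from       DATE,        -- business period: when the fact becomes effective
  valid_to         DATE,        -- business period: when it expires
  recorded_from    TIMESTAMP,   -- technological period: when this version was entered
  recorded_to      TIMESTAMP    -- technological period: when it was superseded
);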

Populating PDM is a complex multi-step ETL process.

I developed a generic “Data Refresh Framework” (DRF) solution, consisting of 2 PL/SQL packages and 2 configuration tables, to perform ETL within an Oracle RDBMS in the manner of a Directed Acyclic Graph (DAG), with every ETL step (a DAG edge) implemented by a single DML command (INSERT, UPDATE, MERGE or DELETE). DRF generates these commands automatically, based on the provided configuration and on a column-name comparison of the source (view, table or SQL query) and the target (always a table). Multiple steps can be executed simultaneously, in multiple sessions started via DBMS_SCHEDULER.
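A rough sketch of the idea, with hypothetical configuration and object names (the real DRF tables and packages differ): a registered step names its source, target and DML type, and DRF builds the command by matching column names.

-- Hypothetical DRF-style step registration.
INSERT INTO etl_steps (step_name, source_name, target_table, dml_type, depends_on)
VALUES ('LOAD_PRODUCTS', 'V_STG_PRODUCTS', 'PRODUCTS', 'MERGE', 'LOAD_ISSUERS');

-- The kind of command DRF would generate by matching column names
-- between V_STG_PRODUCTS (source view) and PRODUCTS (target table):
MERGE INTO products t
USING v_stg_products s
ON (t.product_id = s.product_id)
WHEN MATCHED THEN
  UPDATE SET t.product_name = s.product_name
WHEN NOT MATCHED THEN
  INSERT (product_id, product_name)
  VALUES (s.product_id, s.product_name);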

A developer in the DRF environment only needs to write plain SQL queries implementing the required business logic, usually in the form of views, and register them in the DRF configuration tables. Please see the complete DRF description and the source code here.

We successfully used DRF for populating the PDM repository. The performance was great, and the visibility of what was happening in the database was superb, thanks to the embedded logging mechanism (see here).

Later, I successfully used DRF at CVS/Aetna for the PSA project.


