Senior PL/SQL Developer with 15+ years DB & ETL expertise

Location:
Reston, VA
Posted:
February 04, 2026

Contact this candidate

Resume:

MITAL K. JUTHANI

*****.*******@*****.***

717-***-****

SUMMARY

Database Engineer with 15+ years of experience across Systems Design, Data Engineering, Performance Tuning, and Enterprise Analytics, leveraging Oracle, MS SQL Server, PostgreSQL, MongoDB, and cloud-based platforms like AWS and Azure.

Proven ability to design and implement hybrid data solutions using relational (SQL), NoSQL (MongoDB), and big data platforms (Databricks with Delta Lake and PySpark).

Hands-on experience in supporting AI/ML initiatives through data preparation, orchestration (AWS Lambda, S3), and integration of model outputs with operational databases.

Skilled in building robust ETL pipelines, optimizing large-scale queries, and delivering business intelligence.

Experienced in all phases of the software development life cycle (SDLC), with a specific focus on the design, development, build and release of quality software.

Significant experience & demonstrated proficiency in developing database scripts and solutions using Oracle PL/SQL, MS SQL Server T-SQL, PostgreSQL PL/pgSQL.

Demonstrated experience with Database architecture, administration, and performance tuning in a web-based application environment.

Experienced in automating, configuring and deploying instances on AWS, Azure environments and Data centers.

Understanding of ETL processes to support data conversion, migration and consolidation, with hands-on experience in Informatica 10 and DataStage 8.5.

Expertise in Enterprise reporting technologies like Tableau, PowerBI, Actuate, BIRT, Cognos and BusinessObjects.

Extensive project experience utilizing waterfall, iterative, Agile Scrum and Kanban methodologies along with most recent Continuous Integration (CI) and Continuous Deployment (CD) practices.

US Citizen with active Public Trust Security Clearance and experience supporting mission-critical DHS programs.

QUALIFICATIONS - CERTIFICATIONS

BE, Bhavnagar University, Gujarat, India.

IBM Certified Cognos 8 BI Reports – Certified Designer

Oracle 11g Certified Database SQL Expert

Oracle 11g Certified PL/SQL Developer Associate

AWS Certified AI Practitioner

AWS Certified Data Engineer - Associate

TECHNICAL SKILLS

Database: Oracle 12c/11g2/10g2/9i, MS SQL Server 2012/2008, PostgreSQL 9.4, Teradata Warehouse 8.1; hands-on experience with enterprise NoSQL: Cassandra and MongoDB

ETL Tools: Informatica PowerCenter 10, DataStage 8.5

Development Tools: Oracle PL/SQL, SQL*Plus, SQL Developer, Toad, MS SQL Server 2012/2008/2005.

Reporting Tools: PowerBI, Tableau, MS SQL Server Reporting Services (SSRS), BIRT, Cognos, Actuate.

Cloud Platform: AWS, Azure, Databricks (Delta Lake, MLflow, PySpark)

AWS Services: EC2, Elastic Beanstalk, EFS, VPC, RDS, S3, Glacier, IAM, Kinesis, CloudFront, CloudWatch, CloudTrail, CloudFormation, DynamoDB, Lambda, Route 53, SNS, SQS, API Gateway, CodePipeline, CodeBuild, Elasticsearch, CodeDeploy, etc.

Source Code Management: GIT, GitHub, MS Team Foundation Server (TFS), SVN, Rational ClearCase, Visual SourceSafe.

MAJOR ASSIGNMENTS

Oct 2019 – Current Sr. Database Developer

Customs and Border Protection, Department of Homeland Security

Ashburn, VA

Business Intelligence Support Services (BISS) is an enterprise analytics service supporting TASPD and the Office of Field Operations (OFO), which assists Customs and Border Protection (CBP) officers and border enforcement personnel to effectively and efficiently:

Identify cargo shipments, individuals, and conveyances that may present additional risk to the United States

Conduct terrorism analysis and global assessments that convey changes in terrorism threats and identify emerging threats

Develop and evaluate CBP-wide intelligence-based targeting rules and intelligence driven special operations, and

Coordinate and enhance analysis and targeting efforts

CBP BISS provides the following data analytic services:

Structured Reporting: provides recurring reports, dashboards, applications and on-demand queries to answer tactical and strategic questions.

Intelligence Analytics: supports targeting and special operations, including SME-engineered rules, targeting reports and User Defined Rules.

AI/ML models: target high-risk shipments across various threats using advanced custom AI/ML models integrated into the Automated Targeting System.

Responsibilities

Developed and maintained Oracle PL/SQL packages, procedures, and scheduled jobs supporting structured reporting, rules engines, and threat detection modules.

Implemented MongoDB to store schema-less data for risk models, user-defined targeting rules, and threat indicator logs, enabling rapid iteration and dynamic configuration.

Designed aggregation pipelines in MongoDB to preprocess unstructured JSON from API logs before downstream transformation and merging with Oracle-based data marts.

Utilized Databricks (PySpark) to cleanse and transform raw data from AWS S3 into Delta Lake tables for scalable ingestion by ML pipelines and BI dashboards.

Collaborated with data scientists to train, track, and deploy ML models using Databricks MLflow; model outputs written to Oracle and MongoDB.

Designed and implemented data validation layers in Oracle to ensure consistency between data lakes (Delta Lake), NoSQL (MongoDB), and relational data stores.

Developed new MongoDB collections to support flexible storage of schema-less targeting rules.

Enabled integration of AI/ML outputs into MongoDB, allowing for dynamic querying and downstream processing.

Used MongoDB Compass for performance diagnostics and aggregation optimization.

Supported ETL processes using PySpark, orchestrated with AWS Lambda and Databricks notebooks; created checkpoints and audit logs in MongoDB for traceability.

Mentored junior developers on hybrid data platform development practices.

Participated in Agile Scrum processes; delivered complex data solutions within 2-week sprint cycles; actively engaged in sprint planning, release planning and backlog grooming.

Environment: Oracle 19c, MongoDB 6.0, AWS Lambda, S3, Python, SQL Developer, JIRA, Bamboo, UNIX, GIT 2.7, Confluence
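The cleansing and audit-logging work described in the bullets above can be sketched, in greatly simplified pure-Python form, as a record cleanser. The field names and rules here are hypothetical illustrations, not the actual BISS pipeline logic (which ran on PySpark and MongoDB):

```python
from datetime import datetime, timezone

def cleanse_record(raw: dict) -> dict:
    """Normalize one raw JSON log record before downstream loading (sketch)."""
    cleaned = {}
    for key, value in raw.items():
        norm_key = key.strip().lower().replace(" ", "_")  # normalize field names
        if value in (None, "", "N/A"):
            continue  # drop empty/placeholder values
        cleaned[norm_key] = value
    # stamp audit metadata for traceability, as in the checkpoint/audit bullets
    cleaned["_ingested_at"] = datetime.now(timezone.utc).isoformat()
    return cleaned

records = [{"Shipment ID": "S-1001", "Risk Score": 0.87, "Carrier": "N/A"}]
cleaned = [cleanse_record(r) for r in records]
```

In a real pipeline the same normalization would run per-partition in PySpark before writing to a Delta Lake table.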

Aug 2016 – Aug 2019 Sr. Database Developer

Customs and Border Protection, Department of Homeland Security

Alexandria, VA

The CBP ADIS application is designed to match arrival and departure records so that the Office of the Attorney General can calculate, for each VWP country and each fiscal year, the portion of nationals of that country who arrive in the United States but for whom no record of departure exists, as well as those for whom there are records of departure but who have stayed in the U.S. beyond the Admit Until Date.

The ADIS system maintains and matches travel history with changes in immigration status to provide overstay status. ADIS provides a “slice and dice” ad hoc query capability that allows for a combination of 25 key variables for in-scope travelers and their events. It also provides a variety of aggregate data reports for different stakeholders, including the US Congress, resulting from overstay statistics vetting. Data validation is conducted by CBP to improve, maintain, and correct stored information. The CBP analyst team uses the ADIS Data Integrity Identity Validation (DIIV) Tool to validate out-of-country overstay records.

Responsibilities

• Developed and administered stored procedures, triggers, views, functions & packages.

• Administered development databases and helped troubleshoot and optimize SQL code.

• Participated in meetings and discussions with functional/technical leads to understand requirements and work out a design for a particular user story, bug or TPR.

• Prepared Database Requirements and User story Design documents.

• Wrote unit tests (documentation and code) for database stored program units/packages.

• Debugged and fine-tuned the database routines for performance enhancement.

• Configured various performance metrics using AWS CloudWatch & CloudTrail.

• Wrote various Lambda services for automating the functionality on the Cloud.

• Maintained user accounts (IAM), RDS, Route 53, VPC, DynamoDB, SES, SQS and SNS services in AWS cloud.

• Configured Cross-Account deployments using AWS Code Pipeline, Code Build and Code Deploy by creating Cross-Account Policies & Roles on IAM.

• Created AMIs for mission critical production servers for backup.

• Collaborated with peers; mentored junior team members to enhance skills and abilities; provided leadership as required.

Environment: Oracle 12c/11g, Toad, AWS, Java, JIRA, Bamboo, UNIX, GIT 2.7, Confluence, Xceedium, Nagios, CloudWatch
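The automation Lambdas mentioned above typically follow a simple handler shape. This is a sketch only: the event fields are hypothetical, and a real handler would call boto3 to act on the named resource (e.g. snapshot an RDS instance) rather than just echo the request:

```python
import json

def lambda_handler(event, context):
    """Sketch of a small AWS Lambda automation handler (illustrative)."""
    target = event.get("target")
    if not target:
        # reject malformed invocations early
        return {"statusCode": 400, "body": json.dumps({"error": "missing target"})}
    action = event.get("action", "noop")
    # a real implementation would invoke boto3 here to perform the action
    return {"statusCode": 200, "body": json.dumps({"action": action, "target": target})}

resp = lambda_handler({"action": "snapshot", "target": "adis-dev-db"}, None)
```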

Nov 2012 – Aug 2016 Sr. Database Developer

USCIS Verification Division, Department of Homeland Security

Washington, DC

Worked with the USCIS Verification Division developing, enhancing and operating applications that support Verification Information Services (VIS) through E-Verify and SAVE (Systematic Alien Verification for Entitlements) capabilities. VIS used SSA and DHS databases to conduct automated employment eligibility status verification for government agencies and private employers. These programs provided mission-critical services necessary to ensure the integrity of formal policies reflecting immigration reform initiatives. Legislative initiatives launched in 2006 required CIS to ensure that VIS could be scaled to handle peak loads of 4.6 million employment eligibility transactions in a single month.

The VIS E-Verify system is a key data source for the verification of immigrant status in regard to securing driver’s licenses, Social Security cards and benefits eligibility. It is an online system that allows businesses to verify the U.S. work eligibility of new employees. The SAVE Program is a service that helps federal, state and local benefit-issuing agencies, institutions, and licensing agencies determine the immigration status of benefit applicants, so that only those entitled to public benefits and licenses receive them.

Responsibilities

• Developed and deployed complex database functions, packages and structures to assist the .Net developers to develop, maintain and enhance the E-Verify and SAVE applications.

• Developed database procedures to support application development around business requirements.

• Debugged and fine-tuned the database routines for performance enhancement.

• Deployed database structures to Stage, Prod and DR in conjunction with USCIS deployment team.

• Configured AWS Multi-Factor Authentication in IAM to implement two-step authentication of user access using Google Authenticator and AWS Virtual MFA; also managed AWS IAM to grant users permissions and resources and to manage user roles and permissions.

• Built S3 buckets, managed bucket policies, and used S3 and Glacier for storage and backup on AWS.

• Initiated alarms in the CloudWatch service for monitoring server performance, CPU utilization, disk usage, etc., to take recommended actions for better performance.

• Part of a team that built a prototype Azure application accessing third-party data services via web services. The solution dynamically scales, automatically adding/removing cloud-based compute, storage and network resources based upon changing workloads.

Environment: Oracle 11g/10g, MS SQL Server 2012, AWS, MS Azure, .Net, MS Team Foundation Server (TFS), MS Visio, UNIX, Xceedium, GitHub

July 2010 – Nov 2012 BI Team Lead

The College Board, Reston, VA

The College Board is a mission-driven not-for-profit organization that connects students to college success and opportunity. Founded in 1900, the College Board was created to expand access to higher education. Today, the membership association is made up of more than 5,900 of the world’s leading educational institutions and is dedicated to promoting excellence and equity in education. Each year, the College Board helps more than seven million students prepare for a successful transition to college through programs and services in college readiness and college success — including the SAT® and the Advanced Placement Program®. The organization also serves the education community through research and advocacy on behalf of students, educators and schools.

Responsibilities

• Developed the complex SOAS (Summary Of Answers and Skills) Actuate report, which was critical for the client. It was used to track, at the school level, the performance of students who had taken SAT/PSAT/NMSQT exams in pursuit of higher education at the college level.

• Developed complex Executive Summary report, which pertains to SAT and PSAT exams.

• Implemented complex business-process loading using Oracle stored packages, procedures & functions.

• Extracted data from Oracle and flat files and loaded it into the data warehouse.

• Interpreted the requirements and built Logical and Physical Relational Data Models.

• Prepared the PL/SQL Requirements and Design documents.

• Wrote and maintained SQL and PL/SQL stored program units/packages and maintained coding standards.

• Ensured product quality and timeliness of work, provided advice and guidance, resolved problems to meet objectives and provided periodic performance reports.

• Decomposed Release backlog items into the software tasks based on the desired software architecture necessary to create the desired functionality.

Environment: Oracle 11g2, MS SQL Server 2008, Actuate 11/8SP1 Fix10, Actuate eSpreadSheet, Linux, Rational ClearCase, Rational ClearQuest

July 2008 – June 2010 (Full Time) Team Lead

Basel II Project

State Street Corporation, Boston, MA

In 1988, the Basel Committee on Banking Supervision, comprised of representatives from central banks and regulatory authorities from many countries, introduced capital adequacy requirements for the international banking industry. The 1988 Basel Capital Accord, commonly referred to as “Basel I” was designed in response to concerns about the potential for major financial losses, which could threaten the solvency of major banks and impact the global financial system.

A new regulatory capital framework, known as Basel II, was finalized and implemented in stages beginning in January 2007. In the U.S., regulators proposed that only the country’s largest, internationally active institutions, defined as “core” banks, be required to comply with Basel II. State Street was notified to adopt the most advanced and complex approaches for assessing risk and evaluating capital adequacy. Non-U.S. regulatory guidelines were evaluated in every country where State Street operated, and compliance programs consistent with country-specific regulatory requirements were developed.

Responsibilities

• Involved in Oracle PL/SQL coding of stored procedures, functions and triggers.

• Core member in Unit testing and integrated testing and in the preparation of Test Plan Test Cases.

• Used Informatica Designer to create complex mappings using different transformations like filter, lookups, stored procedure, joiner, update strategy, expressions and aggregator transformations to move data to a Data Warehouse.

• Scheduled the Sessions and Mapplets using Server Manager to load the data into Target Oracle database.

• Used Source Analyzer and Warehouse designer to import the source and target database schemas, and the Mapping Designer to map the sources to the target.

• Developed simple and complex Actuate reports across all the streams of project.

• Developed complex BIRT reports for the Basel II Capital Allocation System component and Limit Management System (a product customized on Fermat tool).

• Wrote the ClearCase scripts for the automated build process for Actuate reports.

• Used Information Object Designer to develop information objects in order to populate report parameters dynamically from the database.

Environment: Oracle 10g2/9i2, Oracle PL/SQL, Informatica Powercenter 7.1, Actuate 9SP3/8SP1, Actuate eSpreadSheet, Actuate BIRT 2.2/2.3, Browser Scripting Control, SQL Navigator, Linux, Rational ClearCase, Rational ClearQuest.
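The filter and aggregator transformations mentioned above have a simple logical shape that can be sketched in plain Python. The rows and column names below are made up for illustration; the real work used Informatica Designer transformations against a Basel II data warehouse:

```python
from collections import defaultdict

# hypothetical staging rows (desk, exposure, status are illustrative columns)
rows = [
    {"desk": "FX", "exposure": 120.0, "status": "ACTIVE"},
    {"desk": "FX", "exposure": 80.0, "status": "CLOSED"},
    {"desk": "RATES", "exposure": 200.0, "status": "ACTIVE"},
]

# filter transformation: keep only active positions
active = [r for r in rows if r["status"] == "ACTIVE"]

# aggregator transformation: total exposure per desk
totals = defaultdict(float)
for r in active:
    totals[r["desk"]] += r["exposure"]
```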

May 2007 – July 2008 Team Lead

FDW Project (Integrated Financial Services)

State Street Corporation, Boston, MA

FDW known as Financial Data Warehouse is State Street’s reporting infrastructure at company level for consolidated reporting as well as regional financial reporting. It acts as a consistent integration point for financial data and facilitates standard reporting and analytical capabilities and hence is the official book of records for State Street Corporation. The Financial Data Warehouse is a subject-oriented, integrated collection of financial data to support decision making by management. It holds the financial events and summary positions from FCS and FinApps with the daily/monthly balances. FDW provides daily reporting needs for the general ledger system.

FDW Actuate Reports is a web-based reporting application that produces fixed-format financial reports, which can be printed or exported in a variety of formats. Actuate reports provide access to Balance Sheet, Income Statement, Accounts Receivable, Accounts Payable, Daily Account Activity, and Consolidated PnL data from FDW.

It is a reporting and decision support system with data extracted from major source systems (Oracle Applications, FCS and OFSA) and organized in a structure that is more intuitive for querying and reporting.

The objective of FDW Reports is to represent a financial reporting infrastructure that centralizes access to information, data, applications, reporting and analytical systems for the required financial content.

Responsibilities

Fine-tuned the existing complex SQL and PL/SQL packages to resolve critical production issues and bugs in FDW.

Minimized execution time for complex Actuate reports in production by tuning the SQL queries, stored procedures, functions and packages behind them.

Wrote stored procedures and packages, pertaining to the report development.

Fine-tuned complex SQL queries for the Certification Project, another stand-alone application under the IFS project.

Implemented Oracle's Virtual Private Database (VPD) security feature in the Actuate reports to enforce the security policy protecting sensitive financial data.

Developed complex Actuate reports with the extensive data warehousing functionality and requirements in view.

Modified and created reports for the IFIN process (IBT bank takeover by State Street).

Upgraded legacy Actuate reports to new reports with added features & functionality and minimized report execution time.

Wrote a white paper on row-level security using VPD in Actuate reports.

Environment: Oracle 10g2/9i2, Oracle PL/SQL, Linux, Actuate 9SP3/8SP2, Discoverer 9.0.4, Embarcadero Rapid SQL 7.3.0.

February 2006 – May 2007 Lead Developer

GSA Project (Global Services Application)

State Street Corporation, Westwood, MA

The Application Processing & Solutions Unit (APSU) in State Street maintains and supports around 38 critical applications on various platforms. Many of these applications were on versions not supported by the respective vendors and were not compliant with the security requirements standardized by the State Street Corporation.

Upon review of the applications, it was determined to re-platform or rewrite the applications from FoxPro and Visual Basic to J2EE (Spring Framework) and migrate the databases from SQL Server and MS Access to Oracle 10g2.

Responsibilities:

Created, deployed, maintained and supported database objects, including their deployment and compilation.

Wrote stored procedures, packages, functions, triggers and tuned the same for better performance on Oracle Database.

Wrote automated Unix/Linux database scripts to execute DML and DDL statements against database environments.

Wrote SQL*Loader scripts to load data from Microsoft SQL Server, exported as text files, into Oracle database tables.

Designed the architecture of actuate reports in terms of user requirements, business standards and functionality according to State Street Corporation standards.

Designed and developed simple, intermediate and complex Actuate reports for eight separate applications.

Implemented the password matrix program for Actuate reports, as per State Street Security policy, to retrieve the database user name and password dynamically based on the user inputs.

Wrote the XML scripts, shell scripts for uploading the reports on Actuate Management Consoles.

Environment: Oracle 10g2, MS SQL Server 2000, Actuate 8, JSP, Spring 1.2, Linux, WebSphere Application Server 6.0
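The text-file-to-table step that SQL*Loader handled above can be sketched in Python. The pipe-delimited export format and column names here are hypothetical; the actual loads used SQL*Loader control files against Oracle tables:

```python
import csv
import io

# hypothetical pipe-delimited SQL Server export
export = "ID|NAME|BALANCE\n1|ACME|1050.25\n2|GLOBEX|99.00\n"

# parse the delimited text into typed rows ready for insertion
reader = csv.DictReader(io.StringIO(export), delimiter="|")
rows = [
    {"id": int(r["ID"]), "name": r["NAME"], "balance": float(r["BALANCE"])}
    for r in reader
]
```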

Oct 2005- February 2006 BI Developer

FedEx Services, Memphis, Tennessee

FedEx produces superior financial returns through high-value-added supply chain, transportation, business and related information services delivered by focused operating companies. The EIGS International Sales Performance Dashboard is part of the Express International Growth strategy; it provides high-level summary dashboard (reporting) information to the sales groups, who are part of higher-level management.

This Dashboard improves executive decision-making and provides a channel of communication in all regions. The Dashboard intends to help trigger the desired sales behavior among sales professionals by providing timely information.

Responsibilities:

Involved in the design and development of the International Dashboard.

Developed a clear understanding of FedEx business processes.

Responsible for application DBA activities and tuning of SQL queries using Explain Plan, SQL Trace and Statspack, and for setting up standby databases.

Developed reports for modules like Express and Freights, Revenue Performance, Sales Performance and Sales Productivity.

Wrote stored procedures and functions, analyzed and tuned the same for better performance.

Developed new processes and refined existing ones to enhance quality and productivity.

Environment: Oracle 9i, Teradata 7.0, Windows 2003, Actuate 8.0 e-reporting suite, UNIX, WebSphere.

Apr 2005 – Jun 2005 BI Developer

CADDIS, Sacramento, USA

CADDIS (California Developmental Disabilities Information System)

ShareCare is the software application that integrates all 21 Regional Centers across California, which provide services, medical care and personal attention to mentally and physically disabled persons.

The important modules include Fiscal, Consumer, Provider, Trust, Admin, etc. The application is developed using ColdFusion as the presentation layer, iPlanet as the application server, Oracle 9i as the resource manager and Actuate 6.0 as the reporting tool. The Department of Developmental Services (DDS) is committed to providing leadership that results in quality services to the people of California and assures the opportunity for individuals with developmental disabilities to exercise their right to make choices.

Responsibilities:

Fully involved in the design and implementation of CADDIS Reports from simple to complex.

Development of customized reports as per the business requirements.

Worked on domains like Trust, Fiscal, Provider and Provider of Services in CADDIS.

Created hand-coded and dynamic queries to be utilized by the Actuate reporting suite.

Created Dynamic Grouping and Sorting of the Reports.

Deployment of the reports onto the Actuate Report Server.

Overrode various reporting methods like OnRead, Fetch, Start, ObtainSelect, and OnRow to achieve the right functionality.

Worked on Sequential reports, Cross-tab reports, Conditional Reports.

Wrote stored procedures and dynamic SQL, and analyzed and tuned them for better performance.

Environment: Oracle 9i, TOAD, SQL*Plus, Windows NT (client), IBM-AIX, Actuate 6.0 e-reporting suite.
