
PL/SQL Developer

Location: Jersey City, NJ
Salary: $60/hour
Posted: June 16, 2025


Priyadharshini Manikandan

PL/SQL & Java Backend Developer

Email: *******************@*****.***

Phone: +1-551-***-****

Location: New Jersey, USA

PL/SQL and Java Backend Developer with 8+ years of experience in database development, backend service engineering, business application analysis, and business intelligence solutions. Adept at designing and optimizing stored procedures, implementing microservices, and delivering enterprise-level reporting and data visualization.

• PL/SQL Expertise (5+ years): Strong in writing stored procedures, functions, triggers, and packages using Oracle PL/SQL. Skilled in query optimization, performance tuning (SGA/PGA), analytical functions, partitioning, and using dynamic SQL with ref cursors (a brief sketch follows this summary). Experienced in handling JSON formats and in tuning Oracle distributed query performance (EXPLAIN PLAN, AUTOTRACE).

• Java Backend Development (2+ years): Proficient in building scalable RESTful APIs using Core Java and Spring Boot. Hands-on experience in JDBC, multithreading, exception handling, and integrating backend services with Oracle and SQL Server. Worked with JSON/XML parsing, Maven, Git, Jenkins, and CI/CD pipelines.

• Power BI Development (3+ years): Expertise in building interactive dashboards, modeling data, writing DAX queries, and optimizing Power BI performance. Experienced in advanced Power Query transformations (merge, append, unpivot), and publishing dashboards to Power BI Service.

• ETL and Data Integration: Experience in Informatica PowerCenter for designing and managing ETL workflows and data transformation. Automated SQL-based ETL workflows and integrated with Power BI for real-time analytics.

• Cloud & Modern Data Stack: Familiar with Snowflake for cloud data warehousing, writing optimized SQL queries, managing virtual warehouses, and integrating structured/semi-structured data. Hands-on experience with AWS and Oracle Cloud environments.

• Python & ML (2+ years): Proficient in data analysis and predictive modeling using Python (NumPy, Pandas, Matplotlib, Seaborn, Scikit-learn, XGBoost). Built ML models for forecasting, attrition, and fraud detection with 90%+ accuracy. Developed Gen AI-based applications using RAG methods.

• Agile & Collaboration: Strong exposure to Agile methodologies, stakeholder collaboration, cross-functional teamwork, and mentorship. Known for delivering value-driven solutions across financial and operational domains.
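
For illustration, a minimal sketch of the dynamic SQL / ref cursor pattern referenced in the PL/SQL bullet above; the procedure, table, and column names are hypothetical:

-- Illustrative only: assumes a hypothetical ORDERS table (order_id, status, order_date).
CREATE OR REPLACE PROCEDURE get_orders_by_status (
    p_status  IN  VARCHAR2,
    p_results OUT SYS_REFCURSOR
) AS
    v_sql VARCHAR2(400);
BEGIN
    -- Build the statement dynamically and bind the filter value when opening the cursor
    v_sql := 'SELECT order_id, status, order_date FROM orders WHERE status = :1';
    OPEN p_results FOR v_sql USING p_status;
END get_orders_by_status;
/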

EDUCATION

• Master of Science in Medical Bioinformatics, Sri Ramachandra Medical College, Chennai, India (2013 – 2015)

• Bachelor of Science in Biotechnology, Prince Sri Venkateshwara College, Chennai, India (2010 – 2013)

SKILLS:

Programming: Java (Spring Boot), Python

Databases & Query Optimization: Oracle PL/SQL, MySQL, stored procedures, indexing, SQL query tuning, partitioning

Data Analytics & BI: Power BI (DAX, data modeling), Tableau, Excel – expertise in data storytelling, statistical analysis, and building insightful dashboards

Automation & Process: Power Automate, CI/CD pipelines – experience in automating workflows

Data Analysis & Visualization: Power BI, Tableau, Matplotlib, Seaborn, Excel

Cloud Platforms: AWS (S3, Athena, Kinesis)

Data Engineering: Feature engineering, dimensionality reduction, data preprocessing – applied techniques to clean, structure, and enhance datasets

ETL & Data Integration: SQL-based ETL, Informatica PowerCenter – designed workflows for data extraction, transformation, and loading into data warehouses

Web Development: Django, Flask, FastAPI – built scalable web applications and REST APIs for data interaction and model serving

AI/ML Frameworks: TensorFlow, Scikit-learn, XGBoost – developed and tuned classification, regression, and forecasting models

LLMs & Gen AI: LangChain, LangGraph, LlamaIndex, RAG, agentic AI, prompt engineering, NLP (sentiment analysis, topic modeling, summarization, Q&A) – developed NLP pipelines for business text analysis and conversational AI

MLOps: Docker, Kubernetes, CI/CD pipelines – automated model deployment and lifecycle management

Vector Databases: ChromaDB, FAISS, Pinecone, Haystack – integrated vector search engines for semantic retrieval in AI-driven apps

Performance Metrics: ROC-AUC, precision-recall, cross-validation – used evaluation metrics to validate and compare ML model performance

WORK EXPERIENCE

Novalink Solutions, New Jersey, USA – PL/SQL and Java Developer (12/2022 – Present)
Client: Fiserv

Database Development & PL/SQL Programming

• Designed and optimized PL/SQL stored procedures, functions, and indexes, improving query performance by 40% through SGA and PGA tuning strategies.

• Created and managed tables, views, materialized views, and database triggers to ensure data integrity and audit compliance.

• Developed complex SQL queries, correlated subqueries, and dynamic SQL for business applications.

• Extensively used advanced PL/SQL features such as Records, Tables, Object Types, Ref Cursors, and Dynamic SQL.

• Performed database performance tuning using optimizer hints such as MATERIALIZE, PARALLEL, and NO_PARALLEL.

• Implemented robust error handling using PL/SQL exception handling for debugging and application stability (see the sketch at the end of this subsection).

• Developed and maintained Oracle Forms for transactional modules and administrative workflows.

• Integrated Oracle Forms with backend PL/SQL logic and triggers for real-time form validation and processing.
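
For illustration, a simplified version of the exception-handling pattern described in this subsection; the procedure, table, and column names are hypothetical, not taken from the client codebase:

-- Illustrative only: hypothetical payment-status update with basic error handling.
CREATE OR REPLACE PROCEDURE update_payment_status (
    p_txn_id IN NUMBER,
    p_status IN VARCHAR2
) AS
    e_txn_not_found EXCEPTION;
BEGIN
    UPDATE transactions
       SET status = p_status
     WHERE txn_id = p_txn_id;

    -- Treat "no rows updated" as an application-level error
    IF SQL%ROWCOUNT = 0 THEN
        RAISE e_txn_not_found;
    END IF;
EXCEPTION
    WHEN e_txn_not_found THEN
        RAISE_APPLICATION_ERROR(-20001, 'Transaction not found: ' || p_txn_id);
    WHEN OTHERS THEN
        -- Surface the underlying Oracle error with context for the caller
        RAISE_APPLICATION_ERROR(-20002, 'update_payment_status failed: ' || SQLERRM);
END update_payment_status;
/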

Backend Development with Java

• Designed and developed scalable Java backend services using Core Java, Spring Boot, and RESTful APIs for financial transaction processing.

• Built microservice components for authentication, user management, and transaction logging aligned with business logic.

• Integrated Java services with Oracle DB using JPA/Hibernate, optimizing query performance and resource usage.

• Performed unit and integration testing using Postman to validate and ensure API reliability.

• Maintained CI/CD pipelines using Jenkins and Maven for automated deployment and version control.

Frontend Integration using AngularJS

• Collaborated with AngularJS front-end team to build dynamic UI components linked with backend APIs.

• Developed reusable AngularJS components for forms, filters, and data visualization elements like tables and charts.

• Managed state, routing, and asynchronous data calls using AngularJS, improving overall UX.

ETL & Data Integration

• Designed and developed ETL workflows using Informatica PowerCenter to integrate data from diverse sources into Oracle and SQL Server.

• Scheduled and monitored ETL jobs to ensure timely data availability and high-quality reporting.

• Automated SQL-based ETL workflows connecting Oracle (sketched below).
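
A small sketch of the kind of SQL-based ETL step described above, assuming hypothetical staging and target tables:

-- Illustrative only: upsert rows from a hypothetical staging table into a target table.
MERGE INTO customer_dim tgt
USING customer_stg src
   ON (tgt.customer_id = src.customer_id)
WHEN MATCHED THEN
    UPDATE SET tgt.customer_name = src.customer_name,
               tgt.region        = src.region,
               tgt.updated_at    = SYSDATE
WHEN NOT MATCHED THEN
    INSERT (customer_id, customer_name, region, updated_at)
    VALUES (src.customer_id, src.customer_name, src.region, SYSDATE);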

Cloud Data Warehousing with Snowflake

• Created and managed Snowflake schemas, tables, and pipelines to support scalable cloud analytics.

• Implemented RBAC (Role-Based Access Control) and tuned queries in Snowflake for secure and efficient data operations.

• Integrated Snowflake with Power BI via SnowSQL, file formats, and external stages for real-time visual reporting.

Business Intelligence & Reporting (Power BI)

• Designed and published interactive Power BI dashboards to track financial KPIs, employee performance, and retention trends.

• Created data models, calculated measures, and optimized DAX queries for high-performing visualizations.

• Built complex Power BI reports with pie, bar, gauge, donut, waterfall charts, slicers, and filters.

• Performed data blending from SQL Server, Excel, CSV, and text sources using Power Query.

• Conducted advanced transformations including merging, pivoting, and conditional columns in Power Query.

• Published reports to Power BI Service (Pro) with scheduled refresh, sharing, and subscriptions.

Scripting & Data Analysis

• Developed and maintained Python scripts for ETL and data transformation using NumPy and Pandas.

• Collaborated with cross-functional teams on backend-frontend integrations and performance improvements.

Tools & Technologies:

Oracle 19c, Oracle 12c, Oracle SQL Developer, Toad, MySQL Workbench, Core Java, Spring Boot, REST API, JPA, Hibernate, Jenkins, Maven, Git, Eclipse, IntelliJ IDEA, AngularJS, Power BI, DAX, Power Query, Informatica PowerCenter, Snowflake, Python, NumPy, Pandas

BitMind Solutions, India – PL/SQL Developer / BI Analyst (07/2018 – 08/2022)
Client: South Asian Banks through D&B

• Designed and managed relational databases using PL/SQL, creating tables, views, indexes, stored procedures, and triggers.

• Developed stored procedures, functions, triggers, and packages to support business applications.

• Developed Java-based backend modules to interact with Oracle databases using JDBC, enhancing integration between UI dashboards and PL/SQL procedures.

• Created and managed materialized views and synonyms with invoker’s privileges to optimize data access.

• Ensured data integrity and audit compliance through structured database design.

• Built RESTful APIs using Java and Spring Boot to support real-time reporting and analytics solutions for internal dashboards and BIR generation workflows.

• Applied object-oriented design principles and exception handling in Java applications for data processing and automated business logic execution.

• Created utility classes in Java to automate SQL query execution and streamline recurring data extraction/reporting tasks.

• Developed and optimized complex SQL queries, correlated subqueries, and dynamic SQL for backend testing, data validation, and report generation.

• Improved query execution time by 15% through indexing, query optimization, and performance tuning techniques.

• Used BULK COLLECT and FORALL to enhance query performance and reduce context switching between the SQL and PL/SQL engines (see the sketch at the end of this role).

• Tuned PL/SQL packages and SQL queries using EXPLAIN PLAN, SQL Trace, and optimizer hints.

• Implemented exception handling for robust error management, debugging, and application stability.

• Designed, developed, and validated ETL processes for data extraction, transformation, and loading.

• Used SQL*Loader to load data from text files and spreadsheets into Oracle databases.

• Created and optimized indexes on tables for faster data retrieval, enhancing database performance.

• Extensively used optimizer hints to direct execution plans for optimal performance.

• Tuned PL/SQL packages and SQL queries, analyzing execution plans and creating indexes as needed.

• Provided database support for financial and operational reporting, ensuring data integrity.

• Developed and debugged PL/SQL code, creating, executing, and optimizing SQL queries.

• Supported production defect resolution and daily business transaction issues.

• Extensively used advanced PL/SQL features such as Records, Collections, Object Types, and Dynamic SQL.

• Designed and developed interactive Power BI dashboards and reports tailored to business needs.

• Created data models in Power BI, cleaning, transforming, and merging data from multiple sources (SQL Server, MS Access, Excel, CSV, and text files).

• Built data visualizations using DAX functions, including table functions, aggregations, and iteration functions.

• Utilized Power Query for advanced data transformation, including merging, appending, unpivoting, and conditional columns.

• Developed reports using bar charts, pie charts, donut charts, gauge charts, waterfall charts, maps, filters, and slicers.

• Migrated and optimized existing BIR reports into Power BI based on user requests.

• Conducted data analysis to uncover business insights, improving decision-making efficiency.

• Visualized client data using Tableau, Matplotlib, and Seaborn for better business understanding.

• Assisted in developing machine learning models for predictive analytics and AI-driven business intelligence.

• Processed BIR (Business Information Report) requests for South Asian clients, collaborating with D&B sister concerns.

• Regularly interacted with user groups to discuss requirements, updates, and performance improvements.

• Collaborated with stakeholders to refine data visualization strategies and improve reporting processes.

• Maintained daily MIS reports, trackers, and revenue reports for the regional sales manager.

• Trained and mentored a team of five analysts, enhancing SQL querying, Power BI usage, and data visualization skills.

• Provided guidance on data analytics best practices, leading to a 10% increase in team productivity.

• Onboarded and trained new team members, enabling them to become active contributors.

• Managed cross-functional collaboration with 57 bankers across the South Asian region.

Tools: Oracle 12c, Oracle 11g, Oracle SQL Developer, Toad, MySQL Workbench, Core Java, Spring Boot, Maven, Git, Jenkins, Power BI, Tableau, Excel, Jupyter
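
A minimal illustration of the BULK COLLECT / FORALL pattern referenced in this role; the table and column names are hypothetical:

-- Illustrative only: bulk-fetch matching IDs, then update them in one FORALL statement
-- to reduce context switching between the SQL and PL/SQL engines.
DECLARE
    TYPE t_id_tab IS TABLE OF accounts.account_id%TYPE;
    v_ids t_id_tab;
BEGIN
    SELECT account_id
      BULK COLLECT INTO v_ids
      FROM accounts
     WHERE status = 'PENDING_REVIEW';

    FORALL i IN 1 .. v_ids.COUNT
        UPDATE accounts
           SET status = 'REVIEWED'
         WHERE account_id = v_ids(i);

    COMMIT;
END;
/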

Dun & Bradstreet, India – Data Analyst (06/2017 – 06/2018)
Client: South Asian Banks

• Collaborated with multiple teams across processes and worked on the South Asia client service team, which managed 57 bankers across the South Asian region.

• Processed client requests for BIR reports (a D&B product), coordinating with various D&B sister concerns across the globe to deliver the reports.

• Researched and investigated data, handling different file formats such as CSV, Excel, JSON, and Parquet as well as MySQL tables.

• Analyzed different data formats using Python and Excel.

• Visualized collected data using Tableau, Matplotlib, and Seaborn (Python) for better insight into clients.

• Performed data validation and data referencing in MySQL databases.

• Trained new team members and helped them become active contributors to the team.

• Prepared daily MIS, tracker, and revenue reports for the regional sales manager.

Tools: MySQL, MySQL Workbench, Power BI, Excel.

CERTIFICATIONS:

• Base SAS 9
• AWS Machine Learning Associate
• Data Analytics using ML & AI
• Python for Data Science

REFERENCES:

• GitHub - https://github.com/PriyadharshiniManikanadan

• LinkedIn - https://www.linkedin.com/in/priyadharshini-manikandan/


