Kuldeep Mandloi
Celina, TX,***** +1-612-***-**** *******.*******@*****.*** www.linkedin.com/in/kuldeep-mandloi
Profile
**-**** *********** *** *******-driven Data Engineer with primary expertise in PL/SQL development. I specialize in designing and optimizing complex database queries, stored procedures, and functions to support efficient data processing and reporting, with a proven track record of enhancing database performance, streamlining data workflows, and ensuring data accuracy across enterprise systems. Secondary skills in PySpark, Snowflake, Apache Airflow, and Informatica enable me to build and manage scalable ETL pipelines, orchestrate data workflows, and leverage AWS cloud services for distributed data processing and storage. I am passionate about combining deep SQL knowledge with modern big data and cloud technologies to create robust, automated data pipelines that drive business insights and support data-driven decision-making.
Skills & Abilities
1. Database & Programming Languages:
o Oracle SQL and PL/SQL
o Unix/Shell scripting (running Java daemons/stored procedures in batch)
o Python
o MS SQL Server T-SQL
2. Data & Analytics:
o Experience with OLTP and OLAP data processing
o Data Modeling
o ETL Processes (Informatica, PL/SQL, Snowflake)
o Data Migration
o Performance Tuning & Query Optimization
o Machine Learning & Data Science (training in Python and Big Data)
o Big Data Frameworks & HDFS
o PySpark
3. DevOps Tools:
o Git
o Bitbucket
o VSS, TortoiseSVN
4. Job Scheduling & Monitoring:
o AutoSys
o Control-M
5. DB Development Tools:
o Toad
o PL/SQL Developer
6. Data Reporting & Visualization:
o IBM Cognos
o Crystal Reports (9.0/10.0)
o Power BI
7. IDEs & Development Tools:
o PowerBuilder (8.0/9.0/10.0)
o MS Visual Basic
Certifications
· Oracle Cloud Infrastructure Foundations (OCI) Certified
· AWS Certified Cloud Practitioner
· Oracle Certified Associate (OCA)
· Master of Data Science (Simplilearn)
Experience
CONNECTEDX INC. 03/2021 - PRESENT
Clients – Toyota Motors NA (Mar 2021 - May 2021),
Anthem (Jun 2021 - Jan 2023),
AIG/Corebridge Financial (Feb 2023 - Present)
Global Capital Market – IT Separation project
Role: PL/SQL Developer specializing in ETL processes
Primary Responsibilities:
o Design, develop, and optimize PL/SQL-based ETL workflows
o Extract, transform, and load data into downstream applications and data warehouses
Key Tasks:
o Write complex PL/SQL queries and stored procedures to extract data from multiple sources
o Apply business logic for data transformation
o Ensure data integrity throughout the ETL process
o Optimize and automate data loading processes
o Manage and automate AutoSys job schedules for data loads and report extraction with Unix shell scripts
Performance:
o Tune queries and transformations for efficient handling of large datasets
o Build reusable Oracle pipelined functions to generate data extracts in required formats (see the sketch after this section)
o Ensure timely and accurate data delivery to reporting and transactional applications (e.g., Kyriba, Bloomberg, Refinitiv)
Collaboration:
o Work with traders and business analysts to define requirements
o Continuously improve data processing and delivery performance
Error Handling and Monitoring:
o Implement error handling, logging, and monitoring for AutoSys jobs
o Ensure seamless data flow across systems
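A minimal Python sketch of the extract pattern described above, using python-oracledb to call a hypothetical reusable pipelined function (connection details and all object names are assumptions for illustration, not the client's actual objects):

import oracledb

def fetch_extract(dsn, user, password, as_of_date):
    # Connect to the Oracle source and query a pipelined table function
    # (extract_pkg.position_extract is a hypothetical name).
    with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
        with conn.cursor() as cur:
            cur.execute(
                "SELECT * FROM TABLE(extract_pkg.position_extract(:as_of))",
                as_of=as_of_date,
            )
            columns = [d[0] for d in cur.description]
            # Return rows as dictionaries, ready for formatting into the
            # downstream extract layout (e.g., Kyriba or Bloomberg feeds).
            return [dict(zip(columns, row)) for row in cur.fetchall()]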
JAVAH ETL Redesign
Project Overview: The JAVAH system pulls data on traders' trade-booking activity from various sources
Role: Developed and managed an automated ETL pipeline using PySpark
Data Sources Involved: CSV, JSON, Parquet, JDBC (MySQL), S3
Responsibilities:
o Design data ingestion workflows for both batch and real-time data
o Apply transformations such as filtering, aggregation, and joins for data cleaning and enrichment (sketched below)
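A minimal PySpark sketch of the batch ingestion and enrichment pattern described above (paths, hosts, and table names are hypothetical): read trades from Parquet on S3 and reference data over JDBC, then filter, join, and aggregate.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("javah_etl_sketch").getOrCreate()

# Batch sources: Parquet trade files on S3 and a MySQL reference table over JDBC
# (requires the MySQL JDBC driver on the Spark classpath).
trades = spark.read.parquet("s3a://example-bucket/trades/")
books = (spark.read.format("jdbc")
         .option("url", "jdbc:mysql://example-host:3306/refdata")
         .option("dbtable", "trading_books")
         .option("user", "etl_user")
         .option("password", "***")
         .load())

# Cleaning and enrichment: keep booked trades, join reference data, aggregate by desk.
daily_summary = (trades.filter(F.col("status") == "BOOKED")
                 .join(books, "book_id")
                 .groupBy("trade_date", "desk")
                 .agg(F.sum("notional").alias("total_notional"),
                      F.count("*").alias("trade_count")))

daily_summary.write.mode("overwrite").parquet("s3a://example-bucket/curated/daily_summary/")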
Automation & Scheduling:
o Implement automated job scheduling using Apache Airflow (see the DAG sketch below)
o Enable real-time analytics and periodic reporting with scheduled data loads and processing tasks
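A minimal Apache Airflow (2.x) sketch of the scheduling described above, with hypothetical task names and file paths: a daily DAG that runs the PySpark load and then a validation step.

from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="javah_daily_load_sketch",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Run the PySpark ETL job for the logical date, then validate row counts.
    run_etl = BashOperator(
        task_id="run_pyspark_etl",
        bash_command="spark-submit /opt/jobs/javah_etl.py {{ ds }}",
    )
    validate = BashOperator(
        task_id="validate_load",
        bash_command="python /opt/jobs/validate_counts.py {{ ds }}",
    )
    run_etl >> validate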
Data Integrity & Reliability:
o Integrate data validation and error-handling mechanisms
o Ensure data integrity and reliability across all sources
Outcome:
o Significantly improved operational efficiency
COGNIZANT TECHNOLOGY & SOLUTIONS - 09/2007 TO 09/2019
Clients – Wellington Management Company, Autodesk, Ameriprise Financial, JPMorgan Chase, CVS
Key Projects:
o Basel Credit Rules for Banks:
Developed PL/SQL procedures to calculate credit risk exposure
Calculated risk-weighted assets for Basel III compliance (a simplified illustration follows this project list)
o Corporate Quick Pay System:
Automated payment workflows using PL/SQL and shell scripts
Validated transactions and optimized payment processing
Integrated with external payment systems
o View Portfolio Reporting System:
Developed PL/SQL queries and reports (Cognos) for real-time investment portfolio insights
Utilized shell scripting to automate data extraction and job scheduling
Automation & Job Scheduling:
o Implemented job scheduling using Control-M and AutoSys to automate database jobs, with supporting Unix shell scripts
o Optimized system performance when handling large datasets
o Ensured timely execution of ETL processes and report generation
Data Integrity & Error Handling:
o Ensured data integrity across all processes
o Implemented error-handling mechanisms
Reporting:
o Used reporting tools (Power BI, Cognos) to support financial projects, creating dashboards and visualizations for budgeting, financial reporting, and trend analysis; focused on organizing data, building straightforward reports, and delivering clear financial insights to stakeholders
Collaboration:
o Worked with cross-functional teams to align solutions with business requirements
TATA CONSULTANCY SERVICES LTD - 02/2007 TO 09/2007
Client- ABN AMRO Bank
Project Overview: Designed and implemented a prototype for migrating data from Paradox files to an RDBMS (SQL Server, Oracle, etc.)
Process Involvement: Led the entire effort through the SDLC, from requirements gathering to deployment
Key Responsibilities:
o Data Migration Strategy:
Designed an efficient migration strategy
Connected PowerBuilder to the Paradox files using ODBC
o Development:
Developed PowerBuilder DataWindow objects to extract, transform, and load data into the RDBMS (the ODBC extract-and-load pattern is sketched after this project)
Ensured data consistency and integrity throughout the process
o Testing:
Performed unit, system, and performance tests to ensure smooth data transfer and minimal downtime
o Post-Deployment:
Provided ongoing support and optimizations to meet business requirements and data integrity standards
Outcome: Enabled seamless integration of legacy Paradox data into a modern database environment
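The migration itself was built with PowerBuilder DataWindows; this Python/pyodbc sketch (hypothetical DSNs, table, and column names) only illustrates the same ODBC extract-and-load pattern of reading from the Paradox source and inserting into the target RDBMS.

import pyodbc

# ODBC connections to the legacy Paradox files and the target SQL Server database.
src = pyodbc.connect("DSN=ParadoxFiles")
dst = pyodbc.connect("DSN=TargetSqlServer")

rows = src.cursor().execute("SELECT cust_id, cust_name, balance FROM customers").fetchall()

cur = dst.cursor()
cur.fast_executemany = True  # bulk insert; requires a driver that supports it
cur.executemany(
    "INSERT INTO dbo.customers (cust_id, cust_name, balance) VALUES (?, ?, ?)",
    [tuple(r) for r in rows],
)
dst.commit()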
FINTECH SOLUTIONS PVT LTD. - 05/2004 TO 01/2007
Project: EasySoft, a comprehensive software solution for SMEs
Key Responsibilities:
o Contributed to both new application development and production support
o Interface Design: Designed user interfaces for seamless interaction with other applications
o Coding & Testing: Developed and tested new features to ensure smooth system integration
o Feature Integration: Integrated new features within the existing system to maintain consistency
o Enhancements & Maintenance: Handled feature enhancements, maintenance tasks, and bug fixes
o Crystal Reports: Designed reports for business intelligence needs
Database Development:
o Developed stored procedures, functions, and cursors in MS SQL Server 2000 to support application functionality (see the sketch below)
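A minimal Python/pyodbc sketch of invoking such a SQL Server stored procedure over ODBC (the DSN, procedure, and column names are hypothetical); it is shown only to illustrate the call pattern, not how the EasySoft application itself invoked the procedures.

import pyodbc

conn = pyodbc.connect("DSN=EasySoftDB")                  # assumed ODBC DSN
cur = conn.cursor()
# ODBC call syntax for a stored procedure with one input parameter (hypothetical name).
cur.execute("{CALL dbo.usp_customer_balance (?)}", 1001)
for row in cur.fetchall():
    print(row.cust_name, row.balance)                    # assumed result columns
conn.close()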
Database Standardization:
o Utilized Redgate SQL Compare to standardize client databases
o Preserved custom changes while replicating standardized updates
o Ensured database consistency and integrity across multiple client environments
Education
Master of Computer Management (MCM) 2001-2003
Devi Ahilya University, Indore
Computer Science and Financial Management
Master of Computer Applications (MCA) 2005-2009
ICFAI Tripura, Distance Education
Computer application concept-building subjects: Data Structures, OOP, Web Application Components, DBMS
Bachelor of Science (B. Sc.) 1996-1999
Devi Ahilya University, Indore
Computer Science and Computer Maintenance