Padmavathy Thavaraj (Padma)
Contact: +1-848-***-**** e-mail: ********@*****.***
Summary of Qualifications
Versatile Database Professional and U.S. Citizen with full work authorization and eligibility for federal contracts and clearance. Expertise in designing, implementing, and optimizing scalable database solutions across relational, NoSQL, and vector-based systems in hybrid and cloud environments. Proven ability to architect robust data infrastructures, integrate AI-driven search capabilities, and ensure high availability in production environments. Skilled in root-cause analysis and performance troubleshooting, identifying query bottlenecks, locking issues, configuration errors, and system failures to ensure database stability and availability.
Professional Experience
IRS, Maryland, USA https://www.irs.gov/ (Feb 2022 – Sep 2025)
Data Platform Specialist - Contractor
Architected and executed data modernization strategies for the IRIS and IVES systems, implementing policies and procedures that enhanced data integrity, scalability, and compliance for the U.S. federal government.
Responsibilities:
●Architected and delivered a production-grade database re-architecture aligned with cloud architecture standards, incorporating optimized logical and physical schema designs, automated migration pipelines, and backward-compatible, resilient rollback strategies to ensure high availability and operational continuity.
●Performed comprehensive performance and capacity benchmarks on workloads exceeding 5M daily transactions, achieving a 30% reduction in downtime and a 25% decrease in operational risk through a validated migration playbook and deployment governance framework.
●Developed a generative AI–powered application enabling real-time anomaly detection, contextual alerting, event correlation, and predictive analytics, improving fraud detection accuracy by 40% and supporting proactive observability (see the vector-search sketch below).
●Led infrastructure and capacity planning initiatives for replica sets and sharded clusters across hybrid cloud and multi-region deployments, enabling 99.99% availability and scaling to support 10TB+ data growth annually.
●Developed stored procedures, triggers, materialized views, and user-defined functions to handle complex business logic at the database layer.
●Established and operationalized governance and compliance frameworks to ensure regulatory adherence, risk mitigation, and secure, scalable cloud and data platforms.
●Automated provisioning, configuration, and maintenance of database environments using Terraform, Python, and Shell scripts, integrated with CI/CD pipelines for repeatable, reliable deployments.
●Implemented a database observability stack with Prometheus and Grafana, creating real-time dashboards for query performance, replication lag, and resource utilization, enabling accurate capacity planning and SLA adherence (see the metrics sketch below).
●Architected a cost-optimized storage solution leveraging Amazon S3’s tiered storage and metadata indexing in the database, reducing long-term storage expenses by 40% while maintaining millisecond-level object reference retrieval (see the storage sketch below).
Environment: Oracle 12c, MongoDB 6.0, PostgreSQL 12.0, AWS RDS, CloudFormation, Terraform, OpenAI GPT (on-prem/FedRAMP), LangChain, RAG pipelines, vector embeddings, AWS EC2, S3, EKS, Mongo Shell, pgAdmin, pgVector, JavaScript, Python 3.x, Red Hat Linux 8.0, Bash shell scripts, Prometheus v2, Grafana v10
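A minimal sketch of the vector-search approach behind the anomaly-detection bullet above, assuming PostgreSQL with the pgVector extension; the DSN, table, column, and threshold are hypothetical placeholders, not the production schema.

import psycopg2

conn = psycopg2.connect("dbname=analytics user=app")  # hypothetical DSN

def nearest_known_patterns(txn_embedding, k=5):
    """Return the k known-fraud patterns closest to a transaction embedding."""
    with conn.cursor() as cur:
        cur.execute(
            """
            SELECT pattern_id, embedding <-> %s::vector AS distance
            FROM fraud_patterns        -- hypothetical table
            ORDER BY distance
            LIMIT %s
            """,
            (str(txn_embedding), k),
        )
        return cur.fetchall()

def is_anomalous(txn_embedding, threshold=0.25):
    # Flag the transaction when no known pattern is sufficiently close.
    neighbors = nearest_known_patterns(txn_embedding)
    return all(distance > threshold for _, distance in neighbors)

The <-> operator is pgVector's Euclidean-distance search, which keeps nearest-neighbor lookups fast enough for real-time alerting.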
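A minimal sketch of the metrics side of the observability bullet, using the standard prometheus_client library; the metric names, port, and lag collector are illustrative assumptions.

import time
from prometheus_client import Gauge, Histogram, start_http_server

QUERY_LATENCY = Histogram("db_query_latency_seconds", "Query latency", ["operation"])
REPLICATION_LAG = Gauge("db_replication_lag_seconds", "Replica lag behind primary")

def get_replication_lag():
    """Placeholder: a real collector would query the replica's status."""
    return 0.0

def observe_query(operation, fn, *args):
    # Time a database call and record it under the given operation label.
    with QUERY_LATENCY.labels(operation=operation).time():
        return fn(*args)

if __name__ == "__main__":
    start_http_server(9100)  # endpoint scraped by Prometheus; port is arbitrary
    while True:
        REPLICATION_LAG.set(get_replication_lag())
        time.sleep(15)

Grafana dashboards chart these series directly, which is what supports the capacity-planning and SLA views.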
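A minimal sketch of the tiered-storage bullet, assuming boto3 against S3 and a PostgreSQL metadata index; the bucket, table, and DSN are hypothetical.

import boto3
import psycopg2

s3 = boto3.client("s3")
conn = psycopg2.connect("dbname=analytics user=app")  # hypothetical DSN

def archive_document(doc_id: str, body: bytes) -> None:
    key = f"archive/{doc_id}"
    # STANDARD_IA trades higher retrieval cost for much cheaper long-term storage.
    s3.put_object(Bucket="archive-bucket", Key=key, Body=body,
                  StorageClass="STANDARD_IA")
    # Only the object reference lives in the database, so lookups stay at
    # millisecond latency while the payload sits in cheap storage.
    with conn, conn.cursor() as cur:
        cur.execute(
            "INSERT INTO document_index (doc_id, s3_key) VALUES (%s, %s)",
            (doc_id, key),
        )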
Charles Schwab Corp., Westlake, TX https://www.schwab.com/ (May 2021 - Feb 2022)
Principal Data Architect
Led data migration initiatives for the TD Ameritrade to Charles Schwab integration project, ensuring secure, compliant, and accurate transfer of sensitive financial data with zero disruption to business operations.
Responsibilities:
●Implemented migration playbook activities including schema mapping, transformation scripts, and testing, enabling a seamless transition of legacy systems.
●Designed and implemented CQRS (Command Query Responsibility Segregation) patterns that decoupled read/write operations, reducing query latency by 40%, enhancing system scalability, and supporting real-time analytics across mission-critical enterprise platforms (see the CQRS sketch below).
●Integrated Bash/Python scripts into CI/CD pipelines, enabling repeatable migration validation runs and achieving 100% compliance with internal migration SLAs (see the validation sketch below).
●Implemented materialized views to support near–real-time analytics on datasets exceeding 5TB, improving query performance by 50% and reducing system load on OLTP databases.
●Built data federation pipelines to combine and transform data from multiple sources, with the federated datasets providing workload isolation for long-running analytical queries.
Environment: SQL Server Management Studio 2017, MongoDB 5.0, JavaScript, Python 3.x, Confluent Kafka, Red Hat Linux 8.0, Bash Shell script, Windows 11, Virtual Machine
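A minimal sketch of the CQRS pattern referenced above: commands mutate the system of record and publish an event, while queries hit a separate read model. The store and event-bus objects are hypothetical placeholders.

from dataclasses import dataclass

@dataclass
class PlaceTrade:  # a command: the only way to mutate state
    account_id: str
    symbol: str
    quantity: int

class CommandHandler:
    def __init__(self, write_store, event_bus):
        self.write_store = write_store
        self.event_bus = event_bus

    def handle(self, cmd: PlaceTrade) -> None:
        self.write_store.insert_trade(cmd.account_id, cmd.symbol, cmd.quantity)
        # Publishing lets a projector update the read model asynchronously,
        # which is what decouples write throughput from query latency.
        self.event_bus.publish("trade_placed", cmd)

class QueryService:
    """Reads never touch the write store, so OLTP latency stays predictable."""
    def __init__(self, read_store):
        self.read_store = read_store

    def positions(self, account_id: str):
        return self.read_store.get_positions(account_id)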
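A minimal sketch of a migration-validation step of the kind wired into the CI/CD pipelines above: compare row counts and a cheap checksum between source and target before a run is signed off. The DSNs and table list are hypothetical, and CHECKSUM_AGG is SQL Server specific.

import sys
import pyodbc

def table_fingerprint(conn, table):
    # Row count plus an order-independent checksum of every row.
    cur = conn.cursor()
    cur.execute(f"SELECT COUNT(*), CHECKSUM_AGG(CHECKSUM(*)) FROM {table}")
    return cur.fetchone()

source = pyodbc.connect("DSN=legacy_source")  # hypothetical DSNs
target = pyodbc.connect("DSN=schwab_target")

for table in ["accounts", "positions", "transactions"]:  # illustrative list
    if table_fingerprint(source, table) != table_fingerprint(target, table):
        print(f"MISMATCH: {table}")
        sys.exit(1)  # a non-zero exit fails the pipeline stage
print("validation passed")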
Citi’s Global Consumer Bank, Texas https://www.citigroup.com (July 2019 - April 2021)
Principal Data Architect
The Customer Hub team is responsible for building a consolidated repository of customer and account relationships that supports a customer-centric view. This centralized Master Customer Data enables a holistic, consistent, real-time view of the customer base and serves as the authoritative source across all lines of business.
Responsibilities:
●Engineered a NoSQL data model from Oracle MDM using the Hackolade data-modeling tool, applying optimizations (regrouping, flattening, redundancy removal) to meet performance SLAs, and delivered scalable materialized views to support analytical and graph-based use cases.
●Analyzed replication latency and lag for Oracle GoldenGate and MongoDB replication, identifying bottlenecks and tuning configurations to ensure near real-time data synchronization across distributed systems (see the lag-check sketch below).
●Leveraged Striim to analyze and monitor replication performance across multiple databases, including Oracle GoldenGate and MongoDB, by capturing real-time Change Data Capture (CDC) streams.
●Optimized database performance through proactive tuning, capacity planning, and benchmarking, improving query efficiency and overall system reliability for high-volume workloads.
●Developed a data-driven strategy to increase revenue by analyzing customer behavior and feedback with Amazon Athena, using ad-hoc SQL queries on large datasets to enable fast insights (see the Athena sketch below).
●Architected a governed data lake leveraging AWS Kinesis for ingestion, DynamoDB for control metadata and checkpoints, and AWS Glue for cataloging, schema enforcement, and compliant data transformations (see the ingestion sketch below).
●Collaborated with security, data, and platform teams to enforce cloud governance and operational controls.
Environment: Oracle 12c, AWS Kinesis, AWS DynamoDB, AWS Glue, AWS Athena, Ab Initio, Hackolade 3.4, Python 3.x, Java 8.0, Confluent Kafka, Striim, Spring Boot framework, Red Hat Linux 8.0, Windows 10
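A minimal sketch of the MongoDB side of the lag analysis above, computing each secondary's lag behind the primary from replSetGetStatus; the connection URI is a hypothetical placeholder.

from pymongo import MongoClient

client = MongoClient("mongodb://customer-hub:27017")  # hypothetical URI
status = client.admin.command("replSetGetStatus")

# The primary's last applied operation time is the reference point.
primary_optime = next(
    m["optimeDate"] for m in status["members"] if m["stateStr"] == "PRIMARY"
)
for member in status["members"]:
    if member["stateStr"] == "SECONDARY":
        lag = (primary_optime - member["optimeDate"]).total_seconds()
        print(f'{member["name"]}: {lag:.1f}s behind primary')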
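A minimal sketch of an ad-hoc Athena query of the kind described above, using boto3; the database, table, and output location are hypothetical.

import time
import boto3

athena = boto3.client("athena")

qid = athena.start_query_execution(
    QueryString="SELECT segment, COUNT(*) FROM customer_feedback GROUP BY segment",
    QueryExecutionContext={"Database": "customer_hub"},
    ResultConfiguration={"OutputLocation": "s3://analytics-results/adhoc/"},
)["QueryExecutionId"]

# Athena is asynchronous, so poll until the query reaches a terminal state.
while True:
    state = athena.get_query_execution(QueryExecutionId=qid)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(2)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]
    for row in rows:
        print([col.get("VarCharValue") for col in row["Data"]])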
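A minimal sketch of the ingestion-and-checkpoint pattern from the data-lake bullet: records flow into Kinesis while consumer progress is checkpointed in DynamoDB so a restarted consumer resumes where it left off. The stream and table names are hypothetical.

import json
import boto3

kinesis = boto3.client("kinesis")
checkpoints = boto3.resource("dynamodb").Table("ingest_checkpoints")

def ingest(record: dict) -> None:
    # The partition key spreads records across shards by customer.
    kinesis.put_record(
        StreamName="customer-events",
        Data=json.dumps(record),
        PartitionKey=record["customer_id"],
    )

def save_checkpoint(shard_id: str, sequence_number: str) -> None:
    # Persisting the last-processed sequence number per shard is the
    # control metadata that makes recovery and replay auditable.
    checkpoints.put_item(Item={"shard_id": shard_id, "sequence": sequence_number})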
FIS (formerly SunGard Data Systems), Parsippany, NJ https://www.fisglobal.com (July 2003 – Dec 2017)
Senior Software Engineer
SunGard’s banking solution automates the transaction lifecycle for financial services institutions. Composed of best-of-breed business solutions for matching & reconciliation, exception processing, customer self-service, and liquidity management, the intelliSUITE of solutions promotes operational efficiencies that drive straight-through processing (STP) and help lower operational risk and costs.
Responsibilities:
●Participated in major phases of the product life cycle, from requirements analysis, prototyping, design, development, and testing through performance studies and client demos of intelliSUITE products.
●Developed SQL Test suites to enforce best practices for database development and executed static analysis tests.
●Performed data analysis to compile, analyze, validate, and model data, improving business-operation performance and increasing availability of the shared database serving both the web and desktop applications. Implemented, tested, and released all database scripts for every product upgrade.
●Maintained the audit log for all banking transactions.
●Managed quality assurance of the application and streamlined the QA process to increase efficiency and reduce new-product roll-out time. Established a defect root-cause-analysis (RCA) process using the 5 Whys technique to prevent production defects and deliver projects with high quality.
●Independently developed all COM components and all reports in the reporting module using Crystal Reports/Business Objects. Designed and maintained all transaction templates between the middle tier and the front end using XSD/XML.
Environment: Visual Studio .Net, VC++, TFS, COM+, C#, Crystal Reports .Net, XML, XSL, SQL Server Management Studio 2005/2012/2016, JSON, QTP, SQL Test, Bugzilla
Professional Expertise
IDE: IntelliJ IDEA, Eclipse, Visual Studio 2012/2019, Anaconda Navigator, JetBrains PyCharm
RDBMS: Oracle 12c, SQL Server Management Studio 2000-2017, PostgreSQL 12.0, DB2
NoSQL: MongoDB 3.4-7.0, AWS DynamoDB
Generative AI Tools & Frameworks: AWS Bedrock, LangChain, LlamaIndex, OpenAI API
LLMs & Fine-Tuning: OpenAI GPT, LLaMA
Languages: C, C++, C#, Java, Python
Environments: Windows 10, CentOS, Linux, Virtual Servers, Virtual Machine, AWS
Educational Qualification
B.S. in Computer Science