
Program Manager Test

Location:
Edison, NJ
Posted:
August 01, 2025


Seema Rajaram

Global Program Manager – Digital Platforms (Test, Validation, Compliance, Automation & AI Innovation)
Edison, New Jersey, USA | 848-***-**** | ************@*****.*** | LinkedIn

Summary

Strategic Global Program Manager specializing in digital platforms for Pharma, Healthcare, Biotech, and BFSI, with deep expertise in test management, validation, intelligent automation, and AI innovation. Proven success leading enterprise-scale digital transformation programs across GxP-regulated environments, accelerating compliance, quality, and operational efficiency. Adept at building global Centers of Excellence, integrating emerging technologies, and driving cross-functional alignment to deliver scalable, audit-ready solutions that support innovation and regulatory readiness across the pharmaceutical value chain.

Core Competencies

• Digital transformation test strategy, planning & execution

• End-to-end Test Program and portfolio management across regulated domains and digital platforms

• Product Visioning, Roadmap, Backlog Management

• Stakeholder engagement & change management

• Cross-functional global team leadership across QA, R&D, Regulatory, and IT; Agile product delivery management

• End-to-end test lifecycle management (SIT, UAT, Regression, Performance, Security, Automation)

• AI/ML model validation, including bias, drift, and explainability; risk-based testing & defect triage

• Metrics-driven reporting & KPI Monitoring

• Automation frameworks (Selenium, Tosca, Cypress, UFT)

• CI/CD integration, Risk Assessment & Prioritization

• COE Development & Mentorship (Test, Validation, AI)

• Agile delivery management (Scrum, Kanban, SAFe)

• Requirements elicitation, user story development, backlog prioritization, sprint planning, and sprint retrospectives

• Data integrity, lineage, and reconciliation testing

• Big data platform validation (Spark, Hadoop, Snowflake)

• Data-driven automation & data analytics validation

• Business intelligence validation & dashboard delivery

• Validated System Lifecycle Expertise (GxP, 21 CFR Part 11, GDPR, GAMP 5, HIPAA, ISO, ICH compliance)

• CSA/CSV documentation, Inspection & audit readiness

• Global health authority alignment (FDA, EUDAMED)

• Compliance-Driven Automation (RPA/AI platforms)

• Ethical AI Governance (EU AI Act, NIST AI RMF & OECD)

• Interface validation (ERP, CRM, AWS, and SaaS platforms)

• Global Stakeholder Alignment and Communication

Professional Experience Summary

• Lead end-to-end test planning and execution across digital platforms, AI/ML systems, and enterprise applications supporting clinical, regulatory, manufacturing, and commercial functions. (Agile delivery)

• Align testing objectives with digital transformation goals and regulatory milestones (FDA, EUDAMED system readiness), ensuring AI models and digital tools meet quality and compliance standards.

• Chair test governance forums to drive decision-making, risk mitigation, and continuous improvement in AI validation and digital QA.

• Coordinate testing across R&D, Clinical Ops, Regulatory Affairs, CMC, and Commercial teams, ensuring AI-enabled systems are validated against real-world use cases.

• Manage external vendors, CROs, and digital partners to ensure deliverables meet AI validation protocols, data privacy standards, and regulatory expectations.

• Serve as the primary QA liaison for internal stakeholders and external auditors, ensuring transparency and audit readiness.

• Develop and maintain validation protocols, Validation Test Strategy, data integration strategy, test plans, traceability matrices, and validation/Test summary reports for digital platforms, cloud apps, and AI intelligent automation tools.


• Oversee End-to-end product test management that includes System Integration Testing (SIT), User Acceptance Testing (UAT), and AI model validation, leveraging automation frameworks and synthetic data generation.

• Identify and mitigate risks related to AI bias, data drift, model explainability, and compliance gaps using predictive analytics and risk-based testing (a drift-check sketch follows this list).

• Ensure testing activities comply with GxP, 21 CFR Part 11, GDPR, HIPAA, and emerging AI governance frameworks.

• Manage test data and environments, tools, and resources across hybrid cloud ecosystems, integrating platforms like Selenium, Tosca, ValGenesis, JIRA/Xray, and Azure DevOps.

• Drive adoption of AI-assisted testing tools, RPA, and low-code/no-code platforms to accelerate validation cycles.

• Deliver real-time dashboards, test metrics, and AI performance insights to executive leadership and QA stakeholders.

• Capture lessons learned, promote knowledge transfer, and continuously refine test strategies to support evolving digital and AI landscapes.
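The drift and bias checks referenced in this list can be illustrated with a minimal sketch; this is not the validation framework used on these programs. It shows one common approach, a two-sample Kolmogorov–Smirnov test comparing a training-time feature distribution against recent production data. The 0.05 threshold and the synthetic data are assumptions for illustration only.

# Minimal data-drift check (illustrative; not the actual program framework).
# Assumes two numeric samples: a reference (training) feature and recent production values.
import numpy as np
from scipy.stats import ks_2samp

DRIFT_P_VALUE = 0.05  # assumed threshold; real projects tune this per feature and risk level

def check_drift(reference: np.ndarray, production: np.ndarray) -> dict:
    """Flag drift when the two-sample KS test rejects 'same distribution'."""
    statistic, p_value = ks_2samp(reference, production)
    return {
        "ks_statistic": round(float(statistic), 4),
        "p_value": round(float(p_value), 4),
        "drift_detected": bool(p_value < DRIFT_P_VALUE),
    }

if __name__ == "__main__":
    rng = np.random.default_rng(42)
    train_feature = rng.normal(loc=0.0, scale=1.0, size=5_000)
    prod_feature = rng.normal(loc=0.3, scale=1.2, size=5_000)  # deliberately shifted
    print(check_drift(train_feature, prod_feature))

In practice a check like this would run per feature on a schedule, with results fed into the risk-based test reporting described above.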

Education and Training

• Stanford University, USA – AI-Driven Leadership & Data Science, 04/2025

• Anna University, India – Bachelor of Information Technology, 04/2007

Applications and Tools Proficiency

• ERP Systems: SAP ERP, SAP HANA, Salesforce, PeopleSoft, Workday Cloud (HCM, Payroll), Oracle Fusion, Oracle CRM

• UDI & Regulatory Platforms: Single Source UDI, FDA, EUDAMED, Veeva Vault; Service Management: ServiceNow

• Integration Platforms/Validation: MuleSoft, TIBCO, BODS, ESB, iPaaS, Informatica, Beyond Compare, PyCharm Professional

• AI Platforms: IBM watsonx (AI, Data, Governance), NLP/ML, LLM, RAG, Gen AI, AI Agents & Chatbots, Figma (eTMF)

• API & Web Services: API testing (REST/SOAP – SoapUI, ReadyAPI, TestComplete), Informatica (DB/SQL), Informatica TDM, IBM InfoSphere

• Analytics & BI: OBIEE, Tableau, QuerySurge, Power BI; IBM Mainframes (COBOL, JCL, REXX, CICS, IBM DB2)

• Pharmacovigilance: Argus, InForm EDC; Clinical Operations: CTMS, CDMS, EHR, Patient DB, PDM, PEARL, Product DB

• Automation & Functional Testing: Selenium, Applitools, UFT (QTP), HP ALM/QC, ValGenesis, LoadRunner

• Test Orchestration: qTest, TFX; Test Data Management: Informatica TDM, IBM InfoSphere

• Mobile Testing: Mobile Center, LoadRunner, JMeter, and Appium; Robotic Process Automation: Blue Prism

• Atlassian Suite: Jira, Confluence, Xray; Veeva QMS; Microsoft Office products (365, Power BI, Copilot, Excel, PowerPoint, etc.)

• Compliance: HIPAA, GxP, GDPR, FDA, EUDAMED, GAMP 5, ICH GCP, 21 CFR Part 11, NIST AI RMF, EU AI Act, ISO/IEC 42001

Domain Expertise:

• Life Sciences (LSH): Pfizer, J&J, LabCorp, Gilead Sciences; Biotech: Genmab; Healthcare: Aetna

• Banking, Financial Services & Insurance (BFSI): TD Bank, MasterCard, MetLife, IBM

Certifications:

• ISTQB CSTE, ASTQB CT-AI, ISQI AIML, IBM CDS, PMP, PgMP, ASM, CSPO, IBM DB2, Workday HCM, SAP BO, PMI-RMP, CRCM, CCEP, AIGP, CAP, CMDCP, SAP LS, PMI-CPMAI

Accomplishments:

Awards and Events:

• Business Insider: Honored with 2024 Global Recognition Award for Pioneering Efforts in Product Authenticity and Safety

• Served as a judge for the 2025 Stevie Awards in the New Innovation Tech category

• Selected as an expert judge for the 2025 Global Recognition Awards (GRA), contributing to AI technology

Scholarly Article Publications:

• AI's Promising Role in Adverse Event Management of Small Molecule Drugs

• AI-Driven Solutions Transform Patient Identification in Clinical Trials

• Implementing AI-driven digital transformation in bioanalysis


Experience

DIGITAL TRANSFORMATION PROGRAM MANAGER (TEST, AUTOMATION & AI INNOVATION), MONTEFIORE, 01/2025 – Current

• Defined and managed the overall test strategy covering functional, integration, regression, UAT, performance, mobile, and compliance testing.

• Management of budgeting, resource hiring, training, allocation, and agile (Scrum, Kanban & SAFe) delivery.

• Aligned with program leadership, IT governance boards, regulatory/compliance teams, and business owners.

• Work closely with business stakeholders, product owners, and vendors to understand the program testing scope.

• Established a Test Governance Model: entry/exit criteria, defect thresholds, escalation procedures, quality KPIs.

• Defined a test plan that covers all critical SAP-to-Workday migration touchpoints: Core HCM & Payroll, Finance, Time Tracking / Benefits, mobile workflows, custom integrations (APIs, middleware, ETL tools), AI-powered self-service bots or analytics, and reporting (Workday Prism, SAP BI decommissioning).

• Ensured planning includes validation of Workday integrations with external systems (e.g., banks, benefits providers, compliance tools).

• Ensured test cases and documentation comply with applicable regulations: SOX, GDPR, HIPAA, FDA 21 CFR Part 11, depending on industry

• Collaborated with internal audit, legal, and risk teams to ensure traceability and audit readiness.

• Maintained complete test documentation artifacts for regulatory inspections.

• Oversee environment planning for Workday tenants (sandbox, preview, production) and SAP systems.

• Planned secure and compliant test data migration strategies (especially for personal, payroll, and financial data).

• Work with data governance to mask or de-identify sensitive data as needed.

• Lead a cross-functional testing team (QA engineers, test analysts, automation engineers, business testers).

• Oversee execution of end-to-end business process testing (hire to retire, procure to pay, record to report), data reconciliation testing (pre/post-migration validation; an illustrative sketch follows this section), mobile UX testing (across platforms/devices), and AI features testing (e.g., Workday Skills Cloud or ML-powered insights).

• Validate Workday Studio or Workday Extend integrations, middleware layers (e.g., MuleSoft, Dell Boomi, SnapLogic), and downstream SAP system dependencies.

• Ensure robust testing of APIs, file-based integrations (EIBs), and real-time data syncs.

• Define and lead the test automation strategy for: Regression testing (Workday releases are frequent), Data validation and ETL between SAP and Workday, APIs, RPA bots (if applicable)

• Work with tools like Worksoft Certify, Selenium, Tricentis Tosca, or Katalon (if supported).

• Design and lead a structured UAT process for HR, Finance, Compliance, and Business teams.

• Train users on Workday features, new workflows, and AI-based UI interactions; capture sign-offs and manage change control across global business units.

• Validate Workday Prism and Adaptive Insights reports, mobile workflows (especially self-service apps), and AI/ML insights surfaced in dashboards (skills matching, trend forecasting).

• Implement centralized defect triage processes using tools like JIRA, Azure DevOps, ServiceNow, or ALM.

• Provide real-time test dashboards, progress tracking, and Go/No-Go quality metrics.

• Escalate risks and blockers to the PMO and executive sponsors.
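The pre/post-migration data reconciliation mentioned in the list above can be illustrated with a small pandas sketch. The file names, the employee_id key, and the compared columns are assumptions for illustration, not the program's actual extracts; the real reconciliation covered many more objects and ran in the governed, masked test environment.

# Illustrative pre/post-migration reconciliation between an SAP extract and a Workday report.
# File names and column names are placeholders.
import pandas as pd

KEY = "employee_id"
COMPARE_COLS = ["legal_name", "hire_date", "base_salary"]  # assumed attributes

def reconcile(sap_csv: str, workday_csv: str) -> pd.DataFrame:
    sap = pd.read_csv(sap_csv, dtype=str).set_index(KEY)
    wd = pd.read_csv(workday_csv, dtype=str).set_index(KEY)

    # Record-level completeness checks.
    print("Records missing in Workday:", len(sap.index.difference(wd.index)))
    print("Unexpected records in Workday:", len(wd.index.difference(sap.index)))

    # Field-level comparison for records present on both sides; an empty result means they reconcile.
    common = sap.index.intersection(wd.index)
    return sap.loc[common, COMPARE_COLS].compare(wd.loc[common, COMPARE_COLS])

if __name__ == "__main__":
    mismatches = reconcile("sap_extract.csv", "workday_report.csv")
    mismatches.to_csv("reconciliation_mismatches.csv")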

DIGITAL TRANSFORMATION PROGRAM LEAD (TEST/VALIDATION/AUTOMATION), J&J, 11/2022 – 12/2024

• Lead global digital transformation programs supporting UDI/DI and traceability initiatives across medical device portfolios.

• Define and execute multi-year roadmaps integrating AI, automation, and digital validation strategies aligned with regulatory and business goals.

• Oversee validation of GxP-regulated systems, including AI/ML-enabled platforms, ensuring compliance with global Health Authority requirements (FDA, EU MDR, etc.).

• Implement risk-based testing strategies and digital validation frameworks.

• Drive adoption of AI/ML for predictive quality, automated testing, and regulatory document generation.


• Establish governance for Good Machine Learning Practices (GMLP) and lifecycle validation of AI systems in regulated environments.

• Lead global implementation of UDI/DI systems and integration with regulatory databases (FDA GUDID, EU EUDAMED).

• Ensure end-to-end traceability across supply chain, manufacturing, and post-market surveillance systems.

• Collaborate with cross-functional teams (RA, QA, IT, Supply Chain) and external vendors to ensure successful delivery and adoption.

• End-to-end product delivery management, status reporting, and management of stakeholder, vendor, Quality, Regulatory Affairs, and delivery teams (Dev, BA, SIT, UAT, Automation, etc.).

• Manage program budgets, KPIs, and executive reporting to ensure alignment with strategic objectives.

• Ensure audit-ready documentation, data integrity, and lifecycle traceability across all digital platforms.

• Support internal and external inspections by regulatory bodies through robust validation and quality records.

Digital and AI Global Program Test Manager Responsibilities:

• Develop and define overall testing roadmap, program strategy, approach for AI/Digital product releases (Agile).

• Oversee the testing and validation of AI/digital machine learning models, ensuring accuracy, reliability, regulatory compliance, and complete validation deliverables.

• Coordinate with the regional deployment (Americas, Europe, and Asia) test managers/Leads, and internal development / 3rd Party technical teams globally, across all types of testing initiatives and phases.

• Understand and report upon testing status for each testing phase, representing internal development and third-party development and testing progress.

• Partner closely with regional testing managers to ensure testing execution and phase gates are met (Entry-Exit) between test activities.

• Prepare overall weekly Global Test Summary Status Report for all types of testing initiatives and phases (Product, System Integration, End to End, Performance, Penetration / Security, User Acceptance and RPA Automation Testing).

• Gather approvals and provide input for “Go / No Go” decisioning for testing and deployment.

• Ensure release management and version control maintenance occurs for each testing environment i.e.: knowledge of release contents including defect fixes, coordinate delivery to the proper environment, manage deployments during correct deployment windows, supervise environment maintenance.

• Own and distribute communications related to environments / releases (outages, maintenance, deployment, releases) where appropriate.

• Ensure that all testing environments are ready and stable, and there are no testing equipment / environment blockers.

• Communicate testing information to all stakeholders (program leadership, 3rd Party partners, test leads, development teams, and testers) so they can plan their re-tests (provide deployment notice / release notes / hot fixes - includes bug fixes, CRs, etc.).

• Direct regional testing managers to ensure testing updates are actively managed in test management tools such as Atlassian Jira, Xray, and HP ALM.

• Ensure program testing artifacts, such as the Testing Environment Diagram, Data Diagrams, and Test and Validation Deliverables, are developed, reviewed, and approved.

Testing COE Lead Responsibilities:

• Strategy and Planning: Develop and execute the overall test COE strategy aligned with the organization's goals and objectives. Define the roadmap and long-term vision for the COE, identifying areas of improvement and opportunities for innovation.

• Test Process Definition and Improvement: Establish standardized test processes, methodologies, and best practices within the COE. Continuously evaluate and enhance these processes to drive efficiency, quality, and effectiveness in testing activities.

• Test Governance: Implement governance frameworks and guidelines to ensure adherence to testing standards, policies, and regulatory requirements. Define and monitor key performance indicators (KPIs) to measure the COE's performance and effectiveness (a KPI roll-up is sketched after this list).


• Resource Management: Manage the allocation and utilization of testing resources, including test analysts, engineers, and other team members within the COE. Ensure proper skill development, training, and career growth opportunities for the team.

• Test Tool Evaluation and Adoption: Identify, evaluate, and recommend appropriate testing tools, frameworks, and technologies to support the COE's objectives. Lead the adoption and implementation of these tools, ensuring they align with the organization's needs and provide value.

• Collaboration and Stakeholder Management: Foster collaboration and strong working relationships with stakeholders, including project managers, business analysts, developers, and other teams involved in the software development lifecycle. Act as a trusted advisor and provide guidance on testing-related matters.

• Test Automation and Continuous Improvement: Drive the implementation of test automation strategies and frameworks to improve test efficiency and effectiveness. Promote a culture of continuous improvement within the COE, encouraging innovation and the adoption of emerging testing practices.

• Quality Assurance and Risk Management: Ensure that appropriate quality assurance processes and practices are in place to mitigate risks and deliver high-quality software solutions. Identify and address potential risks and issues that may impact testing activities or project outcomes.

• Training and Mentoring: Provide training, coaching, and mentoring to team members within the COE to enhance their testing skills, knowledge, and professional growth. Foster a learning culture and promote knowledge sharing within the team.
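As an illustration of the KPI monitoring noted in the governance item above, the sketch below rolls up a pass rate and a simple defects-per-executed-case figure from a test-execution export. The CSV file and column names are assumptions; the actual COE dashboards were built in the test-management and BI tools listed elsewhere in this resume.

# Illustrative COE quality-KPI roll-up from a test-execution export
# (assumed columns: module, status, defects, executed).
import pandas as pd

def coe_kpis(results_csv: str) -> dict:
    df = pd.read_csv(results_csv)
    executed = df[df["executed"] == True]  # assumes a boolean 'executed' column
    pass_rate = (executed["status"] == "Passed").mean() * 100
    defect_density = df["defects"].sum() / max(len(executed), 1)
    return {
        "cases_planned": len(df),
        "cases_executed": len(executed),
        "pass_rate_pct": round(float(pass_rate), 1),
        "defects_per_executed_case": round(float(defect_density), 2),
    }

if __name__ == "__main__":
    print(coe_kpis("test_execution_export.csv"))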

Global UAT Manager for UDI - Key Responsibilities:

• Manage the End-to-End Testing strategy for the STUDIO Global UDI Program based on input from various business product owners and application owners. (multiple sources to Staging to Target SaaS submission tools & BI Reports)

• Design the UDI program administration structure to support defining the stories, risk assessment, manage the E2E UAT testing plan, resource plan, schedule, data readiness, Testing deliverables, Test execution, Defect management and Test Closure.

• Working with global (Americas, EU, Asia, etc.) deployment leads, define the E2E UAT testing scope and the overall validation test protocol/plan, strategy, and scripts for all operating companies and markets or Health Authorities (HA).

• Initiate review and approval workflows to get the validation testing (SIT/UAT/E2E) deliverables, such as test plans, scripts, and the Requirements Traceability Matrix, digitally signed off (Vault, JIRA, HP ALM) by the Business Product and Application Owners and the Quality Team (TQ/IT QA).

• Manage the E2E dry-run testing and defect cycles to identify all system, integration, functional, data (completeness and accuracy based on business mapping rules), design/configuration, and environment failures and requirement gaps ahead of formal testing, avoiding unnecessary documentation effort and reducing the defect fall-out rate in UAT/pre-prod testing.

• Communicate daily with all teams (data teams, application development, SIT, UAT) to ensure the E2E connections and functionality are working and stable.

• Make Go/No-Go decisions on production deployment readiness based on the UAT test report (defined acceptance criteria versus actual results); an exit-criteria check is sketched after this list.

• Host the daily/weekly meetings and defect triage calls and share status reporting with the leadership teams.
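The Go/No-Go decision point above can be expressed as a simple exit-criteria check. The thresholds below are placeholders for illustration; in practice the acceptance criteria were defined with the business product owners and the quality team and captured in the UAT test report.

# Illustrative Go/No-Go evaluation against assumed UAT exit criteria.
from dataclasses import dataclass

@dataclass
class UatSummary:
    pass_rate_pct: float
    open_critical_defects: int
    open_major_defects: int
    requirements_covered_pct: float

# Assumed thresholds; real gates are agreed with business and quality owners.
EXIT_CRITERIA = {
    "min_pass_rate_pct": 98.0,
    "max_open_critical_defects": 0,
    "max_open_major_defects": 2,
    "min_requirements_covered_pct": 100.0,
}

def go_no_go(s: UatSummary) -> str:
    checks = [
        s.pass_rate_pct >= EXIT_CRITERIA["min_pass_rate_pct"],
        s.open_critical_defects <= EXIT_CRITERIA["max_open_critical_defects"],
        s.open_major_defects <= EXIT_CRITERIA["max_open_major_defects"],
        s.requirements_covered_pct >= EXIT_CRITERIA["min_requirements_covered_pct"],
    ]
    return "GO" if all(checks) else "NO GO"

if __name__ == "__main__":
    print(go_no_go(UatSummary(99.1, 0, 1, 100.0)))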

AI and Digital Computer System Validation Testing Responsibilities:

• Analysis of the requirements and identification of the scope for functional, user, data integration, data migration, and end-to-end computer system validation testing by participating in the release sprint grooming sessions and release-notes walkthroughs.

• Define the overall testing and data strategy for AI and Machine Learning projects, including the selection of Testing methodologies, tools, and frameworks.

• Author verification Test Protocol and update the testing scope in the Validation Master Test plan.

• Create test estimation, test preparation and test execution plan for all the verification and validation activities in the GXP environment.

• Build the user stories, verification test scenarios and verification scripts.

• Review all testing deliverables and ensure all documents adhere to quality and compliance standards (GxP).


• Upload all project working and approved validation and verification document versions into project repositories such as SharePoint and TruVault locations.

• Initiate the review and approval process with the Technical Application Owner, Business Product Owner, and Independent Quality (TQ) team, and gain sign-off (e-signature) for all verification documents such as the verification test protocol, test summary report, change management records, and memos in TruVault (Master Signature Book), and for verification scripts, post-execution evidence, and defects in JIRA, HP ALM, and qTest.

• Act as a JIRA /HP ALM admin to maintain all the functional, user, data, security and compliance requirements, business mapping documents, test scripts, test run ids and defects for all STUDIO UDI Digital Identification and Traceability (DI & T) project Releases.

• Verification of the validated QA environment readiness during smoke, dry run, and formal testing.

• Oversee the data validation process for AI and machine learning models to ensure they meet performance and accuracy requirements, and validate the integrity and accuracy of the data used for training and testing AI models.

• Identify the potential risks associated with the AI system, develop strategies to mitigate risks, conduct impact analysis to understand the potential risks of AI system failures.

• Ensure that AI models and the testing process comply with the relevant regulatory standards (GxP, GDPR, HIPAA, FDA, EUDAMED) and continuously review AI processes and tools to enhance efficiency and effectiveness.

• Manage test data: request and set up global and Health Authority-specific test data in the source systems and template files based on the requirements and business rules.

• Verification of the E2E data migration from SAP ERP to SaaS cloud applications (data, security) based on data mapping rules (for each franchise and OpCo for UDI US, using data-compare tools – Beyond Compare, PyCharm).

• Verification of the following end to end data integration workflow from the Source systems to Single Source digital cloud global application (Unique Device Identification - Medical Device and Model Submission Tool) in the validated GXP environment.

• Verification of data integration between Source systems, MDM, MBOX, DI DBB (Data integration - Data Back Bone), Rosetta Stone and Single Source application.

• Validate the data ingestion and publication feature by verifying the Single Source Application import functionality and logs from source systems (Manual – excel templates /Integration – xml files).

• Verify the Validation Process (Pass/Fail as per Mapping Rule) and Logs

• Compare the PDI and BUDI data attribute values that are added/updated/deleted from source to the Single Source Application based on the business mapping, using the Beyond Compare tool, the HA GUDE document, and Single Source Submission Tool Health Authority (HA)-specific rules for the US, EUDAMED, Asia, etc. (globally).

• Test the QA process workflow of the Submission to Health Authority and Health Authority (FDA, EUDAMED, NMPA etc.,) Acknowledgement and Logs.

• Verification of Active and Approved Return feed from Single Source Application to Data back bone (DI).

• Validate UDI reports and dashboards (Submission Aging, Audit, Ingestion, Publication) generated from the data ingested from source systems and published to Single Source through the data integration layer (Tableau, Power BI).

• Scheduling of jobs in the validated environment and Manual Run-on demand (i.e., EUDAMED ID generation)

• Execute SQL queries against the SAP and Oracle databases to ensure data completeness and correctness during data integration and migration testing.

• Verification of the Unique Device Identifier (UDI) and data restrictions/access of the Single Source Submission application for global data in the GXP environment.

• Perform user security testing to verify the data access and restrictions in the GXP application and reports.

• Manage test execution performed by the global UAT testing team, assist the business testing team with test and defect management in JIRA/HP ALM/qTest, prepare the production Go/No-Go deck, ensure all exit criteria are met, and escalate testing issues to the project and business leads.

• Assess the impact of blocking defects on the testing schedule and management of testing risks and mitigation plan.

• Certify that all test deliverables adhere to software quality and compliance standards; define and track quality assurance metrics.

• Manage Project Schedule, Host the Project Status meetings, Reports and Automation initiatives for all Single Source, Data Management and Reporting Releases.

• Actively work with the Project manager and prepare release cycles and schedules.


• Host the daily Scrum and weekly status meetings with the Business Product Owner, Technical Application Owner, Project Managers, Business Analysts, System Architects, and Development and Testing teams to discuss the current release status, issues, schedule, risks, and challenges of the testing.

• Report daily and weekly status reports, dashboards, defect reports, and test metrics to the project and business leads through email and meetings by running SQL queries against JIRA.

• Set up automated processes to generate the Requirements Traceability Report, statistics and metrics dashboards, roadmaps, and help desk management initiatives through the JIRA application (an illustrative REST-based pull is sketched after this list).

• Partnering with multi-vendor Business Analysts, Development, System, Regression, UAT, and Business testing teams across geographies and time zones to provide timely reporting on solutions.
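The JIRA-based reporting described above can be illustrated with a small REST call. The base URL, JQL filter, project key, and credentials are placeholders, not the actual program setup; the sketch only shows how an open-defect count can be pulled from Jira's standard search API for a status dashboard.

# Illustrative open-defect count pulled from Jira's REST search API (placeholder URL, JQL, auth).
import requests

JIRA_BASE = "https://example.atlassian.net"                      # placeholder
JQL = 'project = "UDI" AND issuetype = Bug AND status != Done'   # assumed filter
AUTH = ("user@example.com", "api_token")                         # placeholder credentials

def open_defect_count(jql: str) -> int:
    resp = requests.get(
        f"{JIRA_BASE}/rest/api/2/search",
        params={"jql": jql, "maxResults": 0},  # maxResults=0 returns only the total count
        auth=AUTH,
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["total"]

if __name__ == "__main__":
    print("Open defects:", open_defect_count(JQL))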

Senior Test Manager – Digital Platforms and Analytics, Genmab, 11/2021 – 11/2022

• Define the overall test strategy for QTC projects, including unit, integration, system, UAT, and regression testing.

• Identify and plan test activities across Salesforce CPQ, Billing, ERP (e.g., SAP/NetSuite), and CLM tools.

• Establish the test entry/exit criteria, test timelines, environments, and reporting cadence.

• Coordinate with DevOps for test data management and sandbox refreshes.

• Build and lead a cross-functional team of testers: functional testers, QA engineers, automation engineers, and business UAT testers.

• Allocate test execution responsibilities across modules (Product Config, Pricing, Quoting, Contracts, Billing, Revenue Recognition).

• Act as a bridge between QA, product managers, solution architects, and business stakeholders.

• Review and validate business requirements, user stories, and process maps to develop comprehensive test cases and scripts.

• Ensure coverage for edge cases, approval workflows, discounts, renewals, and downstream system updates.

• Develop test data sets for various sales scenarios (e.g., multi-currency quotes, bundled products, prorated billing, contract amendments).

• Oversee test case execution across QA, SIT, and UAT phases.

• Monitor defect triaging, resolution, and retesting cycles using tools like JIRA, Zephyr, ALM, or TestRail.

• Drive collaboration across vendors (e.g., Salesforce SI partners) for issue resolution.

• Support business teams in User Acceptance Testing, providing clear scripts, issue-tracking protocols, and training where needed.

• Ensure the UAT cycle validates quote accuracy, pricing rules, billing behavior, contract logic, and integration with ERP.

• Define strategy for automating regression suites, especially for CPQ pricing logic, quote generation, and approval workflows.

• Oversee test automation implementation (e.g., Selenium, Provar for Salesforce, Tosca); an illustrative Selenium check is sketched at the end of this section.

• Validate data integrity and synchronization across systems: Salesforce, ERP, billing, and financials.

• Test revenue schedules, credit memos, quote revisions, and cancellations to ensure correct downstream behavior.

• Ensure testing adheres to compliance and audit requirements: SOX, ASC 606/IFRS 15, and change management protocols.

• Maintain documentation for traceability and audit readiness (especially for pricing and revenue-impacting features).

• Report test progress, coverage, defect density, and quality KPIs to leadership.

• Use dashboards or tools to highlight readiness for Go-Live and areas of risk.

• Support cutover rehearsals, validate migrated configurations, and conduct smoke testing post-go-live.

• Set up hypercare support testing, logging issues and regression retesting as needed.
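As a minimal illustration of the CPQ regression automation referenced in this section, the pytest/Selenium sketch below checks that a quote's displayed total equals list price minus discount. The sandbox URL and element locators are placeholders; a production Provar or Tosca suite for Salesforce CPQ would cover pricing rules, approvals, renewals, and downstream billing far more broadly.

# Illustrative Selenium regression check for a quote screen; URL and locators are placeholders.
import pytest
from selenium import webdriver
from selenium.webdriver.common.by import By

QUOTE_URL = "https://sandbox.example.com/quotes/Q-1001"  # placeholder sandbox record

@pytest.fixture
def driver():
    drv = webdriver.Chrome()
    yield drv
    drv.quit()

def _money(element_text: str) -> float:
    return float(element_text.replace("$", "").replace(",", ""))

def test_quote_total_reflects_discount(driver):
    """Smoke-level check that the displayed total matches list price minus discount."""
    driver.get(QUOTE_URL)
    list_price = _money(driver.find_element(By.ID, "list-price").text)
    discount = _money(driver.find_element(By.ID, "discount-amount").text)
    total = _money(driver.find_element(By.ID, "quote-total").text)
    assert total == pytest.approx(list_price - discount, abs=0.01)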

GLOBAL PROGRAM MANAGER (TEST/VALIDATION/AUTOMATION), Pfizer, 12/2018 – 11/2021

• Define and execute end-to-end test strategies for regulatory systems supporting clinical trials, safety reporting, and global product submissions.

• Align testing efforts with simplification goals—reducing manual effort, cycle time, and compliance risk through automation and AI-driven validation.


• Lead SIT across interconnected platforms (CTMS, EDC, Argus, RIM, eCTD, labeling systems), ensuring seamless data flow and regulatory traceability.

• Validate the integration of AI/ML modules used in adverse event detection, submission readiness, and document classification.

• Coordinate UAT with regulatory affairs, clinical operations, and safety teams to validate real-world workflows and ensure usability.

• Implement AI-assisted UAT automation to accelerate feedback cycles and improve test coverage.

• Deploy intelligent automation frameworks (Selenium, Tosca, RPA) to streamline regression testing and compliance checks.

• Integrate AI tools for document parsing, metadata tagging, and predictive risk scoring in regulatory workflows.

• Validate AI outputs for explainability, bias, and audit readiness, ensuring alignment with evolving regulatory expectations.

• Ensure all testing activities comply with GxP, 21 CFR Part 11, EU MDR, and ICH guidelines.

• Lead Computer System Validation (CSV) efforts for AI-enabled and cloud-based regulatory platforms.

• Maintain audit-ready documentation, traceability matrices, and validation protocols.

• Deliver real-time dashboards and test metrics to stakeholders, highlighting automation ROI and AI performance.

• Capture lessons learned and drive continuous improvement in test strategy, tooling, and team capability.

Program Test Management


