Introduction
While regulations are not the sole cause of delays in clinical trials, the frameworks that govern them play a significant role in shaping how efficiently studies are designed, conducted, and reviewed. When those frameworks become outdated or inflexible, they risk introducing unnecessary burdens and constraining innovation just when the industry needs it most.
Sponsors and CROs are understandably cautious in how they engage with regulatory authorities—but the clinical research industry is entering a period of rapid transformation. Whether we are ready or not, change is underway. Our shared responsibility is to ensure that such change is both constructive and aligned with the evolving needs of patients, researchers, and regulators.
In this paper, I outline a set of regulatory updates that could meaningfully enhance trial quality, reduce costs, and improve execution timelines. These proposals touch on terminology, data governance, structural reforms, and targeted updates to FDA and EMA procedures and guidance. For context, the paper cross-references 21 CFR Part 11 throughout.
Part I – Priority Regulatory Adjustments

1. Eliminate the Term ‘Source’
Current Problem:
The term ‘source’ has become a conceptual bottleneck in two interconnected ways. First, regulatory definitions of eSource—data initially recorded electronically—imply that such data holds a special quality status distinct from other digital records [FDA eSource Guidance, 2013]. Second, this distinction reinforces the continued use of Source Data Verification (SDV), a manual process of checking reported data against a presumed original record. These frameworks are based on assumptions from a paper-based world and are increasingly misaligned with current practice.
Impact:
The perceived need to distinguish between ‘eSource’ and other forms of digital data contributes to duplicated effort and a regulatory focus on form over function. Meanwhile, SDV has been shown in multiple studies to contribute little to actual data quality or trial integrity. Both concepts perpetuate outdated workflows that divert resources from risk-based, outcome-focused approaches.
Furthermore, the regulatory emphasis on the status of ‘source’ data has encouraged the false belief that data closer to its origin is inherently more reliable. In reality, ‘source’ data may be just as susceptible to errors or incompleteness as any other form of raw or unprocessed data. Rather than defining data quality by its origin, the goal should be to identify and govern the Authoritative Data Record (ADR), the consolidated and curated record of a datapoint, through structured, risk-based assurance processes that account for data context, integrity, and use.
I believe the proposal aligns with ICH E6(R3)’s modernised approach to data integrity and context.
Proposed Change:
Retire the term ‘source’ in regulatory contexts. Adopt the concept of the Authoritative Data Record (ADR), the consolidated and curated record of a datapoint, as the authoritative version of data used throughout a system. Update related references in ICH E6 and related guidance to reflect the data governance, ownership, versioning, and controlled access principles that govern digital data [ICH E6(R3), adopted Jan. 2025].

2. Recognise and Regulate Trusted Third Parties
Relevant Regulation: FDA 21 CFR Part 11, ICH E6(R2) §5.5, ICH E6 (R3)
Current Problem:
There is a false assumption that data must be physically siloed to maintain compliance and control, leading to duplicative systems and friction in data access.
Impact:
Stakeholders—including patients, investigators, and sponsors—experience operational inefficiencies, reduced access to data, and increased technology burden. In many cases, the physical siloing of data is not accompanied by appropriate system integration, leading to poor data quality, fragmented workflows, and reduced transparency. The assumption that regulatory control requires physical rather than logical separation is outdated. Often, these silos reside on the same technology infrastructure and are administered by in-house IT teams and/or service providers, weakening effective, risk-based oversight. Governance is shifting away from validated business logic toward infrastructure-level controls, reducing assurance and increasing reliance on processes that lack explicit validation or stakeholder transparency.
Proposed Change:
Formally define and recognise Trusted Third Parties (TTPs) capable of hosting validated, shared systems under appropriate regulatory governance. Introduce a certification framework that defines how trusted third party solutions are validated to confirm continued access to data across stakeholders, including after trial close-out or sponsor transition. TTPs should be explicitly empowered to support all stakeholders—patients, sites, and sponsors—beyond the control of any single organisation. For example, patients should retain access to their trial data post-study, where appropriate and consented. Investigator sites should be able to interoperate with validated platforms across multiple sponsors and studies, without a single sponsor being able to unilaterally revoke access beyond the confines of the specific trials they sponsor.
The framework must include appropriate ongoing financial support for TTPs, so that all stakeholders, regardless of financial means, can access information within the scope of the retention period.

3. Embrace Modern Notions of Certified Copy
Relevant Regulation: FDA 21 CFR Part 11 §11.10(e), §11.100
Current Problem:
The concept of a “certified copy” has not evolved with technology, resulting in a continued reliance on physical or manual signatures and print-equivalent attestations. Current guidance continues to treat the ‘document’ as the primary manifestation of information.
Impact:
This outdated model inhibits the use of scalable, cryptographically signed digital records that are more secure and traceable than paper.
Proposed Change:
Amend regulatory definitions to include digitally attested records (as opposed to documents) as certified copies. Introduce criteria for digital certification based on checksum, encryption, and audit logging to ensure authenticity and traceability.
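To make the proposal concrete, the attestation criteria above (checksum, signer identity, audit logging) can be sketched in a few lines. This is an illustrative example only, not a prescribed implementation; all function and field names are hypothetical, and a production system would add cryptographic signing and key management on top of the checksum.

```python
import hashlib
import json
from datetime import datetime, timezone

def attest_record(record: dict, signer_id: str) -> dict:
    """Produce a digital attestation for a record: a SHA-256 checksum
    plus signer identity and timestamp, suitable for an audit log."""
    payload = json.dumps(record, sort_keys=True).encode("utf-8")
    return {
        "checksum_sha256": hashlib.sha256(payload).hexdigest(),
        "signed_by": signer_id,
        "signed_at": datetime.now(timezone.utc).isoformat(),
    }

def verify_copy(copy: dict, attestation: dict) -> bool:
    """A copy qualifies as 'certified' if its checksum matches the attested original."""
    payload = json.dumps(copy, sort_keys=True).encode("utf-8")
    return hashlib.sha256(payload).hexdigest() == attestation["checksum_sha256"]

record = {"subject": "001-023", "visit": "Week 4", "sbp_mmhg": 128}
att = attest_record(record, signer_id="investigator_42")
assert verify_copy(dict(record), att)                      # faithful copy verifies
assert not verify_copy({**record, "sbp_mmhg": 120}, att)   # altered copy fails
```

Because verification is deterministic and machine-checkable, a certified copy defined this way needs no print-equivalent attestation: any party holding the attestation can independently confirm authenticity.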

4. Redefine Computer Systems Validation (CSV) and the Concept of Validation
Relevant Regulation: FDA 21 CFR Part 11 §11.10(a), §11.300; aligned with GAMP 5 Second Edition
Current Problem:
The traditional concept of Computer Systems Validation (CSV) is grounded in waterfall-era development models, assuming static systems with infrequent changes. It does not reflect modern software delivery, particularly the norm of continuous release cycles through agile or DevOps pipelines. Furthermore, existing definitions of validation often overlook the distinct layers of an application. Configuration, in particular, now shapes how an application functions in practice: in most modern platforms, it is the configuration layer, not the underlying software, that defines data structures, workflow logic, data behaviour, and compliance controls. A modern CSV process should cover the software, the configuration of the application built on that software, and any licensee adaptations.
In addition, the term ‘validation’ itself is inconsistently applied. Regulatory language often implies that a system is simply ‘validated’—without stating validated to meet what. This leads to ambiguity in scope, expectations, and evidence.
Impact:
- Excessive effort is placed on validating software that is later reconfigured beyond the scope of the initial validation.
- Sponsors duplicate vendor validation without added assurance.
- Validation efforts focus on documentation over operational value.
- Audit findings can be inconsistent due to unclear definitions of what constitutes a validated state.
Proposed Change:
- Redefine validation to state that systems must be “validated to meet intended use”, as articulated through risk-based requirements.
- Adopt the GAMP 5 Second Edition distinction between:
  - Software validation (vendor-level);
  - Configuration validation (sponsor/site-specific);
  - User acceptance testing (operational context).
- Recognise continuous validation practices, including automated testing and modular release assurance, as compliant when properly documented.
Detailed changes are outlined in Part II: Updates to FDA 21 CFR Part 11 (Section D).
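As a minimal sketch of what “continuous validation” of the configuration layer could look like, the check below compares a deployed configuration against its validated baseline and blocks release on any deviation. All names and settings are hypothetical; in a real CI/CD pipeline the diff would be archived as validation evidence alongside the release record.

```python
APPROVED_BASELINE = {
    "edit_checks_enabled": True,
    "audit_trail": "full",
    "roles": ["investigator", "monitor", "data_manager"],
}

def diff_config(deployed: dict, baseline: dict) -> dict:
    """Return each setting whose deployed value deviates from the validated baseline."""
    return {
        key: {"baseline": expected, "deployed": deployed.get(key)}
        for key, expected in baseline.items()
        if deployed.get(key) != expected
    }

def validate_release(deployed: dict) -> bool:
    """Gate a release: pass only when the deployed configuration
    matches the baseline that was validated for intended use."""
    deviations = diff_config(deployed, APPROVED_BASELINE)
    # In practice, archive `deviations` as audit-trailed validation evidence.
    return not deviations

assert validate_release(dict(APPROVED_BASELINE))
assert not validate_release({**APPROVED_BASELINE, "audit_trail": "partial"})
```

Running such a check on every release turns validation from a one-off documentation exercise into an automated, repeatable assurance step, which is the intent of the “validated to meet intended use” framing above.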

5. Introduce HyperTrial Designation
Relevant Policy Reference: Not yet formally codified — proposed for new regulatory framework
Current Problem:
All clinical trials are subjected to the same procedural and documentation burdens, regardless of their digital maturity, risk profile, or operational agility.
During the COVID-19 pandemic, regulators demonstrated an ability to support trials with unprecedented speed and flexibility—leveraging real-time data review, adaptive protocols, and remote monitoring. These methods were essential to delivering safe, effective vaccines under intense time pressure. Yet, no enduring framework exists to sustain such innovation in standard regulatory practice.
Impact:
Sponsors with high digital readiness are held back by outdated processes and slower timelines, while innovation in trial design and conduct is suppressed.
Proposed Change:
Create a new designation—HyperTrial—for studies meeting criteria in technology maturity, risk readiness, and data assurance. This designation would allow use of:
- Exclusively digital documentation and signatures
- Real-time data exchange with regulators
- Agile, milestone-based regulatory review cycles
- Proportionate, risk-based inspection models
- Pre-qualified digital platforms and Trusted Third Parties (TTPs)
- Expanded use of continuous statistical monitoring and adaptive protocol governance

6. Enable Adaptive and Continuous Statistical Decision-Making in Biometrics
Current Problem:
Traditional statistical review models are structured around fixed, milestone-driven analysis plans that require an interim or full database lock before any substantive interpretation of results occurs. This approach reflects limitations in historical data aggregation, cleaning, and integration practices rather than the capabilities of modern technology.
Impact:
Sponsors and regulators are often forced to wait until the end of a study before detecting trends or drawing inferences, even when reliable patterns may be visible far earlier. This delays opportunities to adapt study design or terminate failing studies, increasing patient burden and sponsor cost.
Proposed Change:
Introduce regulatory support for the continuous, automated statistical assessment of clinical trial data against a pre-specified statistical analysis plan. Where data quality and completeness thresholds are met, such interim analytics should be permitted to:
- Indicate early signals of efficacy or futility;
- Trigger pre-defined adaptations, such as dosage adjustments, cohort expansion, or early termination;
- Reduce reliance on the binary outcome of database lock as the singular point of analysis.
This shift would reflect growing confidence in real-time data pipelines and risk-based data monitoring strategies, allowing faster, more patient-centric decision-making without compromising scientific rigour or regulatory oversight.
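To illustrate the mechanics, the sketch below evaluates accumulating two-arm responder data against pre-specified efficacy and futility boundaries at each data refresh. It is a deliberately simplified example with illustrative thresholds; a real pre-specified analysis plan would use formal group-sequential methods (e.g., alpha-spending boundaries) rather than the fixed constants assumed here.

```python
from math import sqrt

# Pre-specified interim boundaries (illustrative values, not recommendations)
EFFICACY_Z = 2.8
FUTILITY_Z = 0.2

def interim_signal(x_trt: int, n_trt: int, x_ctl: int, n_ctl: int) -> str:
    """Two-proportion z-test on accumulating responder counts, compared
    against pre-specified boundaries. Returns the pre-defined action."""
    p_trt, p_ctl = x_trt / n_trt, x_ctl / n_ctl
    p_pool = (x_trt + x_ctl) / (n_trt + n_ctl)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_trt + 1 / n_ctl))
    z = (p_trt - p_ctl) / se
    if z >= EFFICACY_Z:
        return "efficacy"     # trigger pre-defined success pathway
    if z <= FUTILITY_Z:
        return "futility"     # trigger pre-defined early termination
    return "continue"         # keep enrolling; re-assess at next refresh

assert interim_signal(60, 100, 40, 100) == "efficacy"
assert interim_signal(41, 100, 40, 100) == "futility"
assert interim_signal(50, 100, 40, 100) == "continue"
```

The key regulatory point is that the boundaries and the resulting actions are fixed in advance in the statistical analysis plan; the pipeline merely evaluates them continuously as cleaned data accumulates, with each evaluation audit-trailed.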
Part II – Proposed Updates to FDA 21 CFR Part 11
Introduction
The foundation of digital compliance in clinical research—FDA 21 CFR Part 11—has remained largely unchanged since its adoption in 1997. While guidance documents have helped interpret its application, the underlying regulation no longer reflects the technological realities or operational models of modern clinical trials.
In keeping with the principles of the Administrative Procedure Act, formal amendments to 21 CFR Part 11 must be proposed and adopted through notice-and-comment rulemaking. The recommendations outlined below are intended as a concrete contribution to that process, offering structured, implementable updates to Part 11 that align with current best practices in cloud computing, data integrity, and agile software delivery.
Rather than supplementing legacy requirements with external guidance or workaround interpretations, these proposals are designed to replace outdated provisions with clear, technology-agnostic standards that preserve regulatory intent while enabling innovation.
The goal is not to reduce regulatory oversight—but to recalibrate it toward risk-based assurance, cross-stakeholder access, and real-time data governance. Each section below maps to existing clauses within Part 11 and provides language or conceptual direction for how they might be updated through formal regulatory change. Non-exhaustive references to related procedures from ICH, NIST or other bodies are also included.
A. Multi-Tenant Cloud Architectures (Related to §11.10, §11.30)
Update Part 11 to define conditions under which multi-tenant Software as a Service (SaaS) systems may be used in regulated environments. Specify that logical data segregation, robust role-based access control, and validated audit logging meet the same integrity standards as physically siloed environments (FDA ER/ES Guidance, 2017).
B. Identity and Access Management (Related to §11.100, §11.200)
Allow identity federation through enterprise IAM platforms, using protocols such as SAML, OAuth2, and OpenID Connect. Clarify that multi-factor authentication, hardware tokens, or certificate-based access may replace legacy password schemes, provided that access is traceable and role-appropriate (NIST SP 800-63B).
C. Certified Copy and Digital Signatures (Related to §11.10(e), §11.50, §11.70)
Redefine “certified copy” to include digitally signed documents that meet defined criteria for cryptographic authenticity and traceability. Remove the presumption of printed output as the gold standard. Permit automated digital signing within validated systems (FDA ER/ES, 2017).
D. Validation of Agile and Continuous Delivery Systems (Related to §11.10(a), §11.300)
Expand validation guidance to cover DevOps models including continuous integration and delivery (CI/CD). Support use of GAMP 5 Second Edition principles, automated test suites, and modular risk assessments as acceptable validation strategies (FDA Computer Software Assurance for Production and Quality System Software (CSA) Draft, 2022).
E. Data Residency and Ownership (Related to §11.10(c), §11.30)
Clarify that compliance is maintained when data is stored across borders or cloud regions, as long as encryption, access control, and regulatory access are preserved. Define logical data ownership based on governance rather than physical possession.
F. Trusted Third Parties (Related to §11.10(a), §11.300)
Introduce a regulatory framework for certification and periodic audit of third-party platforms acting as custodians of regulated trial data. Enable shared use of platforms across sponsors with consistent validation and oversight.
G. API-Based Interoperability (Related to §11.10(b), §11.50)
Acknowledge version-controlled, secure API interfaces as acceptable methods for data transfer, provided audit logs track data creation, modification, and access. Permit real-time system integration across validated environments.
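The audit-logging condition above can be made concrete with a small sketch: an API-facing record store in which every create, modification, and access appends an entry to an immutable log. The class and field names are hypothetical, and a real platform would also persist the log and enforce role-based access.

```python
from datetime import datetime, timezone

class AuditedStore:
    """Minimal sketch of an API-facing record store where every create,
    modify, and access operation is captured in an append-only audit log."""

    def __init__(self):
        self.records = {}
        self.audit_log = []

    def _log(self, action: str, record_id: str, actor: str) -> None:
        self.audit_log.append({
            "action": action,
            "record_id": record_id,
            "actor": actor,
            "at": datetime.now(timezone.utc).isoformat(),
        })

    def put(self, record_id: str, payload: dict, actor: str) -> None:
        action = "modify" if record_id in self.records else "create"
        self.records[record_id] = payload
        self._log(action, record_id, actor)

    def get(self, record_id: str, actor: str) -> dict:
        self._log("access", record_id, actor)
        return self.records[record_id]

store = AuditedStore()
store.put("r1", {"visit": "Week 4"}, actor="site_a")
store.put("r1", {"visit": "Week 8"}, actor="site_a")
store.get("r1", actor="monitor_1")
assert [e["action"] for e in store.audit_log] == ["create", "modify", "access"]
```

Because the log is written by the same validated business logic that serves the API, the audit trail travels with the data rather than depending on infrastructure-level controls, which is the assurance Part 11 is meant to provide.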
H. Support for Decentralised and Patient-Facing Technologies (Related to §11.10, §11.300)
Align Part 11 with FDA’s Decentralised Clinical Trials guidance. Include provisions for remote identity verification, eConsent via mobile platforms, and patient-generated health data from BYOD devices. Ensure these systems are held to appropriate validation standards.
I. Risk-Based Documentation (Related to §11.10, §11.300)
Allow for scaled documentation requirements based on system criticality. Accept modular validation strategies where low-risk systems may use templated documentation and automated testing rather than traditional validation scripts.
J. Regulatory Harmonisation (Related to §11.10, §11.300)
Propose formal alignment of Part 11 with international standards, including ICH E6(R3), ISO/IEC 27001, and GAMP 5. Include references to mutually recognised certification schemes to reduce duplicate compliance efforts in global trials.
K. Continuous Statistical Monitoring and Adaptive Analysis (Related to §11.10, §11.50, §11.70)
Amend Part 11 to explicitly permit and encourage the continuous statistical evaluation of clinical trial data against a pre-specified statistical analysis plan. Recognise validated, automated pipelines that support ongoing data aggregation, cleaning, and analytics, enabling:
- Early identification of statistically meaningful trends;
- Triggering of adaptive mechanisms pre-approved in the protocol (e.g., sample size re-estimation, early termination, cohort adjustment);
- Documented, audit-trailed decision-making using risk-based thresholds.
This change would align Part 11 with the intent of adaptive design frameworks (e.g., FDA guidance on adaptive clinical trials and ICH E9(R1) on estimands and sensitivity analysis) and support real-time evidence generation without compromising data integrity or regulatory transparency.
Closing Reflection
These proposals are not a rejection of regulation—they are a blueprint for intelligent, adaptive oversight in an era of digital transformation. By modernising outdated frameworks and embracing structured innovation, we can improve compliance, reduce inefficiencies, and ultimately bring safer, higher-quality clinical trials to patients faster.
References
- FDA (2013). Guidance for Industry: Electronic Source Data in Clinical Investigations.
- FDA (2003). Part 11, Electronic Records; Electronic Signatures — Scope and Application.
- FDA (2017). Use of Electronic Records and Electronic Signatures in Clinical Investigations.
- FDA (2022). Computer Software Assurance for Production and Quality System Software – Draft Guidance.
- ICH (2025). E6(R3) Good Clinical Practice. Adopted January 2025; effective from July 2025.
- ICH (2020). E9(R1) Statistical Principles for Clinical Trials – Addendum.
- EMA (2023). Guideline on Computerised Systems and Electronic Data in Clinical Trials – Draft.
- GAMP 5 Second Edition (2022). A Risk-Based Approach to Compliant GxP Computerized Systems.
- FDA (2023). Decentralized Clinical Trials for Drugs, Biological Products, and Devices – Draft Guidance.
- FDA (2020). Conduct of Clinical Trials of Medical Products during the COVID-19 Public Health Emergency.
- FDA (2019). Adaptive Designs for Clinical Trials of Drugs and Biologics – Guidance for Industry.
- FDA (2018). Data Integrity and Compliance With Drug CGMP – Guidance for Industry.
- NIST (2020). Special Publication 800-63B: Digital Identity Guidelines.
- FDA (2021–present). Real-Time Oncology Review (RTOR) Pilot Program.
Prepared by: Doug Bain, Consulting Partner, ClinFlo Consulting Limited.
Proofreading and edits:
- Tony Hewer, author of the PhUse 2012 paper New Approaches to Validation for SaaS-based Clinical Computing Solutions in the Cloud
- Leif Puddefoot
Additional thanks to Ron Fitmartin, former Senior Informatics Advisor at FDA / CBER / Data Standards Staff, for support and input, and to Ben Young for technology-related input.
Date: June 2025