Getting CCDM certification is an important step in your career, but preparing for it can feel challenging. At SkillCertExams, we know that having the right resources and support is essential for success. That’s why we created a platform with everything you need to prepare for CCDM and reach your certification goals with confidence.
Your Journey to Passing the Certified Clinical Data Manager CCDM Exam
Whether this is your first step toward earning the Certified Clinical Data Manager CCDM certification, or you're returning for another round, we’re here to help you succeed. We hope this exam challenges you, educates you, and equips you with the knowledge to pass with confidence. If this is your first study guide, take a deep breath—this could be the beginning of a rewarding career with great opportunities. If you’re already experienced, consider taking a moment to share your insights with newcomers. After all, it's the strength of our community that enhances our learning and makes this journey even more valuable.
Why Choose SkillCertExams for CCDM Certification?
Expert-Crafted Practice Tests
Our practice tests are designed by experts to reflect actual CCDM exam questions. We cover a wide range of topics and exam formats to give you the best possible preparation. With realistic, timed tests, you can simulate the real exam environment and improve your time management skills.
Up-to-Date Study Materials
The world of certifications is constantly evolving, which is why we regularly update our study materials to match the latest exam trends and objectives. Our resources cover all the essential topics you’ll need to know, ensuring you’re well-prepared for the exam's current format.
Comprehensive Performance Analytics
Our platform not only helps you practice but also tracks your performance in real time. By analyzing your strengths and areas for improvement, you’ll be able to focus your efforts on what matters most. This data-driven approach increases your chances of passing the CCDM exam on your first try.
Learn Anytime, Anywhere
Flexibility is key when it comes to exam preparation. Whether you're at home, on the go, or taking a break at work, you can access our platform from any device. Study whenever it suits your schedule, without any hassle. We believe in making your learning process as convenient as possible.
Trusted by Thousands of Professionals
Over 10,000 professionals worldwide trust SkillCertExams for their certification preparation. Our platform and study materials have helped countless candidates pass the CCDM exam, and we’re confident they will help you too.
What You Get with SkillCertExams for CCDM
Realistic Practice Exams: Our practice tests are designed to mirror the real CCDM exam. With a variety of practice questions, you can assess your readiness and focus on key areas to improve.
Study Guides and Resources: In-depth study materials that cover every exam objective, keeping you on track to succeed.
Progress Tracking: Monitor your improvement with our tracking system that helps you identify weak areas and tailor your study plan.
Expert Support: Have questions or need clarification? Our team of experts is available to guide you every step of the way.
Achieve Your CCDM Certification with Confidence
Certification isn’t just about passing an exam; it’s about building a solid foundation for your career. SkillCertExams provides the resources, tools, and support to ensure that you’re fully prepared and confident on exam day. Our study materials help you unlock new career opportunities and enhance your skillset with the CCDM certification.
Ready to take the next step in your career? Start preparing for the SCDM CCDM exam with SkillCertExams today, and join the ranks of successful certified professionals!
SCDM CCDM Sample Questions
Question # 1
On a dose escalation study, the Data Manager notices one site has a much higher number of queries
than other sites and most are older than 30 days. The Data Safety Monitoring Board will meet in
three weeks. What should the Data Manager providing CRO oversight do?
A. Notify the CRO's Clinical Leader about the concerns
B. Call the site directly and ask the study coordinator about the concerns
C. Consult the CRO's Lead Data Manager and the CRO's Project Leader
D. Ignore it for now and check back next week
Answer: C
Explanation:
The correct action is to consult the CRO's Lead Data Manager and the CRO's Project Leader (Option C) to ensure the issue is addressed through the appropriate oversight and escalation process.
According to the GCDMP (Chapter: Project Management and Communication), when a sponsor Data Manager identifies significant data management issues under CRO oversight, such as aging queries or site performance disparities, communication must follow the established governance and escalation pathway defined in the Scope of Work (SOW) and Data Management Plan (DMP).
Directly contacting the site (Option B) bypasses the CRO's chain of command and violates
communication protocols. Notifying only the Clinical Leader (Option A) is insufficient, and ignoring
the issue (Option D) jeopardizes the Data Safety Monitoring Board (DSMB) review timeline.
Therefore, Option C ensures a documented, collaborative approach to problem resolution within the
contractual oversight structure.
Reference (CCDM-Verified Sources):
SCDM Good Clinical Data Management Practices (GCDMP), Chapter: Project Management and Communication, Section 7.1 – Oversight of CRO Data Management Activities
ICH E6 (R2) GCP, Section 5.2 – Contract Research Organization Responsibilities
FDA Guidance for Industry: Oversight of Clinical Investigations – Sponsor and CRO Roles and Communication Pathways
Question # 2
In reviewing the adverse events for a subject, a data manager notices one recorded as "worsening of
migraine." After reviewing the rest of the adverse events and finding no other migraine recordings,
what is the data manager's next step?
A. Look for any adverse event instance of headache and assume the events are similar.
B. Query the site for the first adverse event occurrence of migraine.
C. Check the medical history for recording of a history of migraines.
D. Query the site for more information on the adverse event, "worsening of migraine."
Answer: D
Explanation:
When a data inconsistency arises, such as a record of "worsening of migraine" without prior documentation of a migraine episode, the Data Manager should query the site for clarification (Option D).
According to the GCDMP (Chapter: Data Validation and Cleaning), data managers must raise a
clarification query whenever data appear incomplete, inconsistent, or ambiguous. The site must
confirm whether "worsening of migraine" refers to a new event or an exacerbation of a preexisting
condition. This clarification ensures accurate safety reporting and appropriate medical coding (e.g., MedDRA classification).
Checking the medical history (Option C) may help but does not resolve the inconsistency. Assuming a
relationship (Option A or B) without verification would violate Good Clinical Data Management
Practice and potentially misrepresent the adverse event.
Reference (CCDM-Verified Sources):
SCDM Good Clinical Data Management Practices (GCDMP), Chapter: Data Validation and Cleaning, Section 6.3 – Query Generation and Resolution
ICH E2A – Clinical Safety Data Management: Definitions and Standards for Expedited Reporting, Section II – Data Clarification Requirements
FDA Guidance for Industry: Computerized Systems Used in Clinical Investigations – Data Query Management
Question # 3
What significant difference is there in the DM role when utilizing an EDC application?
A. Data updates are implemented by the sites
B. Database validation is not required
C. Metrics generation is required
D. Tracking of eCRFs is a monitor's responsibility
Answer: A
Explanation:
The most significant difference in the Data Manager's role when using an Electronic Data Capture (EDC) system is that data updates are implemented directly by site personnel (Option A).
According to the GCDMP (Chapter: Electronic Data Capture Systems), EDC technology shifts
responsibility for data entry and correction from the sponsor or CRO to the investigator site, enabling
real-time data entry and validation. This eliminates the need for double entry or remote data
transcription, allowing Data Managers to focus on system validation, query management, and data
quality oversight rather than physical data handling.
However, the EDC system still requires full validation (contrary to Option B). Metrics generation
(Option C) and CRF tracking (Option D) are important but not unique to EDC-based workflows.
Thus, the correct answer is Option A (data updates are implemented by the sites), reflecting the most fundamental operational shift introduced by EDC systems.
Reference (CCDM-Verified Sources):
SCDM Good Clinical Data Management Practices (GCDMP), Chapter: Electronic Data Capture (EDC) Systems, Section 4.1 – Role of the Data Manager in EDC
ICH E6 (R2) GCP, Section 5.5.3 – Electronic Data Entry and Responsibilities
FDA 21 CFR Part 11 – Electronic Records and Signatures: Data Entry Responsibilities
Question # 4
Which protocol section most concisely conveys timing of data collection throughout a study?
A. Study endpoints section
B. Study schedule of events
C. Protocol synopsis
D. ICH essential documents
Answer: B
Explanation:
The Study Schedule of Events (SoE) section in the protocol is the most concise and comprehensive
representation of the timing of data collection throughout a study.
According to the Good Clinical Data Management Practices (GCDMP, Chapter: Data Management
Planning and Study Start-up) and ICH E6 (R2) GCP, the SoE outlines what assessments, procedures,
and data collections occur at each study visit (e.g., screening, baseline, treatment visits, follow-up).
This table is a foundational tool for CRF design, database structure, and edit-check development,
ensuring alignment between the protocol and data management systems. While the study endpoints section (A) defines what is measured and the protocol synopsis (C)
summarizes the design, only the schedule of events (B) specifies when data collection occurs for each
parameter. The ICH essential documents (D) pertain to regulatory documentation, not study visit
timing.
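As a sketch of how an SoE table drives database setup, the visits-by-assessments grid can be represented as a simple mapping from which the expected CRF pages per visit are derived. The visit and assessment names below are invented for illustration; a real SoE comes from the protocol.

```python
# Toy Schedule of Events: each visit maps to the assessments collected there.
# Names are illustrative only, not from any real protocol.
schedule_of_events = {
    "screening": ["informed_consent", "medical_history", "vitals"],
    "baseline":  ["vitals", "labs", "efficacy_scale"],
    "week_4":    ["vitals", "efficacy_scale", "adverse_events"],
}

def forms_expected_at(visit):
    """CRF pages the database should expect for a given visit."""
    return schedule_of_events.get(visit, [])
```

A CRF designer or database builder can then walk this structure to generate page definitions and visit-level edit checks, keeping the database aligned with the protocol.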
Reference (CCDM-Verified Sources):
SCDM Good Clinical Data Management Practices (GCDMP), Chapter: Data Management Planning and Study Start-up, Section 4.1 – Using the Schedule of Events for Database Design
ICH E6 (R2) GCP, Section 6.3 – Trial Design and Schedule of Assessments
FDA Guidance for Industry: Protocol Design and Data Collection Standards
Question # 5
A Data Manager is designing a report to facilitate discussions with sites regarding late data. Which is the most important information to display on the report to encourage sites to provide data?
A. Number of forms entered in the last week
B. Expected versus actual forms entered
C. List of outstanding forms
D. Total number of forms entered to date
Answer: C
Explanation:
In managing site data timeliness, the most actionable and effective tool is a report listing all
outstanding (missing or incomplete) CRFs.
According to GCDMP (Chapter: Communication and Study Reporting), Data Managers must provide
site-level performance reports highlighting:
Outstanding CRFs not yet entered,
Unresolved queries, and
Pending data corrections.
Such reports help sites prioritize and address data gaps efficiently.
Options A and D are historical metrics without actionable context.
Option B gives a general overview but lacks specific site-level actionability.
Hence, option C (List of outstanding forms) provides the clearest and most motivating feedback to
sites for timely data entry and query resolution.
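The contrast between option C and the purely historical metrics can be sketched in a few lines of Python. The site IDs and form names are invented for illustration:

```python
# Sketch: the "list of outstanding forms" (option C) derived from the same
# inputs that feed the historical counts in options A, B, and D.
expected = {
    "site_101": {"visit1_vitals", "visit1_labs", "visit2_vitals"},
    "site_102": {"visit1_vitals", "visit1_labs"},
}
entered = {
    "site_101": {"visit1_vitals"},
    "site_102": {"visit1_vitals", "visit1_labs"},
}

def outstanding_forms(site):
    """Forms expected at a site but not yet entered: a concrete to-do list."""
    return sorted(expected[site] - entered[site])
```

A total count (option D) tells a site how much it has done; the outstanding-forms list tells it exactly which pages to enter next, which is why it is the more motivating report content.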
Reference (CCDM-Verified Sources):
SCDM GCDMP, Chapter: Communication and Study Reporting, Section 5.3 – Data Timeliness and Reporting Metrics
ICH E6(R2) GCP, Section 5.1.1 – Sponsor Oversight and Data Communication Requirements
FDA Guidance for Industry: Computerized Systems Used in Clinical Investigations, Section 6.5 – Site-Level Data Timeliness Reporting
Question # 6
For clinical investigational sites on an EDC trial, which of the following archival options allows
traceability of changes made to data?
A. Storing the computer used at the clinical investigational site
B. Paper copies of the source documents
C. PDF images of the final eCRF screens for each patient
D. ASCII files of the site's data and related audit trails
Answer: D
Explanation:
Regulatory agencies such as the FDA and ICH require that electronic data be retained in a format that
preserves audit trails and traceability.
While PDF images (option C) provide a static representation of data, they do not preserve the
underlying audit trail (i.e., who changed what, when, and why). The ASCII data files with
corresponding audit trails (option D) provide complete transparency and comply with 21 CFR Part 11
and GCDMP archival standards.
Option A (storing computers) is unnecessary and impractical, and the paper source documents in Option B are site records, not system archives.
Hence, option D is correct: ASCII data files with audit trails meet traceability and compliance standards.
Reference (CCDM-Verified Sources):
SCDM GCDMP, Chapter: Database Lock and Archiving, Section 5.4 – Archival Formats and Audit Trail Retention
ICH E6(R2) GCP, Section 5.5.3 – Data Integrity, Audit Trails, and Record Retention
FDA 21 CFR Part 11 – Electronic Records; Audit Trail and Retention Requirements
Question # 7
A Data Manager is designing a CRF for a study for which the efficacy data are not covered by the current SDTM domains. What should the Data Manager do first?
A. Use controlled terminology covering the needed concepts
B. Work with the study team to define new data elements
C. Search for relevant data element standards
D. Advise the study team not to collect the data
Answer: C
Explanation:
When existing SDTM (Study Data Tabulation Model) domains do not cover specific efficacy data, the
best practice is to first search for relevant data element standards that may be available through
CDISC CDASH (Clinical Data Acquisition Standards Harmonization) or other recognized industry
standards.
Per GCDMP (Chapter: Standards and Data Integration), Data Managers must ensure that new CRF
elements are consistent with standardized definitions, controlled terminology, and data models to
support interoperability, future analysis, and regulatory submission.
If no suitable standard exists, only then should the Data Manager collaborate with the study team to define new elements, but a search for existing standards always comes first.
Thus, option C is correct: searching for relevant data element standards first ensures alignment with CDISC best practices and regulatory expectations.
Reference (CCDM-Verified Sources):
SCDM GCDMP, Chapter: Standards and Data Integration, Section 5.1 – Use of CDISC Standards in CRF Design
CDISC CDASH Implementation Guide, Section 4.1 – Standardization of Data Collection Fields
FDA Study Data Technical Conformance Guide (SDTCG), Section 2.4 – Use of Standard and Custom Domains
Question # 8
ePRO data are collected for a study using study devices given to subjects. Which is the most
appropriate quality control method for the data?
A. Programmed edit checks to detect out of range values after submission to the database
B. Manual review of data by the site study coordinator at the next visit
C. Data visualizations to look for site-to-site variation
D. Programmed edit checks to detect out of range values upon data entry
Answer: D
Explanation:
When electronic patient-reported outcomes (ePRO) devices are used, data are captured directly by
subjects through validated devices and transmitted electronically to the study database. To ensure
real-time data quality control, programmed edit checks should be implemented at the point of data entry, that is, as subjects input data into the device.
According to Good Clinical Data Management Practices (GCDMP, Chapter: Data Validation and
Cleaning), front-end programmed edit checks are the optimal method to prevent entry of invalid or
out-of-range values in ePRO systems. This helps maintain data accuracy at the source, minimizing
downstream queries and data cleaning workload.
Options A and B involve post-submission or manual review, which is less efficient and inconsistent
with the principle of first-pass data validation. Option C (visualization) is a valuable secondary QC
method for trends, but not for immediate data validation.
Therefore, option D is correct: programmed edit checks upon data entry ensure immediate validation and higher data integrity.
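A front-end range check of the kind described here can be sketched as a small validation function, run at the moment the subject submits a value. The field names and limits below are invented for illustration, not taken from any real ePRO device:

```python
# Illustrative front-end edit check for ePRO fields: reject out-of-range
# values at the point of entry, before anything reaches the database.
# Field names and limits are assumptions for this sketch.
RANGES = {
    "pain_score": (0, 10),    # e.g., a 0-10 numeric rating scale
    "hours_slept": (0, 24),   # e.g., a sleep-diary item
}

def validate_entry(field, value):
    """Return (accepted, message); intended to run as the subject submits a value."""
    if field not in RANGES:
        return False, f"unknown field: {field}"
    low, high = RANGES[field]
    if not (low <= value <= high):
        return False, f"{field} must be between {low} and {high}"
    return True, "accepted"
```

Because the check fires before submission, an out-of-range value such as a pain score of 12 is rejected at the source rather than generating a query weeks later.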
Reference (CCDM-Verified Sources):
SCDM GCDMP, Chapter: Data Validation and Cleaning, Section 5.3 – Automated Edit Checks and Front-End Validation
ICH E6(R2) GCP, Section 5.5.3 – Computerized System Controls and Validation
FDA Guidance for Industry: Electronic Source Data in Clinical Investigations (2013), Section 6 – Real-Time Data Quality Control
Question # 9
Which action has the most impact on the performance of a relational database system?
A. Entering data into the database from CRFs
B. Loading a large lab data file into the database
C. Executing a properly designed database query
D. Making updates to data previously entered into the database
Answer: B
Explanation:
In a relational database system used in clinical data management, performance refers to how
efficiently the system processes transactions, retrieves data, and handles large volumes of
information without delay or data integrity issues. Among the listed options, loading a large lab data
file into the database (Option B) has the most significant impact on database performance.
According to the Good Clinical Data Management Practices (GCDMP, Chapter on Database Design
and Build), the bulk data load process, such as importing large external datasets (e.g., central lab data, ECG results, or imaging metadata), can be computationally intensive. This process engages the database's input/output (I/O) subsystem, indexing mechanisms, and transaction logs
simultaneously, often locking tables temporarily and consuming significant memory and processing
resources.
Unlike standard CRF data entry (Option A) or record updates (Option D), which are incremental and
typically processed in smaller transactional batches, bulk loading operations handle thousands or
millions of rows at once. If not optimized (e.g., via staging tables, indexing strategies, or commit
frequency control), such operations can degrade system performance, slow down concurrent user
access, and increase the risk of transaction failure.
Executing a properly designed query (Option C) can also be resource-intensive depending on data volume and join complexity, but when queries are properly optimized (using indexed keys, efficient
SQL joins, and selective retrieval), their impact is generally controlled and transient compared to
large data imports.
Therefore, as outlined in the GCDMP Database Design and Build and FDA Computerized Systems
Guidance, the most performance-impacting activity in a relational database is bulk loading large
external datasets, making Option B the correct answer.
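One of the mitigations mentioned above, controlling commit frequency, can be illustrated with a minimal bulk load using Python's built-in sqlite3 module. The table and column names are invented for this sketch; the point is that transaction and index overhead is paid once per commit rather than once per row:

```python
import sqlite3

# Sketch of a bulk lab-data load done in a single transaction.
# Schema and values are illustrative only.
rows = [("SUBJ-%03d" % i, "ALT", 25 + i % 40) for i in range(10_000)]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE lab (subject TEXT, test TEXT, result REAL)")

with conn:  # one transaction for the entire batch
    conn.executemany("INSERT INTO lab VALUES (?, ?, ?)", rows)

count = conn.execute("SELECT COUNT(*) FROM lab").fetchone()[0]
```

Committing each of the 10,000 rows individually would force the log and index work to repeat per row, which is exactly the kind of load pattern that degrades concurrent access in a production clinical database.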
Reference (CCDM-Verified Sources):
Society for Clinical Data Management (SCDM), Good Clinical Data Management Practices (GCDMP), Chapter: Database Design and Build, Section 6.7 – Database Performance and Optimization
FDA Guidance for Industry: Computerized Systems Used in Clinical Investigations, Section 6 – System Performance and Data Handling Efficiency
ICH E6 (R2) Good Clinical Practice, Section 5.5 – Data Handling and Record Integrity
CDISC Operational Data Model (ODM) Implementation Guide – Bulk Data Transfer and Validation Considerations
Question # 10
A Data Manager is asked to manage SOPs for a department. Given equal availability of the following
systems, which of the following is the best choice for managing the organizational SOPs?
A. Document management system
B. Customized Excel spreadsheet
C. Learning management system
D. Existing paper filing system
Answer: A
Explanation:
The best choice for managing Standard Operating Procedures (SOPs) in a compliant and auditable
manner is a Document Management System (DMS).
According to the GCDMP (Chapter: Regulatory Requirements and Compliance) and ICH E6 (R2), SOPs
must be version-controlled, securely stored, retrievable, and auditable. A validated DMS supports
controlled access, document lifecycle management (draft, review, approval, and archival), and electronic audit trails, ensuring full compliance with FDA 21 CFR Part 11 and Good Documentation
Practices (GDP).
While Learning Management Systems (C) track training, they are not intended for document control.
Spreadsheets (B) and paper systems (D) cannot provide adequate version tracking, access security, or
audit capability required for regulatory inspection readiness.
Reference (CCDM-Verified Sources):
SCDM Good Clinical Data Management Practices (GCDMP), Chapter: Regulatory Requirements and Compliance, Section 5.2 – SOP Management and Document Control
ICH E6 (R2) GCP, Section 5.5.3 – Document and Record Management
FDA 21 CFR Part 11 – Electronic Records and Signatures, Section 11.10 – System Validation and Document Controls
Question # 11
For ease of data processing, the study team would like the database codes for a copyrighted rating
scale preprinted on the CRF. What is the most critical task that the CRF designer must do to ensure
the data collected on the CRF for the scale are reliable and will support the results of the final
analysis?
A. Consult the independent source and determine database codes will not influence subject responses.
B. Consult the study statistician regarding the change and determine that database codes will not influence the analysis.
C. Consult the independent source of the rating scale for approval and document that continued validity of the tool is not compromised.
D. Complete the requested changes to the instrument and ensure the correct database codes are associated with the appropriate responses.
Answer: C
Explanation:
When using a copyrighted or validated rating scale (e.g., Hamilton Depression Scale, Visual Analog
Pain Scale), any modification to the original instrument, including preprinting database codes on the
CRF, must be approved by the instrument's owner or licensing authority to ensure the validity and
reliability of the instrument are not compromised.
According to the GCDMP (Chapter: CRF Design and Data Collection), validated rating scales are
psychometrically tested tools. Any visual or structural modification (such as adding codes, changing
layout, or rewording questions) can invalidate prior validation results. Therefore, the CRF designer
must consult the independent source (copyright holder) for approval and document that the validity
of the tool remains intact.
Merely consulting statisticians (option B) or verifying database alignment (option D) does not ensure
compliance. Thus, Option C ensures scientific and regulatory integrity.
Reference (CCDM-Verified Sources):
SCDM Good Clinical Data Management Practices (GCDMP), Chapter: CRF Design and Data Collection, Section 6.1 – Use of Validated Instruments and Rating Scales
ICH E6 (R2) GCP, Section 5.5.3 – Validation of Instruments and Data Capture Tools
FDA Guidance for Industry: Patient-Reported Outcome Measures – Use in Medical Product Development to Support Labeling Claims, Section 4 – Instrument Modification and Validation
Question # 12
During a database audit, it was determined that there were more errors than expected. Who is
responsible for assessing the overall impact on the analysis of the data?
A. Data Manager
B. Statistician
C. Quality Auditor
D. Investigator
Answer: B
Explanation:
The Statistician is responsible for assessing the overall impact of data errors on the analysis and study
results.
According to the Good Clinical Data Management Practices (GCDMP, Chapter: Data Quality
Assurance and Control) and ICH E9 (Statistical Principles for Clinical Trials), while the Data Manager
ensures data accuracy and completeness through cleaning and validation, the Statistician determines
whether the observed data discrepancies are statistically significant or if they may affect the validity,
power, or interpretability of the study's outcomes.
The Quality Auditor (C) identifies and reports issues but does not quantify analytical impact. The
Investigator (D) is responsible for clinical oversight, not statistical assessment. Thus, after a database
audit, the Statistician (B) performs a formal evaluation to determine whether the magnitude and
nature of the errors could bias results or require reanalysis.
Reference (CCDM-Verified Sources):
SCDM Good Clinical Data Management Practices (GCDMP), Chapter: Data Quality Assurance and Control, Section 7.3 – Data Audit and Impact Assessment
ICH E9 – Statistical Principles for Clinical Trials, Section 3.2 – Data Quality and Analysis Impact Assessment
FDA Guidance for Industry: Computerized Systems Used in Clinical Investigations – Data Validation and Analysis Review
Question # 13
What additional task does the site study coordinator role perform when utilizing an EDC application
compared to paper CRFs?
A. Resolving queries
B. Data entry
C. Data curation
D. Medical record abstraction
Answer: B
Explanation:
In paper-based trials, site staff (e.g., study coordinators) record data manually on paper Case Report
Forms (CRFs), which are later transcribed by data entry personnel into an electronic database.
However, in EDC-based studies, the site coordinator is directly responsible for entering data into the
EDC system. This eliminates the need for centralized double data entry and shortens data cleaning
timelines.
The GCDMP (Chapter: Electronic Data Capture Systems) states that EDC systems shift certain tasks,
including data entry, initial query response, and source verification preparation, to the site level. Yet,
data entry remains the most significant additional responsibility compared to paper-based studies.
Option A (Query resolution) is performed in both EDC and paper-based systems.
Option C (Data curation) is typically a Data Management function.
Option D (Medical record abstraction) is part of source documentation, not specific to EDC.
Thus, option B (Data entry) is correct: it is the additional site coordinator duty unique to EDC environments.
Reference (CCDM-Verified Sources):
SCDM GCDMP, Chapter: Electronic Data Capture (EDC) Systems, Section 5.3 – Site Responsibilities and Workflow Changes
ICH E6(R2) GCP, Section 5.5.3 – Data Entry and Role Delegation in Computerized Systems
FDA Guidance for Industry: Computerized Systems Used in Clinical Investigations, Section 6.2 – Site-Level Data Entry Controls
Question # 14
It has been identified that ten adverse events were not reported in the trial prior to the database
lock. What action should be taken to determine the next step?
A. Get the AE data entered immediately so the database can be locked again.
B. Evaluate the potential effect of the omission on the validity of the safety and efficacy analysis.
C. Notify upper management immediately so the monitor can contact the site.
D. Check the data from all sites again before relocking the database.
Answer: B
Explanation:
When adverse events (AEs) are discovered after a database lock, the appropriate first step is to
evaluate the impact of the missing data on the integrity, safety analysis, and regulatory validity of the
study results.
According to GCDMP (Chapter: Data Quality Assurance and Control), any post-lock data discovery
requires a root cause assessment and impact analysis before deciding whether to unlock the
database. The key question is whether the missing AEs:
Affect primary safety endpoints,
Introduce bias in safety reporting, or
Alter efficacy conclusions.
Based on the assessment, the Data Management and Biostatistics teams determine if unlocking and
correction are justified. Simply entering data immediately (A) or repeating checks (D) without
analysis may violate data control procedures.
Hence, option B is correct: the first step is to assess the impact on data validity and analysis.
Reference (CCDM-Verified Sources):
SCDM GCDMP, Chapter: Data Quality Assurance and Control, Section 5.5 – Post-Lock Findings and Impact Assessment
ICH E6(R2) GCP, Section 5.1.1 – Quality Management and Risk Assessment
FDA Guidance for Industry: Computerized Systems Used in Clinical Investigations, Section 6.5 – Post-Lock Data Management
Question # 15
Which of the following tasks would be reasonable during a major upgrade of a clinical data
management system?
A. All of the data formats in the archive should be updated to new standards.
B. The ability to access and read the clinical data archive should be tested.
C. The data archive should be migrated to an offsite database server.
D. All of the case report forms should be pulled and compared to the archive.
Answer: B
Explanation:
During a major system upgrade, it is critical to verify that archived data remain accessible, readable,
and intact following the implementation.
According to the GCDMP (Chapter: Database Lock and Archiving), regulatory requirements such as 21
CFR Part 11 and ICH E6(R2) mandate that archived data must remain retrievable in a human-readable
format for the duration of retention (often years after study completion).
Therefore, as part of validation and verification testing, organizations must confirm that existing
archives can still be accessed using the upgraded system or compatible tools.
Option A: Updating archive formats could alter original data integrity (noncompliant).
Option C: Migration offsite is an IT infrastructure task, not directly tied to the upgrade process.
Option D: Comparing CRFs to archives is unnecessary unless data corruption is suspected.
Hence, option B (testing archive accessibility) is the correct and compliant approach.
Reference (CCDM-Verified Sources):
SCDM GCDMP, Chapter: Database Lock and Archiving, Section 5.4 – System Upgrades and Archive Validation
ICH E6(R2) GCP, Section 5.5.3 – System Validation and Data Retention
FDA 21 CFR Part 11 – Data Archiving, Retention, and Retrieval Requirements