We present an overview of the Centers for Quantitative Imaging Excellence (CQIE) program, which was initiated in 2010 to establish a resource of clinical trial-ready sites within the National Cancer Institute (NCI)-designated Cancer Centers (NCI-CCs) network. The intent was to ensure that imaging centers within the NCI-CC network were capable of conducting treatment trials with advanced quantitative imaging end points. We describe the motivations for establishing the CQIE program, the process used to initiate the network, the methods of site qualification for positron emission tomography, computed tomography, and magnetic resonance imaging, and the results of the evaluations over the subsequent 3 years.
Introduction
Background and Objectives
Advanced imaging methodologies play a pivotal role in cancer care, providing early detection of tumors and guidance of therapy as well as subsequent disease monitoring and surveillance. Advantages inherent in imaging assays include the ability to obtain spatially localized information over large volumes of tissue, or over the entire body, in contrast to the limited sampling available from biopsy-driven histopathology or from in vitro blood- or serum-based assays, which carry their own inherent drawbacks. In addition, in vivo imaging assays can provide repeated evaluations of a molecular target or of tumor metabolism over time, allowing therapy to be adapted without invasive procedures.
Continued progress in research and development of imaging agents, methodologies, and technologies holds promise for better cancer care, for example, through improved tumor detection and biological characterization. New imaging agents and approaches exploit various pathophysiologic characteristics of tumors, evaluating phenomena such as metabolism, proliferation, hypoxia, angiogenesis, essential signal pathway blockage(s), and other modifications of the tumor microenvironment. These refined imaging procedures have the potential to serve as surrogate or primary biomarkers in oncologic patient evaluation. In addition, the use of validated molecular imaging probes is critical both to the National Cancer Institute (NCI) drug discovery and development process and to the ongoing NCI commitment to further our understanding of cancer biology.
While imaging for patient care in clinical oncology practice is predominantly focused on tumor diagnostics, imaging within oncology clinical trials has expanded to assess tumor biologic characteristics, including pretreatment patient stratification and functional changes during and after therapeutic interventions. This expanded focus on oncologic imaging necessitates a more stringent quality management program to ensure that imaging devices are functioning appropriately and that properly controlled imaging acquisition protocols are being used at radiology sites performing imaging on patients in clinical trials. It is essential that the resulting imaging examinations are of sufficient quality to assess the desired end points and that the imaging assessment is performed in a consistent manner across sites. Finally, imaging data in clinical trials should be appropriately preserved for central analysis, regulatory documentation, and potential downstream secondary studies. As such, there has been increasing recognition of the need to standardize imaging protocols in clinical trials.
With these objectives in mind, the NCI Cancer Imaging Program (CIP) initiated the Centers for Quantitative Imaging Excellence (CQIE) program in 2010 to establish a resource of clinical trial-ready sites within the NCI-designated Cancer Centers (NCI-CCs) network, capable of conducting treatment trials that contain integral molecular and functional advanced quantitative imaging end points. The NCI-CC sites serve as centers for transdisciplinary, translational, and clinical research, and link cancer research to health service delivery systems outside the center via proactive dissemination programs. Such centers were optimal sites in which to support and promote advanced quantitative imaging for measurement of response.
Delays often occur in opening multicenter treatment trials with advanced imaging aims. Areas of delay include site selection based on qualification of advanced imaging capabilities, dissemination of relevant qualification standards for molecular and/or functional imaging modalities, and lack of coordinated collaboration among imaging and treatment/research teams at a site. These concerns were addressed by organizing and implementing the CQIE program under the auspices of NCI CIP, with administrative coordination and oversight from the American College of Radiology Imaging Network (ACRIN), NCI’s cooperative group with an exemplary history of performing large phase II and III studies to evaluate imaging methods and agents for enhanced cancer management. This report on CQIE progress details the experience and lessons learned in the course of qualifying approximately 60 NCI-CC sites across the nation; these centers have now demonstrated competence in key areas of advanced imaging within the modalities of static and dynamic positron emission tomography (PET), volumetric computed tomography (vCT) or magnetic resonance imaging (vMR), and dynamic contrast-enhanced MR imaging (DCE-MRI) of body and/or brain. The CQIE network is now a proven resource supporting the development and clinical implementation of quantitative imaging for measurement of response to therapy, with the potential to be extended to other NCI and National Institutes of Health programs that support advanced imaging within clinical trials under the NCI Division of Cancer Treatment and Diagnosis as well as the Division of Cancer Prevention.
Roles and Responsibilities of the CQIE Partners
Methods
Site Identification and Contact
General Procedures
Year One
Years Two and Three
PET
Sources of Variability
Site Initiation
Clinical Test Cases
Phantom Testing
TABLE 1
Acceptance Criteria
Body and Brain FOV Static Acquisitions
Body FOV Dynamic Acquisition
FOV, field of view; SUV, standardized uptake value.
TABLE 2
SUV Acceptance Criteria
Based on ACR’s 2010 Pass/Fail Criteria
SUV, standardized uptake value.
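Table 2 lists SUV-based pass/fail limits adapted from the ACR 2010 criteria. For context on how such limits are applied, the sketch below (Python, written for this overview rather than drawn from the CQIE procedures) computes a decay-corrected SUV for a uniformly filled cylinder phantom, for which the expected value is approximately 1.0; the tolerance and example inputs are placeholders, not the Table 2 criteria.

```python
"""Minimal sketch: decay-corrected SUV for a uniformly filled cylinder phantom.

The tolerance and example numbers below are illustrative placeholders, not the
ACR/CQIE pass/fail limits referenced in Table 2.
"""
import math

F18_HALF_LIFE_S = 109.77 * 60  # fluorine-18 half-life, in seconds


def decay_corrected_activity(activity_bq: float, elapsed_s: float) -> float:
    """Decay-correct an assayed F-18 activity forward to the scan start time."""
    return activity_bq * math.exp(-math.log(2) * elapsed_s / F18_HALF_LIFE_S)


def phantom_suv(mean_roi_bq_per_ml: float, assayed_activity_bq: float,
                assay_to_scan_s: float, phantom_mass_g: float) -> float:
    """SUV = ROI activity concentration / (decay-corrected activity / mass).

    For a uniformly mixed, water-filled cylinder (density ~1 g/mL) the
    expected SUV is approximately 1.0.
    """
    dose_at_scan = decay_corrected_activity(assayed_activity_bq, assay_to_scan_s)
    return mean_roi_bq_per_ml / (dose_at_scan / phantom_mass_g)


if __name__ == "__main__":
    suv = phantom_suv(mean_roi_bq_per_ml=5.1e3,    # mean ROI value from the PET image
                      assayed_activity_bq=37.0e6,  # dose-calibrator reading
                      assay_to_scan_s=1800,        # delay between assay and scan start
                      phantom_mass_g=6283)         # mass of water in the cylinder
    tolerance = 0.10                               # placeholder, not the Table 2 limit
    status = "PASS" if abs(suv - 1.0) <= tolerance else "FAIL"
    print(f"phantom SUV = {suv:.2f} ({status})")
```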
Data Submission and Analysis
QC Routine
TABLE 3
CQIE-Recommended QC Activities
Test | Purpose | Frequency
Physical inspection | Check gantry covers in tunnel and patient handling system. | Daily
Daily detector check | Test and visualize proper functioning of detector modules. | Daily
Blank scan | Visually inspect sinograms for apparent streaks and consistency. | Daily
Normalization | Determine system response to activity inside the FOV. | At least 1 × 3 months, after software upgrades and hardware service
Uniformity | Estimate axial uniformity across image planes by imaging a uniformly filled object. | After maintenance, new setups, normalization, and software upgrades
Attenuation-correction calibration | Determine calibration factor from image voxel intensity to true activity concentration. | At least 1 × 6 months, after normalization
Cross-calibration | Identify discrepancies between PET camera and dose calibrator. | At least 1 × 3 months, after upgrades, new setups, normalization
Spatial resolution | Measure spatial resolution of point source in sinogram and image space. | At least annually
Count rate performance | Measure count rate as a function of given activity concentration. | After new setups, normalization, recalibrations
Sensitivity | Measure volume response of system to a source of given activity concentration. | At least 1 × 6 months
Image quality | Check hot and cold spot image quality of standardized image quality phantom. | At least annually
CQIE, Centers for Quantitative Imaging Excellence; FOV, field of view; PET, positron emission tomography; QC, quality control.
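Several of the routine QC activities in Table 3 lend themselves to simple automated analysis. As one illustration, the sketch below (Python with NumPy, an assumed implementation rather than part of the CQIE materials) estimates axial uniformity across image planes for a uniformly filled phantom volume; the simulated data and the 10% limit are illustrative assumptions only.

```python
"""Sketch: axial uniformity estimate for a uniform-cylinder acquisition
(the "Uniformity" row of Table 3). The simulated volume and the 10% limit
are illustrative assumptions, not CQIE acceptance values.
"""
import numpy as np


def axial_uniformity(volume: np.ndarray) -> float:
    """Maximum fractional deviation of per-slice means from the global mean.

    `volume` is a (slices, rows, cols) array of activity-concentration values
    restricted to the uniform portion of the phantom.
    """
    slice_means = volume.mean(axis=(1, 2))
    global_mean = slice_means.mean()
    return float(np.max(np.abs(slice_means - global_mean)) / global_mean)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Simulated uniform phantom: 40 slices with a mild axial drift plus noise.
    axial_drift = np.linspace(0.97, 1.03, 40)[:, None, None]
    phantom = 5000.0 * axial_drift * (1.0 + 0.02 * rng.standard_normal((40, 64, 64)))
    deviation = axial_uniformity(phantom)
    print(f"max axial deviation: {deviation * 100:.1f}% "
          f"({'PASS' if deviation <= 0.10 else 'FAIL'})")
```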
CT
Phantom Scanning
TABLE 4
Volumetric Adult Chest CT Protocol
Parameter | GE | Philips | Siemens | Toshiba
Display FOV (Reconstruction FOV) | 21 cm | 210 mm | 230 mm | 21 cm
Reconstructed slice width | 1.25 mm | 1.25 mm | 1–1.5 mm | 1–1.5 mm
Reconstruction algorithm | STD | B | B30f | FC10
Matrix | 512 × 512 (all vendors)
Scan FOV | Small body (all vendors)
mAs | 240 ± 20 (all vendors)
kVp | 120 (all vendors)
Scan mode | Axial (all vendors)
CT, computed tomography; FOV, field of view.
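Acquisition-compliance failures (for example, wrong FOV or slice parameters) account for many of the first-submission failures reported later in Tables 11 and 12, and such checks can be automated from the DICOM header. The sketch below uses pydicom (an assumed tool choice; any DICOM reader would serve) to compare a few header fields against the Table 4 chest protocol; the file name and the tolerance on display FOV are hypothetical.

```python
"""Sketch: acquisition-compliance check of a CT DICOM header against the
Table 4 volumetric chest protocol. pydicom is an assumed tool choice; the
file name and the FOV tolerance are hypothetical.
"""
import pydicom


def check_chest_protocol(path: str) -> list[str]:
    """Return a list of compliance problems (an empty list means pass)."""
    ds = pydicom.dcmread(path, stop_before_pixels=True)
    problems = []
    if float(ds.KVP) != 120:
        problems.append(f"kVp {ds.KVP}, protocol requires 120")
    if not (1.0 <= float(ds.SliceThickness) <= 1.5):
        problems.append(f"slice width {ds.SliceThickness} mm, protocol requires 1-1.5 mm")
    if (int(ds.Rows), int(ds.Columns)) != (512, 512):
        problems.append(f"matrix {ds.Rows} x {ds.Columns}, protocol requires 512 x 512")
    # Display FOV of ~21 cm; ReconstructionDiameter is reported in millimeters.
    if "ReconstructionDiameter" in ds:
        dfov = float(ds.ReconstructionDiameter)
        if abs(dfov - 210.0) > 10.0:  # hypothetical +/-10 mm tolerance
            problems.append(f"display FOV {dfov} mm, protocol specifies ~210 mm")
    return problems


if __name__ == "__main__":
    for issue in check_chest_protocol("chest_ct_slice.dcm"):  # hypothetical file
        print("NONCOMPLIANT:", issue)
```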
TABLE 5
Acquisition Parameters for Phantom Scans 2 and 3
Protocol | Scan FOV | Display FOV | Slice Width | Recon Algorithm | Scan Mode
2 – Volumetric liver | Small body (or ~25 cm) | 21–25 cm | 2.5–3 mm | Per routine clinical protocol | Helical
3 – Adult abdomen | Large (or ~50 cm) | 38 cm | 5 mm | Per routine clinical protocol | Helical
FOV, field of view.
Phantom Image Analysis
TABLE 6
CT Number Pass Criteria
Material | CT Number (HU)
Bone | +850 to +970
Air | −1005 to −970
Acrylic | +110 to +135
Water | −7 to +7
Polyethylene | −170 to −87
CT, computed tomography; HU, Hounsfield unit.
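To make the application of these ranges concrete, the following sketch compares measured ROI means against the Table 6 limits; the measured values in the example are placeholders, not data from the CQIE program.

```python
"""Sketch: evaluate measured ROI means against the Table 6 CT-number ranges.

The measured values in the example are placeholders, not CQIE data.
"""
PASS_RANGES_HU = {              # from Table 6
    "Bone": (850, 970),
    "Air": (-1005, -970),
    "Acrylic": (110, 135),
    "Water": (-7, 7),
    "Polyethylene": (-170, -87),
}


def evaluate(measured_hu: dict[str, float]) -> dict[str, bool]:
    """Return pass/fail for each material's measured mean CT number (HU)."""
    return {material: PASS_RANGES_HU[material][0] <= value <= PASS_RANGES_HU[material][1]
            for material, value in measured_hu.items()}


if __name__ == "__main__":
    example = {"Bone": 912.4, "Air": -992.1, "Acrylic": 121.7,
               "Water": 1.3, "Polyethylene": -95.0}
    for material, passed in evaluate(example).items():
        print(f"{material:<12} {example[material]:>8.1f} HU  {'PASS' if passed else 'FAIL'}")
```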
Passing Criteria
TABLE 7
Standardized QC Tests for CT
Test | Minimum Frequency
Water CT number and standard deviation | Daily (technologist)
Artifacts | Daily (technologist)
Scout prescription and alignment light accuracy | Annually
Imaged slice thickness (slice sensitivity profile, SSP) | Annually
Table travel/slice positioning accuracy | Annually
Radiation beam width | Annually
High-contrast (spatial) resolution | Annually
Low-contrast sensitivity and resolution | Annually
Image uniformity and noise | Annually
CT number accuracy | Annually
Artifact evaluation | Annually
Dosimetry/CTDI | Annually
CT, computed tomography; QC, quality control; CTDI, Computed Tomography Dose Index; SSP, slice sensitivity profile.
MRI
Phantom Scanning and Image Submission
Image Analysis
Passing Criteria
Quality Control Routine
TABLE 8
Standardized QC Tests for MRI
Test | Minimum Frequency
Center frequency | Weekly
Table positioning | Weekly
Signal to noise | Weekly
Artifact analysis | Weekly
Geometric accuracy | Weekly
High-contrast resolution | Weekly
Low-contrast resolution | Weekly
Magnetic field homogeneity | Quarterly
Slice position accuracy | Quarterly
Slice thickness accuracy | Quarterly
Radiofrequency coil checks | Annually
MRI, magnetic resonance imaging; QC, quality control.
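As an example of how the weekly signal-to-noise check in Table 8 might be implemented, the sketch below estimates SNR from a single phantom image using the common signal-ROI over background-ROI approach with a Rayleigh noise correction; sites may instead use the NEMA two-acquisition subtraction method, and the ROIs and simulated data here are assumptions for illustration.

```python
"""Sketch: single-image SNR estimate for the weekly MRI QC check in Table 8.

The signal/background-ROI method and the 0.66 Rayleigh correction are one
common convention; ROIs, simulated data, and any action threshold are
illustrative assumptions, not CQIE requirements.
"""
import numpy as np


def snr_single_image(image: np.ndarray,
                     signal_roi: tuple[slice, slice],
                     noise_roi: tuple[slice, slice]) -> float:
    """SNR = 0.66 * mean(signal ROI) / std(background ROI).

    The 0.66 factor approximately corrects for the Rayleigh distribution of
    background noise in magnitude MR images.
    """
    signal = image[signal_roi].mean()
    noise_sd = image[noise_roi].std(ddof=1)
    return float(0.66 * signal / noise_sd)


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Simulated magnitude image: complex Gaussian noise plus a bright phantom disk.
    img = np.abs(rng.normal(0, 5, (256, 256)) + 1j * rng.normal(0, 5, (256, 256)))
    yy, xx = np.mgrid[:256, :256]
    img[(yy - 128) ** 2 + (xx - 128) ** 2 < 80 ** 2] += 600.0
    snr = snr_single_image(img,
                           signal_roi=(slice(108, 148), slice(108, 148)),
                           noise_roi=(slice(5, 45), slice(5, 45)))
    print(f"estimated SNR: {snr:.1f}")
```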
Results
Year One
Year Two
TABLE 9
Differences in Scanner Qualification from Year 1 to Year 3
Year | Modality | No. of Sites Submitted | No. of Sites Passing | Percent Sites Passing | No. of Scanners Submitted | No. of Scanners Passing (First Attempt) | Percent Scanners Passing (First Attempt) | No. of Scanners Passed | Percent Scanners Passed
Year 1 | CT | 58 | 58 | 100 | 73 | 65 | 89 | 69 | 95
Year 1 | MR | 58 | 58 | 100 | 83 | 69 | 83 | 74 | 89
Year 1 | PET | 56 | 56 | 100 | 65 | 25 | 38 | 64 | 98
Year 2 | CT | 28 | 26 | 93 | 40 | 34 | 85 | 35 | 88
Year 2 | MR | 31 | 23 | 74 | 48 | 37 | 77 | 41 | 85
Year 2 | PET | 33 | 32 | 97 | 39 | 33 | 85 | 39 | 100
Year 3 | CT | 53 | 44 | 83 | 82 | 66 | 80 | 66 | 80
Year 3 | MR | 53 | 49 | 92 | 88 | 75 | 85 | 75 | 85
Year 3 | PET | 51 | 48 | 92 | 52 | 35 | 67 | 48 | 92
CT, computed tomography; MR, magnetic resonance; PET, positron emission tomography.
Year Three
Scanner Distribution by Manufacturer
TABLE 10
Manufacturers Represented in CQIE Testing Program
Scanner Modality | Manufacturer | Year 1 | Year 2 | Year 3
CT | GE | 24 | 23 | 34
CT | Siemens | 31 | 12 | 7
CT | Philips | 12 | 1 | 29
CT | Toshiba | 6 | 4 | 9
MRI | GE | 28 | 17 | 22
MRI | Siemens | 44 | 24 | 13
MRI | Philips | 11 | 7 | 52
MRI | Toshiba | 0 | 0 | 1
PET | GE | 36 | 21 | 28
PET | Siemens | 7 | 4 | 5
PET | Philips | 21 | 14 | 20
CQIE, Centers for Quantitative Imaging Excellence; CT, computed tomography; MRI, magnetic resonance imaging; PET, positron emission tomography.
Reasons for Scanner Qualification Failure
TABLE 11
MRI Year 2 and Year 3 Phantom Study Failure Reasons on First Submission
Series | Year 2 Area Failed | Number | Year 3 Area Failed | Number
ACR Phantom Imaging—ACR series, T1/T2 SE* | Low-contrast detectability | 3 | Low-contrast detectability | 1
  | Image uniformity | 3 | Image uniformity | 4
  | Position accuracy | 2 | Incomplete submission | 1
  | Acquisition compliance (slice thickness) | 1 | — | —
3D Volumetric series* | Acquisition compliance (incomplete phantom coverage) | 1 | Incomplete submission | 1
DCE body* | Incomplete submission | 2 | Incomplete submission | 4
  | Acquisition compliance (temporal resolution) | 1 | Acquisition compliance (temporal resolution) | 2
  | Artifact | 2 | — | —
  | Acquisition compliance (FOV) | 1 | Acquisition compliance (flip angle) | 2
  | Acquisition compliance (scan duration) | 4 | — | —
DCE brain* | Acquisition compliance (temporal resolution) | 1 | Acquisition compliance (temporal resolution) | 1
  | Acquisition compliance (scan duration) | 2 | — | —
3D, three-dimensional; ACR, American College of Radiology; DCE, dynamic contrast-enhanced; FOV, field of view; MRI, magnetic resonance imaging; SE, spin echo; T1/T2, relaxation times.
TABLE 12
CT Year 2 and Year 3 Phantom Study Failure Reasons on First Submission
Series | Year 2 Area Failed | Number | Year 3 Area Failed | Number
Volumetric lung* | CT no. accuracy | 1 | CT no. accuracy | 2
  | Incomplete submission | 2 | Acquisition compliance (FOV) | 1
  | Acquisition compliance (slice interval) | 1 | Acquisition compliance (slice interval) | 3
Volumetric liver* | Low-contrast resolution | 3 | Low-contrast resolution | 9
  | Positioning accuracy/slice prescription | 1 | CT no. accuracy | 3
  | — | — | Acquisition compliance (FOV) | 1
Abdominal* | Low-contrast resolution | 3 | Low-contrast resolution | 8
  | CT no. accuracy | 2 | CT no. accuracy | 5
  | — | — | Positioning accuracy/slice prescription | 1
  | — | — | Acquisition compliance (FOV) | 1
CT, computed tomography; FOV, field of view.
TABLE 13
PET Year 2 and Year 3 Phantom Study Failure Reasons on First Submission
Series | Year 2 Reason | Year 2 # Fails (Protocol) | Year 3 Reason | Year 3 # Fails (Protocol)
Uniform Cylinder PET Phantom* — Static | SUV out of specification | 2 (brain) | SUV out of specification | 3 (brain, body)
  | Incomplete submission | 1 (body + brain) | Incomplete submission | 1 (brain, body)
  | Problem with forms | 1 (body + brain) | Uniformity problem | 2 (brain)
  | Improper acquisition | 1 (body + brain) | — | —
Uniform Cylinder PET Phantom* — Dynamic | SUV out of specification | 1 (body) | SUV out of specification | 5 (body)
  | Reconstruction problem | 1 (body) | Reconstruction problem | 7 (body)
  | Incomplete submission | 1 (body) | Incomplete submission | 1 (body)
  | Improper acquisition | 1 (body) | — | —
  | Problem with forms | 1 (body) | — | —
ACR PET Phantom* | SUV out of specification | 1 (brain) | SUV out of specification | 2 (brain, body)
  | Incomplete submission | 1 (body, brain) | Incomplete submission | 3 (brain, body)
  | Improper acquisition | 3 (body), 2 (brain) | Phantom filling issue | 1 (body)
  | — | — | Problem with forms | 2 (body)
ACR, American College of Radiology; PET, positron emission tomography; SUV, standardized uptake value.
Discussion
The US Food and Drug Administration draft guidance on clinical trial imaging end points advises that, with a clinical trial standard for image acquisition and interpretation, sponsors should address the features highlighted within the guidance; these features, including various aspects of data standardization, exceed those typically used in medical practice.
Acknowledgements
References
1. Shankar L.K.: The clinical evaluation of novel imaging methods for cancer management. Nat Rev Clin Oncol 2012; 9: pp. 738-744. Available at http://www.ncbi.nlm.nih.gov/pubmed/23149888
2. Boellaard R., Oyen W.J., Hoekstra C.J., et. al.: The Netherlands protocol for standardisation and quantification of FDG whole body PET studies in multi-centre trials. Eur J Nucl Med Mol Imaging 2008; 35: pp. 2320-2333.
3. Buckler A.J., Boellaard R.: Standardization of quantitative imaging: the time is right, and 18F-FDG PET/CT is a good place to start. J Nucl Med 2011; 52: pp. 171-172.
4. Buckler A.J., Schwartz L.H., Petrick N., et. al.: Data sets for the qualification of volumetric CT as a quantitative imaging biomarker in lung cancer. Opt Express 2010; 18: pp. 15267-15282.
5. Fahey F.H., Kinahan P.E., Doot R.K., et. al.: Variability in PET quantitation within a multicenter consortium. Med Phys 2010; 37: pp. 3660-3666.
6. McCollough C.H., Bruesewitz M.R., McNitt-Gray M.F., et. al.: The phantom portion of the American College of Radiology (ACR) Computed Tomography (CT) accreditation program: practical tips, artifact examples, and pitfalls to avoid. Med Phys 2004; 31: pp. 2423-2442.
7. Clarke G.D.: Overview of the ACR MRI Accreditation Phantom. MRI PHANTOMS & QA TESTING, AAPM Literature; Available at https://www.aapm.org/meetings/99AM/pdf/2728-58500.pdf
8. U.S. Department of Health and Human Services, Food and Drug Administration, Center for Drug Evaluation and Research (CDER), Center for Biologics Evaluation and Research (CBER) : Guidance for Industry: Standards for Clinical Trial Imaging Endpoints. Draft Guidance. August; Available at http://www.fda.gov/downloads/Drugs/GuidanceComplianceRegulatoryInformation/Guidances/UCM268555.pdf
9. Scheuermann J.S., Saffer J.R., Karp J.S., et. al.: Qualification of PET scanners for use in multicenter cancer clinical trials: the American College of Radiology Imaging Network experience. J Nucl Med 2009; 50: pp. 1187-1193.
10. Trembath L., Opanowski A.: Clinical trials in molecular imaging: the importance of following the protocol. J Nucl Med Technol 2011; 39: pp. 63-69.
11. Kurland B.F., Doot R.K., Linden H.A., et. al.: Multicenter trials using 18F-fluorodeoxyglucose (FDG) PET to predict chemotherapy response: effects of differential measurement error and bias on power calculations for unselected and enrichment designs. Clin Trials 2013; 10: pp. 886-895.
12. Buckler A.J., Bresolin L., Dunnick N.R., et. al.: A collaborative enterprise for multi-stakeholder participation in the advancement of quantitative imaging. Radiology 2011; 258: pp. 906-914.
13. Yankeelov T.E., Mankoff D.A., Schwartz L.H., et. al.: Quantitative imaging in cancer clinical trials. Clin Cancer Res 2016; 22: pp. 284-290.
14. Ioannidis J.P.A.: Why most published research findings are false. PLoS Med 2005; 2: pp. e124.
15. Macleod M.R., Michie S., Roberts I., et. al.: Biomedical research: increasing value, reducing waste. Lancet 2014; 383: pp. 101-104.
16. Doot R.K., Kurland B.F., Kinahan P.E., et. al.: Design considerations for using PET as a response measure in single site and multicenter clinical trials. Acad Radiol 2012; 19: pp. 184-190.
17. Scheuermann J., Opanowski A., Maffei J., et. al.: Qualification of NCI-designated comprehensive cancer centers for quantitative PET/CT imaging in clinical trials. J Nucl Med 2013; 54: Abstract
18. Opanowski A., Kiss T.: Education and scanner validation: keys to standardizing PET imaging research. SNMMI-TS Uptake Newsletter; 19; Available at http://snmmi.files.cms-plus.com/docs/Uptake-SeptOct_2013.pdf
19. European Association of Nuclear Medicine (EANM) : ResEARch 4Life: an EANM initiative. About EARL; Available at http://earl.eanm.org/cms/website.php?id=/en/about_earl.htm Accessed June 27, 2014
20. Fitzgerald T.J., Bishop-Jodoin M., Followill D.S., et. al.: Imaging and data acquisition in clinical trials for radiation therapy. Int J Radiat Oncol Biol Phys 2016; 94: pp. 404-411.
21. Fitzgerald T.J., Bishop-Jodoin M., Bosch W.R., et. al.: Future vision for the quality assurance of oncology clinical trials. Front Oncol 2013; 3: pp. 31.