Feasibility of planning-CT-free rapid workflow for stereotactic body radiotherapy: Removing the need for planning CT by AI-driven, intelligent prediction of body deformation

Hamed Hooshangnejad, Kai Ding

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Purpose: Radiotherapy (RT) is widely used for cancer management. The standard RT clinical workflow imposes substantial burdens on patients: multiple image acquisitions for diagnosis and RT planning increase travel, cost, and wait time before actual RT treatment, which is critical for cancer patients. Diagnostic CT (dCT) has been shown to be suitable for RT planning; however, differences in planning CT (pCT) acquisition setup (e.g., table curvature) and motion management procedures (e.g., deep inspiration/active breath control) make its direct use infeasible. In this study, we present the feasibility of a fully automatic image adaptation method to omit the need for pCT and expedite the treatment course at institutions where on-table adaptive radiotherapy is available. Methods: We designed a 3D convolutional neural network (3DCNN) to perform the dCT-to-pCT image adaptation. An automatic table removal algorithm was developed to isolate the patient's body without altering its curvature. The body-isolated dCT and pCT pair were then deformably registered to produce a registered CT (rCT). The network was trained on the pCTs and the corresponding 3D displacement fields (DFs). We evaluated the performance of our method using the root averaged sum of squared differences (RASSD), and body contour similarity using the Dice similarity coefficient (DSC) and Hausdorff distance (HD). Results: The generated DFs were applied to the dCTs to generate synthetic pCTs (sCTs). For the test cases, the differences in RASSD, DSC, and HD between the rCT-vs-pCT and sCT-vs-pCT comparisons were 6 HU, 0.02, and 0.3 mm, respectively. Conclusion: A novel 3DCNN-based method was presented to adapt the dCT to the pCT, omitting the need to acquire multiple scans before treatment delivery and reducing the cost and length of the RT treatment pathway.
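The published record includes no code; the sketch below is a minimal, hypothetical illustration of two steps described in the abstract: applying a predicted 3D displacement field to the dCT to obtain a synthetic pCT, and evaluating the result with RASSD, DSC, and Hausdorff distance. Array names, the voxel-unit displacement convention, and the backward-warping scheme are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch (not the authors' code): warp a dCT volume with a predicted
# displacement field and evaluate it with RASSD, DSC, and Hausdorff distance.
# Assumes NumPy volumes of HU values with identical shapes; the displacement
# field `df` is assumed to be in voxel units with shape (3, Z, Y, X).
import numpy as np
from scipy.ndimage import map_coordinates, binary_erosion
from scipy.spatial.distance import directed_hausdorff


def warp_with_df(volume: np.ndarray, df: np.ndarray) -> np.ndarray:
    """Backward-warp `volume` by sampling it at (identity grid + df)."""
    coords = np.indices(volume.shape, dtype=np.float64) + df
    return map_coordinates(volume, coords, order=1, mode="nearest")


def rassd(ct_a: np.ndarray, ct_b: np.ndarray, body_mask: np.ndarray) -> float:
    """Root of the averaged sum of squared HU differences inside the body."""
    diff = ct_a[body_mask].astype(np.float64) - ct_b[body_mask]
    return float(np.sqrt(np.mean(diff ** 2)))


def dice(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    """Dice similarity coefficient of two binary body masks."""
    inter = np.logical_and(mask_a, mask_b).sum()
    return float(2.0 * inter / (mask_a.sum() + mask_b.sum()))


def hausdorff_mm(mask_a: np.ndarray, mask_b: np.ndarray, spacing) -> float:
    """Symmetric Hausdorff distance (mm) between body-surface voxel sets."""
    surf_a = mask_a & ~binary_erosion(mask_a)   # keep only boundary voxels
    surf_b = mask_b & ~binary_erosion(mask_b)
    pts_a = np.argwhere(surf_a) * np.asarray(spacing)   # voxel indices -> mm
    pts_b = np.argwhere(surf_b) * np.asarray(spacing)
    return max(directed_hausdorff(pts_a, pts_b)[0],
               directed_hausdorff(pts_b, pts_a)[0])


# Example usage (names hypothetical): sct = warp_with_df(dct, predicted_df);
# then compare sCT and rCT against pCT, e.g. rassd(sct, pct, body_mask),
# dice(sct_body, pct_body), and hausdorff_mm(sct_body, pct_body, spacing).
```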

Original language: English (US)
Title of host publication: Medical Imaging 2022
Subtitle of host publication: Image-Guided Procedures, Robotic Interventions, and Modeling
Editors: Cristian A. Linte, Jeffrey H. Siewerdsen
Publisher: SPIE
ISBN (Electronic): 9781510649439
DOIs
State: Published - 2022
Event: Medical Imaging 2022: Image-Guided Procedures, Robotic Interventions, and Modeling - Virtual, Online
Duration: Mar 21 2022 - Mar 27 2022

Publication series

Name: Progress in Biomedical Optics and Imaging - Proceedings of SPIE
Volume: 12034
ISSN (Print): 1605-7422

Conference

Conference: Medical Imaging 2022: Image-Guided Procedures, Robotic Interventions, and Modeling
City: Virtual, Online
Period: 3/21/22 - 3/27/22

Keywords

  • Computed Tomography Synthesis
  • Convolutional Neural Network
  • Deep Learning
  • Rapid Radiation Therapy Workflow

ASJC Scopus subject areas

  • Electronic, Optical and Magnetic Materials
  • Atomic and Molecular Physics, and Optics
  • Biomaterials
  • Radiology, Nuclear Medicine and Imaging
