Intraoperative guidance of orthopaedic instruments using 3D correspondence of 2D object instance segmentations

I. Bataeva, K. Shah, R. Vijayan, R. Han, N. M. Sheth, G. Kleinszig, S. Vogt, G. M. Osgood, J. H. Siewerdsen, A. Uneri

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Purpose. Surgical placement of pelvic instrumentation is challenged by complex anatomy and narrow bone corridors, and despite heavy reliance on intraoperative fluoroscopy, trauma surgery lacks a reliable solution for 3D surgical navigation that is compatible with steep workflow requirements. We report a method that uses routinely acquired fluoroscopic images in standard workflow to automatically detect and localize orthopaedic instruments for 3D guidance.

Methods. The proposed method detects, establishes correspondence of, and localizes orthopaedic devices from a pair of radiographs. Instrument detection uses Mask R-CNN for segmentation and keypoint detection, trained on 4000 cadaveric pelvic radiographs with simulated guidewires. Keypoints on individual images are corresponded using prior calibration of the imaging system to backproject, identify, and rank-order ray intersections. Estimation of 3D instrument tip and direction was evaluated on a cadaveric specimen and patient images from an IRB-approved clinical study.

Results. The detection network successfully generalized to cadaver and clinical images, achieving 87% recall and 98% precision. Mean geometric accuracy in estimating instrument tip and direction was (1.9 ± 1.6) mm and (1.8 ± 1.3)°, respectively. Simulation studies demonstrated 1.1 mm median error in 3D tip and 2.3° in 3D direction estimation. Preliminary tests in cadaver and clinical images show the basic feasibility of the overall approach.

Conclusions. Experimental studies demonstrate the feasibility and highlight the potential of deep learning for 3D-2D registration of orthopaedic instruments as applied in fixation of pelvic fractures. The approach is compatible with routine orthopaedic workflow, does not require additional equipment (such as surgical trackers), uses common imaging equipment (mobile C-arm fluoroscopy), and does not require vendor-specific device models.
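The correspondence step described in Methods backprojects 2D keypoints from each calibrated view as 3D rays and ranks their intersections. The sketch below (not the authors' implementation; function and variable names are illustrative) shows the core geometric operation such a step relies on: triangulating the closest-approach point between two backprojected rays, with the residual gap between rays usable as a score for rank-ordering candidate keypoint pairings.

```python
import numpy as np

def triangulate_rays(o1, d1, o2, d2):
    """Closest-approach midpoint of two 3D rays (point o + t*d).

    Returns the triangulated 3D point and the gap between the two
    rays at closest approach; a large gap suggests a mismatched
    keypoint pair when rank-ordering candidate correspondences.
    """
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    w0 = o1 - o2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b          # ~0 for (near-)parallel rays
    t1 = (b * e - c * d) / denom   # parameter along ray 1
    t2 = (a * e - b * d) / denom   # parameter along ray 2
    p1 = o1 + t1 * d1
    p2 = o2 + t2 * d2
    return (p1 + p2) / 2, np.linalg.norm(p1 - p2)

# Two rays (e.g. source positions and keypoint backprojection
# directions from C-arm calibration) meeting at (1, 2, 3):
p, gap = triangulate_rays(np.array([0.0, 0.0, 0.0]), np.array([1.0, 2.0, 3.0]),
                          np.array([1.0, 2.0, 0.0]), np.array([0.0, 0.0, 1.0]))
# p ≈ [1, 2, 3], gap ≈ 0
```

With the instrument tip and a second keypoint along the shaft triangulated this way, the 3D direction follows from the normalized difference of the two points.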

Original language: English (US)
Title of host publication: Medical Imaging 2021
Subtitle of host publication: Image-Guided Procedures, Robotic Interventions, and Modeling
Editors: Cristian A. Linte, Jeffrey H. Siewerdsen
Publisher: SPIE
ISBN (Electronic): 9781510640252
DOIs
State: Published - 2021
Event: Medical Imaging 2021: Image-Guided Procedures, Robotic Interventions, and Modeling - Virtual, Online
Duration: Feb 15 2021 - Feb 19 2021

Publication series

Name: Progress in Biomedical Optics and Imaging - Proceedings of SPIE
Volume: 11598
ISSN (Print): 1605-7422

Conference

Conference: Medical Imaging 2021: Image-Guided Procedures, Robotic Interventions, and Modeling
City: Virtual, Online
Period: 2/15/21 - 2/19/21

Keywords

  • 3D-2D image registration
  • Deep learning
  • Image-guided surgery
  • Instance segmentation
  • Intraoperative imaging
  • Object detection

ASJC Scopus subject areas

  • Electronic, Optical and Magnetic Materials
  • Atomic and Molecular Physics, and Optics
  • Biomaterials
  • Radiology, Nuclear Medicine and Imaging
