TY - JOUR
T1 - Automated Registration for Dual-View X-Ray Mammography Using Convolutional Neural Networks
AU - Walton, William C.
AU - Kim, Seung Jun
AU - Mullen, Lisa A.
N1 - Publisher Copyright:
© 1964-2012 IEEE.
PY - 2022/11/1
Y1 - 2022/11/1
AB - Objective: Automated registration algorithms are developed for pairs of 2D X-ray mammographic images taken from the two standard imaging angles, namely, the craniocaudal (CC) and mediolateral oblique (MLO) views. Methods: A fully convolutional neural network, a type of convolutional neural network (CNN), is employed to generate a pixel-level deformation field that provides a mapping between masses in the two views. A novel distance-based regularization is employed, which contributes significantly to the performance. Results: The developed techniques are tested using real 2D mammographic images, slices from real 3D mammographic images, and synthetic mammographic images. Architectural variations of the neural network are investigated, and the performance is characterized across several aspects, including image resolution, breast density, lesion size, lesion subtlety, and lesion Breast Imaging-Reporting and Data System (BI-RADS) category. Our network outperformed state-of-the-art CNN-based and non-CNN-based registration techniques and showed robust performance across varied tissue/lesion characteristics. Conclusion: The proposed methods provide a useful automated tool for co-locating lesions between the CC and MLO views, even in challenging cases. Significance: Our methods can aid clinicians in establishing lesion correspondence quickly and accurately in dual-view X-ray mammography, improving diagnostic capability.
KW - Convolutional neural network
KW - X-ray
KW - image registration
KW - lesion correspondence
KW - mammography
UR - http://www.scopus.com/inward/record.url?scp=85132523669&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85132523669&partnerID=8YFLogxK
DO - 10.1109/TBME.2022.3173182
M3 - Article
C2 - 35522630
AN - SCOPUS:85132523669
SN - 0018-9294
VL - 69
SP - 3538
EP - 3550
JO - IEEE Transactions on Biomedical Engineering
JF - IEEE Transactions on Biomedical Engineering
IS - 11
ER -