Open access publication

Article, 2024

Improved automated tumor segmentation in whole-body 3D scans using multi-directional 2D projection-based priors

Heliyon, ISSN 2405-8440, Volume 10, Issue 4, DOI: 10.1016/j.heliyon.2024.e26414

Contributors

  • Tarai S. (0000-0002-5550-3575, corresponding author) [1]
  • Lundstrom E. [1]
  • Sjoholm T. [1]
  • Jonsson H. [1]
  • Korenyushkin A.
  • Ahmad N. [1]
  • Pedersen M.A. (0000-0002-8670-939X) [2] [3]
  • Molin D. (0000-0002-9646-7283) [1]
  • Enblad G. (0000-0002-0594-724X) [1]
  • Strand R. [1]
  • Ahlstrom H. (0000-0002-8701-969X) [1]
  • Kullberg J. [1]

Affiliations

  1. [1] Uppsala University [NORA names: Sweden; Europe, EU; Nordic; OECD]
  2. [2] Aarhus University [NORA names: AU Aarhus University; University; Denmark; Europe, EU; Nordic; OECD]
  3. [3] Aarhus University Hospital [NORA names: Central Denmark Region; Hospital; Denmark; Europe, EU; Nordic; OECD]

Abstract

Early cancer detection, guided by whole-body imaging, is important for the overall survival and well-being of patients. While various computer-assisted systems have been developed to expedite and enhance cancer diagnostics and longitudinal monitoring, the detection and segmentation of tumors, especially from whole-body scans, remain challenging. To address this, we propose a novel end-to-end automated framework that first generates a tumor probability distribution map (TPDM), incorporating prior information about tumor characteristics (e.g., size, shape, location). Subsequently, the TPDM is integrated with a state-of-the-art 3D segmentation network along with the original PET/CT or PET/MR images, with the aim of producing more meaningful tumor segmentation masks than the baseline 3D segmentation network alone. The proposed method was evaluated on three independent cohorts (autoPET, CAR-T, cHL) of images containing different cancer forms, obtained with different imaging modalities and acquisition parameters, and with lesions annotated by different experts. The evaluation demonstrated the superiority of our proposed method over the baseline model by significant margins in terms of Dice coefficient and lesion-wise sensitivity and precision. Many of the extremely small tumor lesions (i.e., the most difficult to segment) were missed by the baseline model but detected by the proposed model without additional false positives, resulting in clinically more relevant assessments. On average, an improvement of 0.0251 (autoPET), 0.144 (CAR-T), and 0.0528 (cHL) in overall Dice was observed. In conclusion, the proposed TPDM-based approach can be integrated with any state-of-the-art 3D U-Net, with potentially more accurate and robust segmentation results.
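The sketch below is not the authors' code; it is a minimal Python/numpy illustration of the general idea described in the abstract and keywords: 2D predictions made on multi-directional maximum intensity projections (MIPs) are backprojected into a 3D tumor probability distribution map (TPDM), which is then stacked with the original volume as an extra input channel for a 3D segmentation network. The choice of projection axes, the placeholder 2D model (predict_2d), and the fusion by voxel-wise multiplication are illustrative assumptions, not details taken from the paper.

# Minimal sketch (assumed, not the authors' implementation) of a
# projection-based prior: MIP -> 2D prediction -> backprojection -> TPDM,
# then channel-wise stacking with the PET/CT volumes for a 3D network.
import numpy as np

def mip(volume: np.ndarray, axis: int) -> np.ndarray:
    # Maximum intensity projection of a 3D volume along one axis.
    return volume.max(axis=axis)

def backproject(prob_2d: np.ndarray, axis: int, size: int) -> np.ndarray:
    # Smear a 2D probability map back along the collapsed axis.
    return np.expand_dims(prob_2d, axis=axis).repeat(size, axis=axis)

def build_tpdm(volume: np.ndarray, predict_2d) -> np.ndarray:
    # predict_2d stands in for a trained 2D model mapping a MIP image to a
    # per-pixel tumor probability map of the same shape (hypothetical here).
    tpdm = np.ones_like(volume, dtype=np.float32)
    for axis in range(3):  # e.g. three orthogonal projection directions
        prob_2d = predict_2d(mip(volume, axis))
        tpdm *= backproject(prob_2d, axis, volume.shape[axis])
    return tpdm

# Toy usage: stack PET, CT and TPDM as channels for a 3D U-Net-style model.
pet = np.random.rand(128, 96, 96).astype(np.float32)
ct = np.random.rand(128, 96, 96).astype(np.float32)
dummy_2d_model = lambda img: 1.0 / (1.0 + np.exp(-(img - img.mean())))
tpdm = build_tpdm(pet, dummy_2d_model)
network_input = np.stack([pet, ct, tpdm], axis=0)  # shape (3, 128, 96, 96)

In practice the 2D predictions would come from a trained projection-domain segmentation network and the combined tensor would feed a 3D U-Net; the multiplication-based fusion above is only one plausible way to combine directions.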

Keywords

Backprojection, Deep learning, Maximum intensity projection, Medical image analysis, Segmentation prior, Whole-body tumor segmentation

Funders

  • Cancerfonden
  • Steno Diabetes Center Aarhus
  • Novo Nordisk Fonden
  • Lions Cancer Fund Uppsala and the Makarna Eriksson Foundation

Data Provider: Elsevier