U-net3+-based prediction of the positions of the target volume and organs at risk in after-loading therapy for cervical cancer

LI Xia, YANG Lei, YANG Rizeng, WU Dehua

Citation: LI Xia, YANG Lei, YANG Rizeng, WU Dehua. U-net3+ network-based prediction of target and dangerous organ location in cervical cancer after-loading therapy[J]. Journal of Molecular Imaging, 2023, 46(3): 448-452. doi: 10.12122/j.issn.1674-4500.2023.03.10


doi: 10.12122/j.issn.1674-4500.2023.03.10
Funding: 

National Natural Science Foundation of China 82272737

Article information
    About the authors:

    LI Xia, Master's degree candidate, technologist-in-charge, E-mail: 397546868@qq.com

    Corresponding author:

    WU Dehua, PhD, professor, chief physician, E-mail: 18602062748@163.com


  • Abstract:  Objective  To develop a deep learning-based method for predicting the positions of the high-risk clinical target volume (HRCTV) and the organs at risk in after-loading therapy for cervical cancer.  Methods  An end-to-end automatic segmentation framework based on U-net3+ was constructed. Images of 213 cervical cancer patients from two centers who had received high-dose-rate after-loading therapy were delineated, and the cases were divided into training, validation and test sets at a ratio of 7:2:1. The delineated structures included the HRCTV, bladder, rectum and small intestine, and the accuracy of the prediction model was evaluated with the Hausdorff distance (HD) and the Dice similarity coefficient (DSC).  Results  The DSC of automatic delineation was 0.953 for the bladder, 0.885 for the rectum and 0.857 for the small intestine, giving a mean DSC of 0.898 and a mean HD of 5.4 mm for the organs at risk; for the HRCTV, the DSC was 0.869 and the HD was 8.1 mm.  Conclusion  The U-net3+-based model for predicting the positions of the target volume and organs at risk in after-loading therapy for cervical cancer achieves high accuracy with a short training time and shows promise for clinical application.
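    The study's own code is not included on this page. As a rough illustration of the architectural idea named in the abstract, the sketch below wires up a U-net3+-style "full-scale" skip connection, in which each decoder stage fuses feature maps from every encoder scale together with the deeper decoder output before predicting the segmentation map. The depth (three scales instead of five), the channel counts, the 2-D convolutions and the use of PyTorch are assumptions made for illustration only and are not taken from the paper.

```python
# A minimal sketch (not the authors' code) of a U-net3+-style "full-scale"
# skip connection: every decoder stage fuses features from ALL encoder
# scales plus the deeper decoder output. Depth, channel counts and the use
# of 2-D convolutions are simplifying assumptions for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F


def conv_block(in_ch, out_ch):
    # 3x3 convolution + batch norm + ReLU, the basic unit used throughout
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
    )


def up2(t):
    # bilinear 2x upsampling used by the full-scale connections
    return F.interpolate(t, scale_factor=2, mode="bilinear", align_corners=False)


class MiniUNet3Plus(nn.Module):
    def __init__(self, in_ch=1, n_classes=1, cat_ch=32):
        super().__init__()
        # Encoder at three scales (full, 1/2 and 1/4 resolution)
        self.enc1 = conv_block(in_ch, 32)
        self.enc2 = conv_block(32, 64)
        self.enc3 = conv_block(64, 128)        # deepest stage / bottleneck
        self.pool = nn.MaxPool2d(2)

        # Branches feeding decoder stage 2 (1/2 resolution): enc1 is pooled,
        # enc2 is used directly, enc3 is upsampled; each is reduced to cat_ch.
        self.h1_d2 = conv_block(32, cat_ch)
        self.h2_d2 = conv_block(64, cat_ch)
        self.h3_d2 = conv_block(128, cat_ch)
        self.fuse2 = conv_block(3 * cat_ch, 3 * cat_ch)

        # Branches feeding decoder stage 1 (full resolution)
        self.h1_d1 = conv_block(32, cat_ch)
        self.h2_d1 = conv_block(64, cat_ch)
        self.d2_d1 = conv_block(3 * cat_ch, cat_ch)
        self.fuse1 = conv_block(3 * cat_ch, 3 * cat_ch)

        self.head = nn.Conv2d(3 * cat_ch, n_classes, 1)

    def forward(self, x):
        e1 = self.enc1(x)                      # full resolution
        e2 = self.enc2(self.pool(e1))          # 1/2 resolution
        e3 = self.enc3(self.pool(e2))          # 1/4 resolution

        # Decoder stage 2: gather all scales at 1/2 resolution and fuse
        d2 = self.fuse2(torch.cat(
            [self.h1_d2(self.pool(e1)), self.h2_d2(e2), self.h3_d2(up2(e3))], dim=1))

        # Decoder stage 1: gather encoder scales and d2 at full resolution
        d1 = self.fuse1(torch.cat(
            [self.h1_d1(e1), self.h2_d1(up2(e2)), self.d2_d1(up2(d2))], dim=1))

        return torch.sigmoid(self.head(d1))    # per-pixel probability map


if __name__ == "__main__":
    model = MiniUNet3Plus()
    prob = model(torch.randn(1, 1, 256, 256))  # one 256x256 single-channel slice
    print(prob.shape)                          # torch.Size([1, 1, 256, 256])
```

    The published U-net3+ additionally uses five scales and deep supervision on every decoder output; the fusion pattern, however, follows the same idea as sketched here.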

     

  • Figure  1.  Contouring of the structures of after-loading radiation therapy

    Figure  2.  U-net3+ network framework.

    Figure  3.  Loss value as a function of the number of training iterations.

    A: Bladder; B: Rectum; C: Small intestine; D: HRCTV.

    Figure  4.  Automatic segmentation results.

    A-D: case 1; E-H: case 2. A and E: bladder; B and F: rectum; C and G: small intestine; D and H: HRCTV. The yellow lines indicate the manually delineated contours and the red lines indicate the automatic segmentation results (a plotting sketch is given after Table 1).

    Table  1.  Quantification of the accuracy of automatic segmentation

    Structures       DSC (Mean±SD)  HD (mm, Mean±SD)  Training time (h)
    Bladder          0.953±0.020    5.1±1.6           8
    Rectum           0.885±0.030    5.4±1.5           11
    Small intestine  0.857±0.040    5.7±2.1           12
    HRCTV            0.869±0.030    8.1±2.8           15
    DSC: Dice similarity coefficient; HD: Hausdorff distance; HRCTV: High-risk clinical target volume.
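    For reference, the following is a minimal sketch (not the study's evaluation code) of how the two metrics reported in Table 1 can be computed for a pair of binary masks: the Dice similarity coefficient over the voxel sets, and the symmetric Hausdorff distance, in millimetres, between the two structure surfaces. The voxel spacing, the surface-extraction step and the use of NumPy/SciPy are assumptions for illustration only.

```python
# A minimal sketch (not the study's evaluation code) of the two metrics in
# Table 1: the Dice similarity coefficient (DSC) between a manual and an
# automatic binary mask, and the symmetric Hausdorff distance (HD, in mm)
# between the two structure surfaces. The voxel spacing and the surface
# extraction are illustrative assumptions.
import numpy as np
from scipy.ndimage import binary_erosion
from scipy.spatial.distance import directed_hausdorff


def dice_coefficient(manual, auto):
    """DSC = 2|A ∩ B| / (|A| + |B|) for two binary masks."""
    manual, auto = manual.astype(bool), auto.astype(bool)
    overlap = np.logical_and(manual, auto).sum()
    return 2.0 * overlap / (manual.sum() + auto.sum())


def surface_points_mm(mask, spacing_mm):
    """Surface voxels (mask minus its erosion), scaled to millimetres."""
    mask = mask.astype(bool)
    surface = mask & ~binary_erosion(mask)
    return np.argwhere(surface) * np.asarray(spacing_mm)


def hausdorff_mm(manual, auto, spacing_mm=(2.5, 1.0, 1.0)):
    """Symmetric Hausdorff distance between the two structure surfaces."""
    p = surface_points_mm(manual, spacing_mm)
    q = surface_points_mm(auto, spacing_mm)
    return max(directed_hausdorff(p, q)[0], directed_hausdorff(q, p)[0])


# Toy example: two overlapping masks on a small 3-D grid (slice, row, column)
manual = np.zeros((10, 32, 32), dtype=bool)
manual[2:8, 8:24, 8:24] = True
auto = np.zeros_like(manual)
auto[2:8, 10:26, 8:24] = True
print(f"DSC = {dice_coefficient(manual, auto):.3f}, "
      f"HD = {hausdorff_mm(manual, auto):.1f} mm")
```

    Clinical auto-segmentation studies often also report the 95th-percentile Hausdorff distance, which is less sensitive to single outlier surface points than the maximum distance used here.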
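    For readers who want to reproduce an overlay in the style of Figure 4, the short sketch below (an assumption, not taken from the paper) draws a manual contour in yellow and an automatic contour in red on top of an image slice with matplotlib; `ct_slice`, `manual_mask` and `auto_mask` are placeholder arrays standing in for real data.

```python
# An illustrative sketch of a Figure 4-style overlay: manual contour in
# yellow, automatic contour in red, drawn on a slice. The three input
# arrays are placeholders, not real patient data.
import numpy as np
import matplotlib.pyplot as plt

ct_slice = np.random.rand(256, 256)        # placeholder image slice
manual_mask = np.zeros((256, 256))
manual_mask[100:180, 90:170] = 1           # placeholder manual delineation
auto_mask = np.zeros((256, 256))
auto_mask[104:184, 92:172] = 1             # placeholder automatic result

plt.imshow(ct_slice, cmap="gray")
plt.contour(manual_mask, levels=[0.5], colors="yellow", linewidths=1.5)  # manual contour
plt.contour(auto_mask, levels=[0.5], colors="red", linewidths=1.5)       # automatic contour
plt.axis("off")
plt.title("Manual (yellow) vs automatic (red) contour")
plt.show()
```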
Publication history
  • Received:  2023-02-28
  • Published online:  2023-06-15
  • Issue date:  2023-05-20

