Qu Xiaolei
Personal Homepage
Paper
An attention-supervised full-resolution residual network for the segmentation of breast ultrasound images

Impact Factor: 4.506

DOI number: 10.1002/mp.14470

Journal: Medical Physics

Abstract:

Purpose: Breast cancer is the most common cancer among women worldwide, and medical ultrasound is one of the most widely applied imaging methods for breast tumors. Automatic breast ultrasound (BUS) image segmentation can measure tumor size objectively, but various ultrasound artifacts hinder segmentation. We proposed an attention-supervised full-resolution residual network (ASFRRN) to segment tumors in BUS images.

Methods: In the proposed method, Global Attention Upsample (GAU) and deep supervision are introduced into a full-resolution residual network (FRRN), where GAU learns to merge features at different levels with attention under deep supervision. Two datasets were employed for evaluation. Dataset A consisted of 163 BUS images with tumors (53 malignant and 110 benign) from UDIAT Centre Diagnostic, and Dataset B included 980 BUS images with tumors (595 malignant and 385 benign) from the Sun Yat-sen University Cancer Center. The tumors in both datasets were manually segmented by medical doctors. Segmentation accuracy was evaluated with the Dice coefficient (Dice), Jaccard similarity coefficient (JSC), and F1 score.

Results: On Dataset A, the proposed method achieved higher Dice (84.3±10.0%), JSC (75.2±10.7%), and F1 score (84.3±10.0%) than the previous best method, FRRN. On Dataset B, it achieved higher Dice (90.7±13.0%), JSC (83.7±14.8%), and F1 score (90.7±13.0%) than the previous best methods, DeepLabv3 and the dual attention network (DANet). On the combined Dataset A+B, it achieved higher Dice (90.5±13.1%), JSC (83.3±14.8%), and F1 score (90.5±13.1%) than the previous best method, DeepLabv3. Additionally, ASFRRN has only 10.6 M parameters, fewer than DANet (71.4 M) and DeepLabv3 (41.3 M).

Conclusions: We proposed ASFRRN, which combines an FRRN, an attention mechanism, and deep supervision to segment tumors in BUS images. It achieves high segmentation accuracy with fewer parameters.
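
For context on the fusion step described above, the sketch below shows how a GAU-style block can work: global context pooled from the low-resolution, high-level stream produces per-channel attention weights that reweight the full-resolution, low-level stream before the two are fused. This is a minimal PyTorch sketch based on the commonly published GAU design (global pooling, a 1x1 convolution, and a sigmoid gate); the class and parameter names are illustrative and are not taken from the paper's implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class GlobalAttentionUpsample(nn.Module):
    # Illustrative GAU-style fusion block, not the paper's exact code.
    def __init__(self, low_channels, high_channels, out_channels):
        super().__init__()
        # 3x3 conv refines the full-resolution, low-level features
        self.low_conv = nn.Conv2d(low_channels, out_channels, 3, padding=1, bias=False)
        # global average pooling + 1x1 conv -> per-channel attention weights
        self.attn = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(high_channels, out_channels, 1, bias=False),
            nn.Sigmoid(),
        )
        # 1x1 conv matches the high-level channel count to the output
        self.high_conv = nn.Conv2d(high_channels, out_channels, 1, bias=False)

    def forward(self, low, high):
        weights = self.attn(high)                  # (N, C, 1, 1) channel attention
        low = self.low_conv(low) * weights         # reweight low-level features
        high = F.interpolate(self.high_conv(high), size=low.shape[2:],
                             mode='bilinear', align_corners=False)
        return low + high                          # fused full-resolution features

# Example: fuse a 64-channel full-resolution stream with a 256-channel
# downsampled stream (hypothetical shapes, chosen for illustration).
gau = GlobalAttentionUpsample(64, 256, 64)
low = torch.randn(2, 64, 128, 128)
high = torch.randn(2, 256, 32, 32)
out = gau(low, high)  # -> torch.Size([2, 64, 128, 128])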

Indexed by: Journal paper

Translation or Not: No

Date of Publication: 2020-09-22

Included Journals: SCI

Personal information

Supervisor of Doctorate Candidates
Supervisor of Master's Candidates

Date of Employment: 2017-05-01

School/Department: School of Instrumentation and Optoelectronic Engineering

Administrative Position: Vice Dean of Department

Business Address: New Building B504, School of Instrumentation and Optoelectronic Engineering, Beihang University

Gender: Male

Contact Information: quxiaolei@gmail.com

Status: Employed

Academic Titles: Associate Professor

Alma Mater: The University of Tokyo

Discipline: Instrumentation Science and Technology

Honors and Titles:

Ministry of Education Curriculum-based Ideological and Political Education Demonstration Course, "Sensor Technology and Applications" (ranked 6th)  2021

Beihang University Teaching Excellence Award, Second Prize  2021

Beihang University Outstanding Teaching Achievement Award, First Prize (ranked 12th)  2021

Beihang University Outstanding Teaching Achievement Award, Second Prize (ranked 4th)

Beihang University Outstanding Teaching Achievement Award, Third Prize (ranked 3rd)  2020
