Breast Cancer Classification on Ultrasound Images Using DenseNet Framework with Attention Mechanism
Abstract
Breast cancer is one of the most prevalent and life-threatening diseases among women worldwide, and early detection is critical for increasing survival rates. Ultrasound imaging is commonly used for breast cancer screening because it is non-invasive, safe, and cost-effective. However, ultrasound images are often of low quality and contain significant noise, which can hinder the effectiveness of classification models. This study proposes an enhanced breast cancer classification model that leverages transfer learning in combination with attention mechanisms to improve diagnostic performance. The main contribution of this research is the introduction of Dense-SASE, a novel architecture that combines DenseNet-121 with two powerful attention modules: Scaled Dot-Product Attention and the Squeeze-and-Excitation (SE) block. These mechanisms are integrated to improve feature representation and allow the model to focus on the most relevant regions of the ultrasound images. The proposed method was evaluated on a publicly available breast ultrasound image dataset, with classification performed across three categories: normal, benign, and malignant. Experimental results demonstrate that the Dense-SASE model achieves an accuracy of 98.29%, a precision of 97.97%, a recall of 98.98%, and an F1-score of 98.44%. Additionally, Grad-CAM visualizations show that the model localizes lesion areas effectively while avoiding non-informative regions, confirming its interpretability. In conclusion, the Dense-SASE model significantly improves the accuracy and reliability of breast cancer classification in ultrasound images. By effectively learning and focusing on clinically relevant features, this approach offers a promising solution for computer-aided diagnosis (CAD) systems and has the potential to assist radiologists in early and accurate breast cancer detection.
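The abstract names the two attention modules added on top of DenseNet-121 but does not publish code. As a rough illustration only, the sketch below shows in NumPy what each module computes: the SE block squeezes a feature map to per-channel statistics and reweights the channels, while scaled dot-product attention computes softmax(QKᵀ/√d)V. The layer sizes, reduction ratio, and weight initialization here are illustrative assumptions, not the paper's actual configuration.

```python
import numpy as np

def se_block(x, w1, w2):
    """Squeeze-and-Excitation: reweight channels of a (C, H, W) feature map.
    w1: (C//r, C) and w2: (C, C//r) are the excitation FC weights (assumed shapes)."""
    s = x.mean(axis=(1, 2))                 # squeeze: global average pool -> (C,)
    z = np.maximum(w1 @ s, 0.0)             # excitation FC1 + ReLU -> (C//r,)
    e = 1.0 / (1.0 + np.exp(-(w2 @ z)))     # excitation FC2 + sigmoid -> (C,)
    return x * e[:, None, None]             # channel-wise rescaling

def scaled_dot_product_attention(q, k, v):
    """softmax(Q K^T / sqrt(d)) V for (n, d) query/key/value matrices."""
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)
    scores -= scores.max(axis=-1, keepdims=True)  # subtract row max for stability
    w = np.exp(scores)
    w /= w.sum(axis=-1, keepdims=True)            # rows of w sum to 1
    return w @ v

rng = np.random.default_rng(0)
c, r = 8, 4                                  # channels and reduction ratio (illustrative)
feat = rng.standard_normal((c, 16, 16))      # stand-in for a DenseNet feature map
w1 = rng.standard_normal((c // r, c)) * 0.1
w2 = rng.standard_normal((c, c // r)) * 0.1
out = se_block(feat, w1, w2)                 # same shape as feat, channels rescaled

q = rng.standard_normal((5, 32))
attn = scaled_dot_product_attention(q, q, q)  # self-attention over 5 tokens
```

In the actual Dense-SASE model these operations would act on DenseNet-121 feature maps inside the network; how the two modules are ordered and wired is described in the full paper.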
References
International Agency for Research on Cancer (IARC), “Global Cancer Observatory: Cancer Today,” https://gco.iarc.fr/today/en/dataviz/pie?mode=cancer&group_populations=1&cancers=39&types=0.
P. B. Gordon, “The Impact of Dense Breasts on the Stage of Breast Cancer at Diagnosis: A Review and Options for Supplemental Screening,” Current Oncology, vol. 29, no. 5, pp. 3595–3636, May 2022, doi: 10.3390/curroncol29050291.
M. Rawashdeh et al., “Breast density awareness and cancer risk in the UAE: Enhancing Women’s engagement in early detection,” Radiography, vol. 31, no. 1, pp. 350–358, Jan. 2025, doi: 10.1016/j.radi.2024.12.012.
Z. He et al., “A review on methods for diagnosis of breast cancer cells and tissues,” Cell Prolif, vol. 53, no. 7, Jul. 2020, doi: 10.1111/cpr.12822.
W. A. Berg, “Reducing Unnecessary Biopsy and Follow-up of Benign Cystic Breast Lesions,” Radiology, vol. 295, no. 1, pp. 52–53, Apr. 2020, doi: 10.1148/radiol.2020200037.
S. A. Alshoabi, A. A. Alareqi, F. H. Alhazmi, A. A. Qurashi, A. M. Omer, and A. M. Hamid, “Utility of Ultrasound Imaging Features in Diagnosis of Breast Cancer,” Cureus, Apr. 2023, doi: 10.7759/cureus.37691.
R. Iacob et al., “Evaluating the Role of Breast Ultrasound in Early Detection of Breast Cancer in Low- and Middle-Income Countries: A Comprehensive Narrative Review,” Bioengineering, vol. 11, no. 3, p. 262, Mar. 2024, doi: 10.3390/bioengineering11030262.
A. A. Bhatt, D. H. Whaley, and C. U. Lee, “Ultrasound‐Guided Breast Biopsies,” Journal of Ultrasound in Medicine, vol. 40, no. 7, pp. 1427–1443, Jul. 2021, doi: 10.1002/jum.15517.
O. Díaz, A. Rodríguez-Ruíz, and I. Sechopoulos, “Artificial Intelligence for breast cancer detection: Technology, challenges, and prospects,” Eur J Radiol, vol. 175, p. 111457, Jun. 2024, doi: 10.1016/j.ejrad.2024.111457.
C. Trepanier, A. Huang, M. Liu, and R. Ha, “Emerging uses of artificial intelligence in breast and axillary ultrasound,” Clin Imaging, vol. 100, pp. 64–68, Aug. 2023, doi: 10.1016/j.clinimag.2023.05.007.
J. Egger et al., “Medical deep learning—A systematic meta-review,” Comput Methods Programs Biomed, vol. 221, p. 106874, Jun. 2022, doi: 10.1016/j.cmpb.2022.106874.
M. Chaieb, M. Azzouz, M. Ben Refifa, and M. Fraj, “Deep learning-driven prediction in healthcare systems: Applying advanced CNNs for enhanced breast cancer detection,” Comput Biol Med, vol. 189, p. 109858, May 2025, doi: 10.1016/j.compbiomed.2025.109858.
L. Alzubaidi et al., “Review of deep learning: concepts, CNN architectures, challenges, applications, future directions,” J Big Data, vol. 8, no. 1, p. 53, Mar. 2021, doi: 10.1186/s40537-021-00444-8.
Y. Wang, E. J. Choi, Y. Choi, H. Zhang, G. Y. Jin, and S.-B. Ko, “Breast Cancer Classification in Automated Breast Ultrasound Using Multiview Convolutional Neural Network with Transfer Learning,” Ultrasound Med Biol, vol. 46, no. 5, pp. 1119–1132, May 2020, doi: 10.1016/j.ultrasmedbio.2020.01.001.
T. Choudhary, V. Mishra, A. Goswami, and J. Sarangapani, “A transfer learning with structured filter pruning approach for improved breast cancer classification on point-of-care devices,” Comput Biol Med, vol. 134, p. 104432, Jul. 2021, doi: 10.1016/j.compbiomed.2021.104432.
Z. Cao, L. Duan, G. Yang, T. Yue, and Q. Chen, “An experimental study on breast lesion detection and classification from ultrasound images using deep learning architectures,” BMC Med Imaging, vol. 19, no. 1, p. 51, Dec. 2019, doi: 10.1186/s12880-019-0349-x.
T. Zhou, X. Ye, H. Lu, X. Zheng, S. Qiu, and Y. Liu, “Dense Convolutional Network and Its Application in Medical Image Analysis,” Biomed Res Int, vol. 2022, no. 1, Jan. 2022, doi: 10.1155/2022/2384830.
X. Li et al., “Deep Learning Attention Mechanism in Medical Image Analysis: Basics and Beyonds,” International Journal of Network Dynamics and Intelligence, pp. 93–116, Mar. 2023, doi: 10.53941/ijndi0201006.
J. Li, J. Shi, J. Chen, Z. Du, and L. Huang, “Self-attention random forest for breast cancer image classification,” Front Oncol, vol. 13, Feb. 2023, doi: 10.3389/fonc.2023.1043463.
Z. Li, L. Yuan, H. Xu, R. Cheng, and X. Wen, “Deep Multi-Instance Learning with Induced Self-Attention for Medical Image Classification,” in 2020 IEEE International Conference on Bioinformatics and Biomedicine (BIBM), IEEE, Dec. 2020, pp. 446–450. doi: 10.1109/BIBM49941.2020.9313518.
J. Deng, Y. Ma, D. Li, J. Zhao, Y. Liu, and H. Zhang, “Classification of breast density categories based on SE-Attention neural networks,” Comput Methods Programs Biomed, vol. 193, p. 105489, Sep. 2020, doi: 10.1016/j.cmpb.2020.105489.
J. Hu, L. Shen, and G. Sun, “Squeeze-and-Excitation Networks,” in 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition, IEEE, Jun. 2018, pp. 7132–7141. doi: 10.1109/CVPR.2018.00745.
K. Fukitani et al., “3D object detection using improved PointRCNN,” Cognitive Robotics, vol. 2, pp. 242–254, 2022, doi: 10.1016/j.cogr.2022.12.001.
X. Zhang et al., “SERNet: Squeeze and Excitation Residual Network for Semantic Segmentation of High-Resolution Remote Sensing Images,” Remote Sens (Basel), vol. 14, no. 19, p. 4770, Sep. 2022, doi: 10.3390/rs14194770.
K. Munishamaiaha et al., “Robust Spatial–Spectral Squeeze–Excitation AdaBound Dense Network (SE-AB-Densenet) for Hyperspectral Image Classification,” Sensors, vol. 22, no. 9, p. 3229, Apr. 2022, doi: 10.3390/s22093229.
W. K. Moon, Y.-W. Lee, H.-H. Ke, S. H. Lee, C.-S. Huang, and R.-F. Chang, “Computer‐aided diagnosis of breast ultrasound images using ensemble learning from convolutional neural networks,” Comput Methods Programs Biomed, vol. 190, p. 105361, Jul. 2020, doi: 10.1016/j.cmpb.2020.105361.
W. Al-Dhabyani, M. Gomaa, H. Khaled, and A. Fahmy, “Dataset of breast ultrasound images,” Data Brief, vol. 28, p. 104863, Feb. 2020, doi: 10.1016/j.dib.2019.104863.
O. Ronneberger, P. Fischer, and T. Brox, “U-Net: Convolutional Networks for Biomedical Image Segmentation,” 2015, pp. 234–241. doi: 10.1007/978-3-319-24574-4_28.
G. Huang, Z. Liu, L. Van Der Maaten, and K. Q. Weinberger, “Densely Connected Convolutional Networks,” in 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), IEEE, Jul. 2017, pp. 2261–2269. doi: 10.1109/CVPR.2017.243.
A. Vaswani et al., “Attention is all you need,” in Proceedings of the 31st International Conference on Neural Information Processing Systems, in NIPS’17. Red Hook, NY, USA: Curran Associates Inc., 2017, pp. 6000–6010.
V. Nguyen, “Bayesian Optimization for Accelerating Hyper-Parameter Tuning,” in 2019 IEEE Second International Conference on Artificial Intelligence and Knowledge Engineering (AIKE), IEEE, Jun. 2019, pp. 302–305. doi: 10.1109/AIKE.2019.00060.
I. Markoulidakis, I. Rallis, I. Georgoulas, G. Kopsiaftis, A. Doulamis, and N. Doulamis, “Multiclass Confusion Matrix Reduction Method and Its Application on Net Promoter Score Classification Problem,” Technologies (Basel), vol. 9, no. 4, p. 81, Nov. 2021, doi: 10.3390/technologies9040081.
A. Tharwat, “Classification assessment methods,” Applied Computing and Informatics, vol. 17, no. 1, pp. 168–192, Jan. 2021, doi: 10.1016/j.aci.2018.08.003.
R. R. Selvaraju, M. Cogswell, A. Das, R. Vedantam, D. Parikh, and D. Batra, “Grad-CAM: Visual Explanations from Deep Networks via Gradient-Based Localization,” Int J Comput Vis, vol. 128, no. 2, pp. 336–359, Feb. 2020, doi: 10.1007/s11263-019-01228-7.
R. Karthik, R. Menaka, G. S. Kathiresan, M. Anirudh, and M. Nagharjun, “Gaussian Dropout Based Stacked Ensemble CNN for Classification of Breast Tumor in Ultrasound Images,” IRBM, vol. 43, no. 6, pp. 715–733, Dec. 2022, doi: 10.1016/j.irbm.2021.10.002.
S. Armoogum, K. Motean, D. A. Dewi, T. B. Kurniawan, and J. Kijsomporn, “Breast Cancer Prediction Using Transfer Learning-Based Classification Model,” Emerging Science Journal, vol. 8, no. 6, pp. 2373–2384, Dec. 2024, doi: 10.28991/ESJ-2024-08-06-014.
M. R. Islam et al., “Enhancing breast cancer segmentation and classification: An Ensemble Deep Convolutional Neural Network and U-net approach on ultrasound images,” Machine Learning with Applications, vol. 16, p. 100555, Jun. 2024, doi: 10.1016/j.mlwa.2024.100555.
Copyright (c) 2025 Wiharto, Hanina Nafisa Azka, Esti Suryani

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
Authors who publish with this journal agree to the following terms:
- Authors retain copyright and grant the journal right of first publication with the work simultaneously licensed under a Creative Commons Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license that allows others to share the work with an acknowledgement of the work's authorship and initial publication in this journal.
- Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the journal's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgement of its initial publication in this journal.
- Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work (See The Effect of Open Access).