AUTHOR=Ahmed Hosameldin O. A., Nandi Asoke K. TITLE=ASG-MammoNet: an attention-guided framework for streamlined and interpretable breast cancer classification from mammograms JOURNAL=Frontiers in Signal Processing VOLUME=5 YEAR=2025 URL=https://www.frontiersin.org/journals/signal-processing/articles/10.3389/frsip.2025.1672569 DOI=10.3389/frsip.2025.1672569 ISSN=2673-8198 ABSTRACT=Introduction: Breast cancer remains the most frequently diagnosed cancer and a leading cause of cancer-related death among women globally, emphasising the urgent need for early, accurate, and interpretable diagnostic tools. While digital mammography serves as the cornerstone of breast cancer screening, its diagnostic performance is often hindered by image quality variability, dense breast tissue, and limited visual interpretability. Furthermore, conventional Computer-Aided Diagnostic (CAD) systems and deep learning models have struggled with clinical adoption due to high false-positive rates, opaque decision-making, and excessive computational demands. Methods: To address these critical challenges, we introduce ASG-MammoNet, an Attention-Guided and Streamlined deep learning framework for robust, real-time, and explainable mammographic breast cancer classification.
The framework is composed of three integrated stages: (1) Data Preparation and Balanced Feature Representation, which applies advanced preprocessing, augmentation, and weighted sampling to mitigate class imbalance and acquisition variations across the dataset; (2) Attention-Guided Streamlined Classification, where an EfficientNet-B0 backbone is enhanced by a dual-stage Convolutional Block Attention Module (CBAM) to selectively emphasise diagnostically relevant features; and (3) Explainable Inference, in which Gradient-weighted Class Activation Mapping (Grad-CAM) is employed to provide class-specific visualisations of lesion regions, supporting interpretability and clinical decision-making. Results: ASG-MammoNet is thoroughly validated on three benchmark mammography datasets, CBIS-DDSM, INbreast, and MIAS, achieving accuracy above 99.1%, AUC scores exceeding 99.6%, and DIP (Distance from Ideal Position) scores above 0.99, with an average inference time under 14 milliseconds per image. The framework exhibits strong generalisability, consistent performance across data folds, and clinically relevant attention maps, highlighting its readiness for real-world deployment. Discussion: The model consistently outperforms or matches recent state-of-the-art approaches while offering a superior balance between sensitivity and specificity. Its robust generalisability, consistent fold-wise performance, and clinically meaningful attention visualisations support its practical utility. By addressing critical limitations such as high computational cost, limited interpretability, and insufficient precision, ASG-MammoNet represents a practical and reliable solution for AI-assisted breast cancer diagnosis in modern screening settings.
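Stage (1) relies on weighted sampling to counter class imbalance. The abstract does not give the authors' exact scheme, but a common approach, sketched below in NumPy under the assumption of inverse-frequency weighting, draws training examples with probability inversely proportional to their class frequency so that benign and malignant cases contribute roughly equally to each mini-batch (the label counts here are illustrative, not from the paper's datasets):

```python
import numpy as np

# Hypothetical imbalanced label set: 0 = benign (900), 1 = malignant (100).
labels = np.array([0] * 900 + [1] * 100)

# Inverse-frequency weight per sample: rarer classes get larger weights.
class_counts = np.bincount(labels)          # [900, 100]
sample_weights = 1.0 / class_counts[labels]

# Normalise to a probability distribution and draw a weighted mini-batch
# with replacement; in expectation the batch is class-balanced.
probs = sample_weights / sample_weights.sum()
rng = np.random.default_rng(0)
batch_idx = rng.choice(len(labels), size=64, replace=True, p=probs)
batch_labels = labels[batch_idx]
```

With these weights each class has total probability mass 0.5, so the malignant fraction of a sampled batch concentrates around one half despite being only 10% of the data.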
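Stage (2) applies CBAM, which refines a feature map in two steps: a channel attention map computed from average- and max-pooled channel descriptors passed through a shared MLP, followed by a spatial attention map computed by convolving the channel-wise average and max maps. The NumPy sketch below illustrates the forward pass only, with random (untrained) weights and a naive convolution loop; it is not the paper's implementation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention(f, w1, w2):
    # f: (C, H, W). Shared two-layer MLP over avg- and max-pooled descriptors.
    avg, mx = f.mean(axis=(1, 2)), f.max(axis=(1, 2))        # each (C,)
    mlp = lambda v: w2 @ np.maximum(w1 @ v, 0.0)             # ReLU hidden layer
    return sigmoid(mlp(avg) + mlp(mx))[:, None, None]        # (C, 1, 1)

def spatial_attention(f, kernel):
    # kernel: (2, k, k) conv weights over stacked [avg-map; max-map].
    stacked = np.stack([f.mean(axis=0), f.max(axis=0)])      # (2, H, W)
    k = kernel.shape[-1]
    pad = k // 2
    padded = np.pad(stacked, ((0, 0), (pad, pad), (pad, pad)))
    H, W = f.shape[1], f.shape[2]
    out = np.empty((H, W))
    for i in range(H):                                       # naive 2-D conv
        for j in range(W):
            out[i, j] = np.sum(padded[:, i:i + k, j:j + k] * kernel)
    return sigmoid(out)[None, :, :]                          # (1, H, W)

# Demo on a random feature map (channel reduction ratio 4, 7x7 spatial kernel).
rng = np.random.default_rng(0)
C, H, W = 8, 16, 16
f = rng.standard_normal((C, H, W))
w1 = rng.standard_normal((C // 4, C)) * 0.1
w2 = rng.standard_normal((C, C // 4)) * 0.1
kernel = rng.standard_normal((2, 7, 7)) * 0.1

mc = channel_attention(f, w1, w2)     # channel attention, broadcast over H, W
f_prime = mc * f
ms = spatial_attention(f_prime, kernel)
refined = ms * f_prime                # CBAM-refined feature map, same shape
```

In the actual framework such a module would sit inside the EfficientNet-B0 backbone and its weights would be learned end-to-end; the "dual-stage" arrangement mentioned in the abstract presumably places attention at more than one depth.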
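Stage (3) uses Grad-CAM, which weights a convolutional layer's activation maps by the spatially averaged gradients of the class score with respect to those activations, then keeps the positive part of the weighted sum as a heatmap. The sketch below assumes the activations and gradients are already available (in practice a framework's autograd produces them) and uses random arrays purely to exercise the function:

```python
import numpy as np

def grad_cam(activations, gradients):
    """Grad-CAM heatmap from one conv layer.

    activations: (C, H, W) feature maps for one image.
    gradients:   (C, H, W) d(class score)/d(activations).
    Returns an (H, W) heatmap normalised to [0, 1].
    """
    weights = gradients.mean(axis=(1, 2))                    # GAP per channel
    cam = (weights[:, None, None] * activations).sum(axis=0)
    cam = np.maximum(cam, 0.0)                               # ReLU
    if cam.max() > 0:
        cam = cam / cam.max()
    return cam

# Synthetic example; real inputs would come from the trained classifier.
rng = np.random.default_rng(1)
A = rng.standard_normal((8, 14, 14))
G = rng.standard_normal((8, 14, 14))
heatmap = grad_cam(A, G)
```

Upsampled to the input resolution and overlaid on the mammogram, such a heatmap yields the class-specific lesion visualisations the abstract describes.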