AUTHOR=Wang Ye , Yang Ying , Wu Xiaohong , Feng Zhoushan , Wang Congcong TITLE=Rectal cancer segmentation via HHF-SAM: a hierarchical hypercolumn-guided fusion segment anything model JOURNAL=Frontiers in Artificial Intelligence VOLUME=Volume 8 - 2025 YEAR=2026 URL=https://www.frontiersin.org/journals/artificial-intelligence/articles/10.3389/frai.2025.1696984 DOI=10.3389/frai.2025.1696984 ISSN=2624-8212 ABSTRACT=Introduction: Rectal cancer is a globally prevalent cancer, and accurate segmentation of rectal lesions in abdominal CT images is critical for clinical diagnosis and treatment planning. Existing methods struggle with imprecise boundary delineation due to low tissue contrast, image noise, and varied lesion sizes, prompting the development of a specialized segmentation framework. Methods: We developed the Hierarchical Hypercolumn-guided Fusion Segment Anything Model (HHF-SAM) with three core components: 1) a Med-Adapter SAM Encoder integrating LoRA and Adapter modules to adapt SAM's natural-image understanding capability to medical-specific features; 2) a Multi-scale Hypercolumn Processing Module to capture comprehensive features for lesions of varying sizes and shapes; 3) a Progressive Hierarchical Fusion Decoder with a Hierarchical Fusion Module to aggregate multi-scale features and resolve boundary blurring. The model was evaluated on two public abdominal CT datasets (CARE and WORD) using mean Dice coefficient (mDice) and mean Intersection over Union (mIoU) as metrics. Results: On the CARE dataset, HHF-SAM achieved a mean mDice of 74.05% and mean mIoU of 58.96%, outperforming state-of-the-art methods (U-SAM: 69.28% mDice, 53.11% mIoU; SAM: 65.98% mDice, 49.44% mIoU). For tumor segmentation specifically, it reached 76.42% mDice and 62.03% mIoU.
On the WORD dataset, it achieved an average mDice of 85.84% across all organs, with 83.24% mDice for rectal segmentation (surpassing U-SAM's 80.66% and SAM's 72.77%). Discussion: This study presents a SAM-based framework optimized for the unique characteristics of abdominal CT images, effectively overcoming the limitations of general segmentation models in medical image processing. The proposed HHF-SAM provides a reliable tool for clinical auxiliary diagnosis, reducing inter-reader variability and improving efficiency in lesion delineation.
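The abstract's Med-Adapter SAM Encoder fine-tunes a frozen SAM backbone through LoRA modules. As a rough illustration of the LoRA mechanism it relies on (a frozen weight plus a trainable low-rank update), here is a minimal NumPy sketch. All names, shapes, and the rank/scale choices are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

class LoRALinear:
    """Linear layer with a LoRA-style low-rank update.

    Illustrative stand-in for adapting a frozen SAM projection;
    all hyperparameters here are assumptions, not the paper's values.
    """

    def __init__(self, d_in, d_out, rank=4, alpha=1.0, seed=0):
        rng = np.random.default_rng(seed)
        # Frozen pretrained weight (stands in for a SAM attention projection).
        self.W = rng.standard_normal((d_out, d_in)) / np.sqrt(d_in)
        # Trainable low-rank factors: B starts at zero, so the adapted
        # layer initially matches the frozen layer exactly.
        self.A = rng.standard_normal((rank, d_in)) / np.sqrt(d_in)
        self.B = np.zeros((d_out, rank))
        self.scale = alpha / rank

    def __call__(self, x):
        # y = W x + (alpha / r) * B A x
        return x @ self.W.T + self.scale * (x @ self.A.T) @ self.B.T


layer = LoRALinear(d_in=8, d_out=8)
x = np.ones((2, 8))
# With B initialized to zero, the LoRA branch contributes nothing yet,
# so the output equals the frozen layer's output.
assert np.allclose(layer(x), x @ layer.W.T)
```

Only `A` and `B` would be trained, which is why such adapters let a large frozen encoder like SAM's be specialized to CT imagery with a small number of tunable parameters.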