AUTHOR=Di Piazza Theo, Lazarus Carole, Nempont Olivier, Boussel Loic
TITLE=Integrating clinical indications and patient demographics for multilabel abnormality classification and automated report generation in 3D chest CT scans
JOURNAL=Frontiers in Radiology
VOLUME=5
YEAR=2025
URL=https://www.frontiersin.org/journals/radiology/articles/10.3389/fradi.2025.1672364
DOI=10.3389/fradi.2025.1672364
ISSN=2673-8740
ABSTRACT=The increasing number of computed tomography (CT) scan examinations and the time-intensive nature of manual analysis necessitate efficient automated methods to assist radiologists in managing their growing workload. While deep learning approaches primarily classify abnormalities from three-dimensional (3D) CT images, radiologists also incorporate clinical indications and patient demographics, such as age and sex, into their diagnosis. This study aims to enhance multilabel abnormality classification and automated report generation by integrating imaging and non-imaging data. We propose a multimodal deep learning model that combines 3D chest CT scans, clinical information reports, patient age, and sex to improve diagnostic accuracy. Our method extracts visual features from 3D volumes using a visual encoder, textual features from clinical indications via a pretrained language model, and demographic features through a lightweight feedforward neural network. These features are projected into a shared representation space, concatenated, and processed by a projection head to predict abnormalities. For the multilabel classification task, incorporating clinical indications and patient demographics into an existing visual encoder, CT-Net, improves the F1 score to 51.58, a 6.13% increase over CT-Net alone. For the automated report generation task, we extend two existing methods, CT2Rep and CT-AGRG, by integrating clinical indications and demographic data. This integration improves Clinical Efficacy metrics, yielding F1 score gains of 14.78% for the CT2Rep extension and 6.69% for the CT-AGRG extension. Our findings suggest that incorporating patient demographics and clinical information into deep learning frameworks can significantly improve automated CT scan analysis. This approach has the potential to enhance radiological workflows and facilitate more comprehensive and accurate abnormality detection in clinical practice.
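
The abstract describes a late-fusion design: each modality is encoded separately, projected into a shared representation space, concatenated, and passed through a projection head for multilabel prediction. The sketch below illustrates that general idea in PyTorch; the class name, feature dimensions, hidden sizes, and label count are illustrative assumptions, not the authors' reported implementation.

```python
import torch
import torch.nn as nn

class MultimodalAbnormalityClassifier(nn.Module):
    """Fuses visual, textual, and demographic features for multilabel prediction.

    All dimensions and the number of labels below are placeholder values
    chosen for illustration, not values taken from the paper.
    """

    def __init__(self, visual_dim=512, text_dim=768, demo_dim=2,
                 shared_dim=256, num_labels=18):
        super().__init__()
        # Project each modality into a shared representation space.
        self.visual_proj = nn.Linear(visual_dim, shared_dim)
        self.text_proj = nn.Linear(text_dim, shared_dim)
        # Lightweight feedforward network for the demographic inputs (age, sex).
        self.demo_proj = nn.Sequential(
            nn.Linear(demo_dim, shared_dim),
            nn.ReLU(),
            nn.Linear(shared_dim, shared_dim),
        )
        # Projection head over the concatenated modality features.
        self.head = nn.Sequential(
            nn.Linear(3 * shared_dim, shared_dim),
            nn.ReLU(),
            nn.Linear(shared_dim, num_labels),
        )

    def forward(self, visual_feat, text_feat, demo_feat):
        fused = torch.cat([
            self.visual_proj(visual_feat),
            self.text_proj(text_feat),
            self.demo_proj(demo_feat),
        ], dim=-1)
        return self.head(fused)  # raw logits, one per abnormality


# Example with dummy pre-extracted features for a batch of 2 patients.
model = MultimodalAbnormalityClassifier()
visual = torch.randn(2, 512)   # e.g. pooled output of a 3D CT visual encoder
text = torch.randn(2, 768)     # e.g. pooled embedding of the clinical indication text
demo = torch.tensor([[0.63, 1.0], [0.41, 0.0]])  # normalized age, sex
logits = model(visual, text, demo)
probs = torch.sigmoid(logits)  # independent per-abnormality probabilities
```

In a multilabel setting like this, such logits would typically be trained with a binary cross-entropy objective (e.g. nn.BCEWithLogitsLoss), with one independent sigmoid output per abnormality.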