ReXTrust: A Model for Fine-Grained Hallucination Detection in AI-Generated Radiology Reports
Romain Hardy and 2 other authors
Abstract: The increasing adoption of AI-generated radiology reports necessitates robust methods for detecting hallucinations (false or unfounded statements that could impact patient care). We present ReXTrust, a novel framework for fine-grained hallucination detection in AI-generated radiology reports. Our approach leverages sequences of hidden states from large vision-language models to produce finding-level hallucination risk scores. We evaluate ReXTrust on a subset of the MIMIC-CXR dataset and demonstrate superior performance compared to existing approaches, achieving an AUROC of 0.8751 across all findings and 0.8963 on clinically significant findings. Our results show that white-box approaches leveraging model hidden states can provide reliable hallucination detection for medical AI systems, potentially improving the safety and reliability of automated radiology reporting.
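To make the white-box idea concrete, the sketch below illustrates one way finding-level risk scores could be computed from a vision-language model's hidden states and evaluated with AUROC. The class name, hidden-state dimension, and attention-pooling head are illustrative assumptions for this example, not the architecture described in the paper.

```python
# Hypothetical sketch of finding-level hallucination scoring from hidden states.
# Names and architecture choices are illustrative, not the actual ReXTrust model.
import torch
import torch.nn as nn
from sklearn.metrics import roc_auc_score


class FindingRiskScorer(nn.Module):
    """Scores one finding from the hidden states of its tokens
    (assumed shape: [num_tokens, hidden_dim])."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        # Learned attention pooling over the token sequence, then a small MLP head.
        self.attn = nn.Linear(hidden_dim, 1)
        self.head = nn.Sequential(
            nn.Linear(hidden_dim, 256),
            nn.ReLU(),
            nn.Linear(256, 1),
        )

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        # hidden_states: [num_tokens, hidden_dim], taken from the vision-language model
        weights = torch.softmax(self.attn(hidden_states), dim=0)  # [num_tokens, 1]
        pooled = (weights * hidden_states).sum(dim=0)             # [hidden_dim]
        return torch.sigmoid(self.head(pooled)).squeeze(-1)       # risk score in [0, 1]


# Usage sketch: score findings and evaluate with AUROC, assuming per-finding
# hidden states and binary hallucination labels (1 = hallucinated) are available.
scorer = FindingRiskScorer(hidden_dim=4096)
finding_states = [torch.randn(12, 4096), torch.randn(8, 4096)]  # placeholder hidden states
labels = [1, 0]
scores = [scorer(h).item() for h in finding_states]
print("AUROC:", roc_auc_score(labels, scores))
```

The pooling-plus-head pattern is only meant to show how token-level hidden states can be reduced to a single per-finding score that AUROC can then rank against ground-truth labels; the paper's actual model and training setup may differ.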
Submission history
From: Romain Hardy
[v1] Tue, 17 Dec 2024 02:07:33 UTC (1,023 KB)
[v2] Mon, 30 Dec 2024 16:56:25 UTC (1,023 KB)