In:
Journal of the American Medical Informatics Association, Oxford University Press (OUP), Vol. 30, No. 10 (2023-09-25), pp. 1657-1664
Abstract:
Objective: To assess the ability of large language models to accurately infer cancer disease response from free-text radiology reports.

Materials and Methods: We assembled 10,602 computed tomography reports from cancer patients seen at a single institution. All reports were classified into one of four categories: no evidence of disease, partial response, stable disease, or progressive disease. We applied transformer models, a bidirectional long short-term memory model, a convolutional neural network model, and conventional machine learning methods to this task. Data augmentation using sentence permutation with consistency loss, as well as prompt-based fine-tuning, was applied to the best-performing models. Models were validated on a hold-out test set and on an external validation set based on Response Evaluation Criteria in Solid Tumors (RECIST) classifications.

Results: The best-performing model was the GatorTron transformer, which achieved an accuracy of 0.8916 on the test set and 0.8919 on the RECIST validation set. Data augmentation further improved the accuracy to 0.8976. Prompt-based fine-tuning did not further improve accuracy but reduced the number of training reports required to 500 while still achieving good performance.

Discussion: These models could be used by researchers to derive progression-free survival in large datasets. They may also serve as a decision support tool by providing clinicians an automated second opinion on disease response.

Conclusions: Large clinical language models demonstrate potential to infer cancer disease response from radiology reports at scale. Data augmentation techniques are useful to further improve performance. Prompt-based fine-tuning can significantly reduce the size of the training dataset.
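The abstract mentions data augmentation by sentence permutation with a consistency loss. As a rough illustration of the permutation step only, the sketch below shuffles the sentence order of each report to create label-preserving augmented copies; it assumes a generic report-classification pipeline, and all function names and hyperparameters are illustrative rather than the authors' implementation.

```python
# Minimal sketch of sentence-permutation data augmentation for report-level
# classification. Names and parameters are illustrative assumptions, not the
# paper's actual code.
import random
import re

LABELS = ["no evidence of disease", "partial response",
          "stable disease", "progressive disease"]

def permute_sentences(report: str, rng: random.Random) -> str:
    """Shuffle sentence order to create an augmented copy of a report."""
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", report) if s.strip()]
    rng.shuffle(sentences)
    return " ".join(sentences)

def augment(reports, labels, copies_per_report=2, seed=0):
    """Return the original reports plus permuted copies.

    Labels are carried over unchanged, which is what allows a consistency
    loss to be applied between predictions on the original and permuted views.
    """
    rng = random.Random(seed)
    aug_reports, aug_labels = list(reports), list(labels)
    for report, label in zip(reports, labels):
        for _ in range(copies_per_report):
            aug_reports.append(permute_sentences(report, rng))
            aug_labels.append(label)
    return aug_reports, aug_labels
```

The augmented copies would then be fed to whichever classifier is being fine-tuned (e.g., a transformer such as GatorTron); the consistency-loss term itself is not shown here.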
Type of Medium:
Online Resource
ISSN:
1067-5027, 1527-974X
DOI:
10.1093/jamia/ocad133
Language:
English
Publisher:
Oxford University Press (OUP)
Publication Date:
2023
ZDB ID:
2018371-9