Survival analysis plays a central role in diverse research fields, especially in the health sciences. As an analytical tool, it can help improve patients’ survival time or, at the very least, reduce the prospect of recurrence in cancer studies. However, assessments of the predictive performance of current survival models have largely centered on clinical data analyzed with classical survival methods. For censored “omics” data, the performance of survival models has not been thoroughly studied, often because of high-dimensionality issues or a reliance on binarizing the survival time for classification analysis. We aim to present a neutral benchmark approach that analyzes and compares a broad range of classical and state-of-the-art machine learning survival models on “omics” and clinical datasets. All the methods considered in our study are evaluated using predictability as the performance measure. The study is systematically designed to make 36 comparisons (9 methods over 4 datasets, i.e., 2 clinical and 2 omics) and shows that, in practice, the predictability of survival models varies across real-world datasets, model choice, and the evaluation metric. Based on our results, we emphasize that performance criteria can play a key role in a balanced assessment of diverse survival models. Moreover, Multi-task Logistic Regression (MTLR) showed remarkable predictability on almost all the datasets. We believe this outstanding performance presents a unique opportunity for wider use of MTLR in modeling survival risk. For translational clinicians and scientists, we hope our findings provide practical guidance for benchmark studies of survival models and highlight potential areas of research interest.
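As a rough illustration of the kind of predictability evaluation described above (a minimal sketch, not the authors' actual pipeline), the snippet below fits two survival models with scikit-survival and scores them with the concordance index on a public stand-in dataset. The dataset, the choice of two models, and the train/test split are assumptions made only for illustration; the paper's comparison covers nine methods, including MTLR, over four datasets and may use additional evaluation metrics.

```python
# Illustrative sketch: compare survival models by predictability (C-index).
# Assumptions: scikit-survival is installed; the Veterans lung cancer data
# stands in for the paper's clinical/omics datasets.
from sklearn.model_selection import train_test_split
from sksurv.datasets import load_veterans_lung_cancer
from sksurv.preprocessing import OneHotEncoder
from sksurv.linear_model import CoxPHSurvivalAnalysis
from sksurv.ensemble import RandomSurvivalForest
from sksurv.metrics import concordance_index_censored

# Load a public benchmark dataset and encode categorical covariates.
X, y = load_veterans_lung_cancer()
X = OneHotEncoder().fit_transform(X)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Two example models; the study itself benchmarks nine, including MTLR.
models = {
    "CoxPH": CoxPHSurvivalAnalysis(),
    "RandomSurvivalForest": RandomSurvivalForest(n_estimators=200, random_state=0),
}

for name, model in models.items():
    model.fit(X_tr, y_tr)
    risk = model.predict(X_te)  # higher score = higher predicted risk
    # Concordance index on the held-out set as the predictability measure.
    cindex = concordance_index_censored(
        y_te["Status"], y_te["Survival_in_days"], risk
    )[0]
    print(f"{name}: C-index = {cindex:.3f}")
```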
We declare no competing interests.
This work received no financial support in any form.
We would like to thank the instructors in the Department of Statistics for their invaluable suggestions and constructive criticisms.
| Primary Language | English |
| --- | --- |
| Subjects | Biostatistics |
| Journal Section | Statistics |
| Authors | |
| Project Number | None |
| Early Pub Date | September 30, 2024 |
| Publication Date | September 30, 2024 |
| Submission Date | June 27, 2024 |
| Acceptance Date | July 11, 2024 |
| Published in Issue | Year 2024 Volume: 11 Issue: 3 |